Nov 27 16:38:06 crc systemd[1]: Starting Kubernetes Kubelet...
Nov 27 16:38:06 crc restorecon[4696]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 16:38:06 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to
system_u:object_r:container_file_t:s0:c336,c787 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 
16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:38:07 crc 
restorecon[4696]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 
16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 
16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc 
restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 27 16:38:07 crc restorecon[4696]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 27 16:38:07 crc restorecon[4696]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 27 16:38:07 crc restorecon[4696]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Nov 27 16:38:08 crc kubenswrapper[4954]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 27 16:38:08 crc kubenswrapper[4954]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Nov 27 16:38:08 crc kubenswrapper[4954]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 27 16:38:08 crc kubenswrapper[4954]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Nov 27 16:38:08 crc kubenswrapper[4954]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Nov 27 16:38:08 crc kubenswrapper[4954]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.390171 4954 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.399693 4954 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.399734 4954 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.399743 4954 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.399754 4954 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.399763 4954 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.399774 4954 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.399784 4954 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.399795 4954 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.399806 4954 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.399817 4954 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.399851 4954 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.399860 4954 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.399869 4954 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.399889 4954 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.399897 4954 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.399905 4954 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.399913 4954 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.399921 4954 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.399928 4954 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.399936 4954 feature_gate.go:330] unrecognized feature gate: Example Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.399944 4954 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.399952 4954 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.399960 4954 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.399967 4954 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.399975 4954 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.399983 4954 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.399991 4954 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.400000 4954 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.400007 4954 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.400016 4954 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.400027 4954 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.400036 4954 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.400044 4954 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.400052 4954 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.400061 4954 feature_gate.go:330] unrecognized feature gate: 
RouteAdvertisements Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.400068 4954 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.400076 4954 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.400085 4954 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.400093 4954 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.400100 4954 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.400108 4954 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.400116 4954 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.400124 4954 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.400131 4954 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.400139 4954 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.400146 4954 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.400157 4954 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.400166 4954 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.400173 4954 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.400181 4954 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.400189 4954 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.400197 4954 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.400206 4954 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.400214 4954 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.400222 4954 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.400230 4954 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.400241 4954 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
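The W-level flood above is one warning per gate name that the kubelet's feature-gate registry does not recognize; the names appear to be OpenShift-level feature gates being passed through to an upstream kubelet, and the same sweep recurs each time the gate set is re-applied during startup. A small sketch (Python; the line shape is assumed from this excerpt) that collapses such a journal dump into the sorted set of unique unrecognized gates:

```python
import re

# One warning per unknown gate, e.g.:
#   W1127 16:38:08.399693 4954 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
# The pattern mirrors the entries above; it is not a stable kubelet API.
UNRECOGNIZED = re.compile(r"unrecognized feature gate: (\w+)")

def unique_unrecognized_gates(journal_text: str) -> list[str]:
    """Deduplicate the repeated warning sweeps into one sorted gate list."""
    return sorted(set(UNRECOGNIZED.findall(journal_text)))

if __name__ == "__main__":
    sample = (
        "W1127 16:38:08.399693 4954 feature_gate.go:330] unrecognized feature gate: ManagedBootImages\n"
        "W1127 16:38:08.402822 4954 feature_gate.go:330] unrecognized feature gate: ManagedBootImages\n"
        "W1127 16:38:08.399734 4954 feature_gate.go:330] unrecognized feature gate: ExternalOIDC\n"
    )
    print(unique_unrecognized_gates(sample))
    # ['ExternalOIDC', 'ManagedBootImages']
```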
Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.400252 4954 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.400260 4954 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.400269 4954 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.400276 4954 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.400284 4954 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.400297 4954 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.400310 4954 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.400319 4954 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.400327 4954 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.400336 4954 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.400343 4954 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.400352 4954 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.400360 4954 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.400367 4954 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.400541 4954 flags.go:64] FLAG: --address="0.0.0.0" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.400559 4954 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.400604 4954 flags.go:64] FLAG: --anonymous-auth="true" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.400618 4954 flags.go:64] FLAG: --application-metrics-count-limit="100" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.400630 4954 flags.go:64] FLAG: --authentication-token-webhook="false" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.400640 4954 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.400652 4954 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.400662 4954 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.400672 4954 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.400682 4954 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.400691 4954 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.400702 4954 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.400712 4954 
flags.go:64] FLAG: --cgroup-driver="cgroupfs" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.400721 4954 flags.go:64] FLAG: --cgroup-root="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.400730 4954 flags.go:64] FLAG: --cgroups-per-qos="true" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.400739 4954 flags.go:64] FLAG: --client-ca-file="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.400749 4954 flags.go:64] FLAG: --cloud-config="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.400758 4954 flags.go:64] FLAG: --cloud-provider="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.400767 4954 flags.go:64] FLAG: --cluster-dns="[]" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.400779 4954 flags.go:64] FLAG: --cluster-domain="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.400789 4954 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.400798 4954 flags.go:64] FLAG: --config-dir="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.400807 4954 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.400818 4954 flags.go:64] FLAG: --container-log-max-files="5" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.400829 4954 flags.go:64] FLAG: --container-log-max-size="10Mi" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.400838 4954 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.400847 4954 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.400857 4954 flags.go:64] FLAG: --containerd-namespace="k8s.io" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.400866 4954 flags.go:64] FLAG: --contention-profiling="false" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.400875 4954 flags.go:64] FLAG: --cpu-cfs-quota="true" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.400884 4954 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.400894 4954 flags.go:64] FLAG: --cpu-manager-policy="none" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.400904 4954 flags.go:64] FLAG: --cpu-manager-policy-options="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.400915 4954 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.400925 4954 flags.go:64] FLAG: --enable-controller-attach-detach="true" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.400934 4954 flags.go:64] FLAG: --enable-debugging-handlers="true" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.400943 4954 flags.go:64] FLAG: --enable-load-reader="false" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.400952 4954 flags.go:64] FLAG: --enable-server="true" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.400961 4954 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.400973 4954 flags.go:64] FLAG: --event-burst="100" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.400983 4954 flags.go:64] FLAG: --event-qps="50" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.400992 4954 flags.go:64] FLAG: --event-storage-age-limit="default=0" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401001 4954 flags.go:64] FLAG: 
--event-storage-event-limit="default=0" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401009 4954 flags.go:64] FLAG: --eviction-hard="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401021 4954 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401029 4954 flags.go:64] FLAG: --eviction-minimum-reclaim="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401040 4954 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401050 4954 flags.go:64] FLAG: --eviction-soft="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401059 4954 flags.go:64] FLAG: --eviction-soft-grace-period="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401068 4954 flags.go:64] FLAG: --exit-on-lock-contention="false" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401079 4954 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401088 4954 flags.go:64] FLAG: --experimental-mounter-path="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401097 4954 flags.go:64] FLAG: --fail-cgroupv1="false" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401106 4954 flags.go:64] FLAG: --fail-swap-on="true" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401114 4954 flags.go:64] FLAG: --feature-gates="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401126 4954 flags.go:64] FLAG: --file-check-frequency="20s" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401135 4954 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401144 4954 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401153 4954 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401163 4954 flags.go:64] FLAG: --healthz-port="10248" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401172 4954 flags.go:64] FLAG: --help="false" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401181 4954 flags.go:64] FLAG: --hostname-override="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401189 4954 flags.go:64] FLAG: --housekeeping-interval="10s" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401199 4954 flags.go:64] FLAG: --http-check-frequency="20s" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401208 4954 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401217 4954 flags.go:64] FLAG: --image-credential-provider-config="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401226 4954 flags.go:64] FLAG: --image-gc-high-threshold="85" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401235 4954 flags.go:64] FLAG: --image-gc-low-threshold="80" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401244 4954 flags.go:64] FLAG: --image-service-endpoint="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401253 4954 flags.go:64] FLAG: --kernel-memcg-notification="false" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401262 4954 flags.go:64] FLAG: --kube-api-burst="100" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401271 4954 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401281 4954 flags.go:64] 
FLAG: --kube-api-qps="50" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401289 4954 flags.go:64] FLAG: --kube-reserved="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401298 4954 flags.go:64] FLAG: --kube-reserved-cgroup="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401307 4954 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401317 4954 flags.go:64] FLAG: --kubelet-cgroups="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401326 4954 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401335 4954 flags.go:64] FLAG: --lock-file="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401343 4954 flags.go:64] FLAG: --log-cadvisor-usage="false" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401352 4954 flags.go:64] FLAG: --log-flush-frequency="5s" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401362 4954 flags.go:64] FLAG: --log-json-info-buffer-size="0" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401375 4954 flags.go:64] FLAG: --log-json-split-stream="false" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401385 4954 flags.go:64] FLAG: --log-text-info-buffer-size="0" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401394 4954 flags.go:64] FLAG: --log-text-split-stream="false" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401403 4954 flags.go:64] FLAG: --logging-format="text" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401412 4954 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401422 4954 flags.go:64] FLAG: --make-iptables-util-chains="true" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401432 4954 flags.go:64] FLAG: --manifest-url="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401441 4954 flags.go:64] FLAG: --manifest-url-header="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401452 4954 flags.go:64] FLAG: --max-housekeeping-interval="15s" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401462 4954 flags.go:64] FLAG: --max-open-files="1000000" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401473 4954 flags.go:64] FLAG: --max-pods="110" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401482 4954 flags.go:64] FLAG: --maximum-dead-containers="-1" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401491 4954 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401500 4954 flags.go:64] FLAG: --memory-manager-policy="None" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401509 4954 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401519 4954 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401529 4954 flags.go:64] FLAG: --node-ip="192.168.126.11" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401539 4954 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401561 4954 flags.go:64] FLAG: --node-status-max-images="50" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401571 4954 flags.go:64] FLAG: --node-status-update-frequency="10s" Nov 27 16:38:08 crc 
kubenswrapper[4954]: I1127 16:38:08.401608 4954 flags.go:64] FLAG: --oom-score-adj="-999" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401617 4954 flags.go:64] FLAG: --pod-cidr="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401626 4954 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401640 4954 flags.go:64] FLAG: --pod-manifest-path="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401649 4954 flags.go:64] FLAG: --pod-max-pids="-1" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401659 4954 flags.go:64] FLAG: --pods-per-core="0" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401667 4954 flags.go:64] FLAG: --port="10250" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401677 4954 flags.go:64] FLAG: --protect-kernel-defaults="false" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401686 4954 flags.go:64] FLAG: --provider-id="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401694 4954 flags.go:64] FLAG: --qos-reserved="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401704 4954 flags.go:64] FLAG: --read-only-port="10255" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401713 4954 flags.go:64] FLAG: --register-node="true" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401722 4954 flags.go:64] FLAG: --register-schedulable="true" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401731 4954 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401746 4954 flags.go:64] FLAG: --registry-burst="10" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401755 4954 flags.go:64] FLAG: --registry-qps="5" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401764 4954 flags.go:64] FLAG: --reserved-cpus="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401774 4954 flags.go:64] FLAG: --reserved-memory="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401785 4954 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401794 4954 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401805 4954 flags.go:64] FLAG: --rotate-certificates="false" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401814 4954 flags.go:64] FLAG: --rotate-server-certificates="false" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401823 4954 flags.go:64] FLAG: --runonce="false" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401832 4954 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401841 4954 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401850 4954 flags.go:64] FLAG: --seccomp-default="false" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401860 4954 flags.go:64] FLAG: --serialize-image-pulls="true" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401869 4954 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401878 4954 flags.go:64] FLAG: --storage-driver-db="cadvisor" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401887 4954 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Nov 27 
16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401896 4954 flags.go:64] FLAG: --storage-driver-password="root" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401905 4954 flags.go:64] FLAG: --storage-driver-secure="false" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401914 4954 flags.go:64] FLAG: --storage-driver-table="stats" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401923 4954 flags.go:64] FLAG: --storage-driver-user="root" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401932 4954 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401941 4954 flags.go:64] FLAG: --sync-frequency="1m0s" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401951 4954 flags.go:64] FLAG: --system-cgroups="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401960 4954 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401974 4954 flags.go:64] FLAG: --system-reserved-cgroup="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401983 4954 flags.go:64] FLAG: --tls-cert-file="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.401991 4954 flags.go:64] FLAG: --tls-cipher-suites="[]" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.402002 4954 flags.go:64] FLAG: --tls-min-version="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.402011 4954 flags.go:64] FLAG: --tls-private-key-file="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.402020 4954 flags.go:64] FLAG: --topology-manager-policy="none" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.402028 4954 flags.go:64] FLAG: --topology-manager-policy-options="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.402038 4954 flags.go:64] FLAG: --topology-manager-scope="container" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.402048 4954 flags.go:64] FLAG: --v="2" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.402060 4954 flags.go:64] FLAG: --version="false" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.402072 4954 flags.go:64] FLAG: --vmodule="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.402082 4954 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.402092 4954 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.403780 4954 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.419157 4954 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.419214 4954 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
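After the warning sweep, the I-level feature_gate.go:386 entry above records the gate map the kubelet actually applied: only upstream Kubernetes gates survive, e.g. KMSv1=true and ValidatingAdmissionPolicy=true. A minimal sketch, assuming the {map[Name:bool ...]} rendering shown in the log (Go's fmt output for a map, not a supported interface), that parses that summary into a Python dict:

```python
import re

def parse_feature_gate_summary(entry: str) -> dict[str, bool]:
    """Parse a `feature gates: {map[Name:bool ...]}` log entry into a dict.

    Assumes the exact shape seen in the feature_gate.go:386 entries above;
    this is a convenience for reading logs, not a stable format.
    """
    body = re.search(r"feature gates: \{map\[(.*)\]\}", entry)
    if body is None:
        raise ValueError("no feature gate summary found")
    gates: dict[str, bool] = {}
    for pair in body.group(1).split():
        name, _, value = pair.partition(":")
        gates[name] = value == "true"
    return gates

if __name__ == "__main__":
    entry = ("I1127 16:38:08.403780 4954 feature_gate.go:386] feature gates: "
             "{map[CloudDualStackNodeIPs:true KMSv1:true NodeSwap:false]}")
    print(parse_feature_gate_summary(entry))
    # {'CloudDualStackNodeIPs': True, 'KMSv1': True, 'NodeSwap': False}
```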
Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.422234 4954 server.go:940] "Client rotation is on, will bootstrap in background" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.427801 4954 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.427958 4954 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.430622 4954 server.go:997] "Starting client certificate rotation" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.430673 4954 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.431818 4954 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-11 20:37:34.060785086 +0000 UTC Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.431978 4954 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.456080 4954 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.459421 4954 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 27 16:38:08 crc kubenswrapper[4954]: E1127 16:38:08.463329 4954 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.204:6443: connect: connection refused" logger="UnhandledError" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.478113 4954 log.go:25] "Validated CRI v1 runtime API" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.519180 4954 log.go:25] "Validated CRI v1 image API" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.521969 4954 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.528232 4954 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-11-27-16-32-36-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.528277 4954 fs.go:134] Filesystem partitions:
map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.561950 4954 manager.go:217] Machine: {Timestamp:2025-11-27 16:38:08.558636292 +0000 UTC m=+0.576076682 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:03003ca2-7417-4e94-98d9-1cf03e475029 BootID:070a8e98-7cab-4ad3-b09c-67172438041d Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:0f:89:fc Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:0f:89:fc Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:5b:d2:03 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:1b:00:eb Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:db:20:6c Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:49:97:1a Speed:-1 Mtu:1496} {Name:eth10 MacAddress:96:b1:0a:8d:90:4b Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:0a:e9:c4:b3:c2:9d Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] 
UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.562392 4954 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
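[annotation] At this point cAdvisor has inventoried the VM: 12 vCPUs exposed as 12 single-core sockets (typical of a QEMU/KVM guest), MemoryCapacity 33654128640 bytes (~31.3 GiB), one 200 GiB virtio disk (vda), and the OVN/OVS network devices (br-ex, br-int, ovn-k8s-mp0). Below is a minimal, hypothetical triage sketch — not kubelet or cAdvisor code; the truncated `line` literal is copied from the entry above — showing how one might pull those machine facts out of such a journal line:

```go
// Hypothetical helper for log triage (assumption: not part of any real tool).
// Extracts core count and memory capacity from a "Machine: {...}" entry.
package main

import (
	"fmt"
	"regexp"
	"strconv"
)

var (
	reCores = regexp.MustCompile(`NumCores:(\d+)`)
	reMem   = regexp.MustCompile(`MemoryCapacity:(\d+)`)
)

func main() {
	// Truncated copy of the "manager.go:217] Machine: {...}" entry above.
	line := "manager.go:217] Machine: {CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0}"
	cores, _ := strconv.Atoi(reCores.FindStringSubmatch(line)[1])
	memBytes, _ := strconv.ParseInt(reMem.FindStringSubmatch(line)[1], 10, 64)
	// Prints: vCPUs: 12, memory: 31.3 GiB
	fmt.Printf("vCPUs: %d, memory: %.1f GiB\n", cores, float64(memBytes)/(1<<30))
}
```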
Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.562706 4954 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.563237 4954 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.563534 4954 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.563624 4954 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.563990 4954 topology_manager.go:138] "Creating topology manager with none policy" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.564012 4954 container_manager_linux.go:303] "Creating device plugin manager" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.565110 4954 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.565186 4954 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.566167 4954 state_mem.go:36] "Initialized new in-memory state store" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.566608 4954 server.go:1245] "Using root directory" path="/var/lib/kubelet" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.571486 4954 kubelet.go:418] "Attempting to sync node with API server" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.571525 4954 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" 
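[annotation] The Container Manager entry above records the node's resource policy in one JSON blob: systemd cgroup driver on cgroup v2, SystemReserved of 200m CPU / 350Mi memory / 350Mi ephemeral-storage, a 4096-pid limit per pod, and five hard-eviction thresholds. A minimal sketch follows — hypothetical triage code, not kubelet internals; the struct shapes merely mirror the fields visible in the logged nodeConfig — that decodes two of those thresholds verbatim:

```go
// Hypothetical decoder for the nodeConfig fragment logged by
// container_manager_linux.go (assumption: field shapes inferred from the log).
package main

import (
	"encoding/json"
	"fmt"
)

type value struct {
	Quantity   *string `json:"Quantity"` // e.g. "100Mi"; null when a percentage is used
	Percentage float64 `json:"Percentage"`
}

type threshold struct {
	Signal   string `json:"Signal"`
	Operator string `json:"Operator"`
	Value    value  `json:"Value"`
}

func main() {
	// Two of the five HardEvictionThresholds from the entry above, verbatim.
	frag := `{"HardEvictionThresholds":[
	  {"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0}},
	  {"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1}}]}`
	var cfg struct {
		HardEvictionThresholds []threshold
	}
	if err := json.Unmarshal([]byte(frag), &cfg); err != nil {
		panic(err)
	}
	for _, t := range cfg.HardEvictionThresholds {
		if t.Value.Quantity != nil {
			fmt.Printf("evict when %s %s %s\n", t.Signal, t.Operator, *t.Value.Quantity)
		} else {
			fmt.Printf("evict when %s %s %.0f%%\n", t.Signal, t.Operator, t.Value.Percentage*100)
		}
	}
}
```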
Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.571612 4954 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.571639 4954 kubelet.go:324] "Adding apiserver pod source" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.571662 4954 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.576536 4954 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.578003 4954 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.580057 4954 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.580419 4954 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused Nov 27 16:38:08 crc kubenswrapper[4954]: E1127 16:38:08.580613 4954 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.204:6443: connect: connection refused" logger="UnhandledError" Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.580444 4954 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused Nov 27 16:38:08 crc kubenswrapper[4954]: E1127 16:38:08.580810 4954 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.204:6443: connect: connection refused" logger="UnhandledError" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.582035 4954 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.582084 4954 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.582101 4954 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.582121 4954 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.582146 4954 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.582160 4954 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.582174 4954 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.582196 4954 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/downward-api" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.582234 4954 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.582250 4954 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.582279 4954 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.582293 4954 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.583400 4954 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.584417 4954 server.go:1280] "Started kubelet" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.585520 4954 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.585636 4954 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.585638 4954 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Nov 27 16:38:08 crc systemd[1]: Started Kubernetes Kubelet. Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.587099 4954 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.588078 4954 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.588235 4954 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.588346 4954 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 15:58:44.660940971 +0000 UTC Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.588475 4954 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 311h20m36.072470282s for next certificate rotation Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.588445 4954 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.588408 4954 volume_manager.go:287] "The desired_state_of_world populator starts" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.588725 4954 volume_manager.go:289] "Starting Kubelet Volume Manager" Nov 27 16:38:08 crc kubenswrapper[4954]: E1127 16:38:08.588390 4954 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.594519 4954 server.go:460] "Adding debug handlers to kubelet server" Nov 27 16:38:08 crc kubenswrapper[4954]: E1127 16:38:08.596226 4954 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="200ms" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.596956 4954 factory.go:55] Registering 
systemd factory Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.597108 4954 factory.go:221] Registration of the systemd container factory successfully Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.597634 4954 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused Nov 27 16:38:08 crc kubenswrapper[4954]: E1127 16:38:08.597786 4954 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.204:6443: connect: connection refused" logger="UnhandledError" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.598449 4954 factory.go:153] Registering CRI-O factory Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.598561 4954 factory.go:221] Registration of the crio container factory successfully Nov 27 16:38:08 crc kubenswrapper[4954]: E1127 16:38:08.596042 4954 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.204:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187bea79f5f89973 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-27 16:38:08.584333683 +0000 UTC m=+0.601774013,LastTimestamp:2025-11-27 16:38:08.584333683 +0000 UTC m=+0.601774013,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.598962 4954 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.602301 4954 factory.go:103] Registering Raw factory Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.602353 4954 manager.go:1196] Started watching for new ooms in manager Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.604425 4954 manager.go:319] Starting recovery of all containers Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.616016 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.616089 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.616113 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.616134 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.616158 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.616179 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.616198 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.616218 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.616376 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.616400 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.616426 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.616448 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.616466 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.616491 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.616558 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.616604 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.616667 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.616686 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.616705 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.616725 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.616743 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.616763 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.616785 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.616804 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.616827 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.616890 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.616951 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.616971 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.616996 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.617016 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.617037 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.617057 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.617078 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.617097 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.617116 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.617137 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.617157 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.617174 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.617193 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.617213 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.617231 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.617252 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.617273 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.617294 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.617313 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.617335 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.617357 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.617426 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.617451 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.617470 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.617489 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.617508 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.617545 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.617567 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.617616 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.617638 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.617657 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.617677 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.617699 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.617719 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.617739 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.617759 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.617779 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.617802 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.617823 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.617843 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.617863 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.617882 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.617940 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.617959 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.617979 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.617997 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.618016 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.618036 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.618055 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.618074 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.618095 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.618115 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.618136 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.618158 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" 
volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.618226 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.618246 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.618264 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.618284 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.618304 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.618325 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.618408 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.618463 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.618485 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.618520 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.618541 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.618560 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.618652 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.618681 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.618701 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.618720 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.618742 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.618761 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.618780 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.618809 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.618829 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.618859 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.618901 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.618922 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.618952 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.618974 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.618996 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.619018 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.619039 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.619062 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.619084 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.619105 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.619128 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.621638 4954 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.621698 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.621724 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.621746 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.621767 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.621789 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.621809 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.621831 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.621857 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.621878 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.621899 4954 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.621920 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.621941 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.621961 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.621981 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.622003 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.622023 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.622043 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.622063 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.622085 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.622123 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.622143 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.622164 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.622184 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.622204 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.622226 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.622250 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.622270 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.622291 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.622311 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.622331 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.622354 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.622373 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.622395 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.622416 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.622434 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.622456 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.622478 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.622498 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.622517 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.622541 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.622561 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.622607 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.622631 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.622652 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.622674 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.622694 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.622717 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.622738 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.622760 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.622779 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.622800 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.622820 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.622842 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.622863 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.622916 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.622937 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.622970 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.622989 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.623009 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.623035 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.623103 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.623127 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.623150 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.623171 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.623192 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.623211 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.623232 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.623252 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.623271 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.623294 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.623314 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.623346 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.623366 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.623386 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.623406 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.623426 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.623446 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.623467 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.623489 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.623510 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.623529 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.623548 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.623572 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.623625 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.623646 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.623666 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.623684 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.623705 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.623724 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.623743 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.623763 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.623784 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.623806 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.623825 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.623846 4954 reconstruct.go:97] "Volume reconstruction finished" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.623859 4954 reconciler.go:26] "Reconciler: start to sync state" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.643367 4954 manager.go:324] Recovery completed Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.657179 4954 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.660132 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.660651 4954 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.660756 4954 status_manager.go:217] "Starting to sync pod status with apiserver" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.660806 4954 kubelet.go:2335] "Starting kubelet main sync loop" Nov 27 16:38:08 crc kubenswrapper[4954]: E1127 16:38:08.660910 4954 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Nov 27 16:38:08 crc kubenswrapper[4954]: W1127 16:38:08.662149 4954 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused Nov 27 16:38:08 crc kubenswrapper[4954]: E1127 16:38:08.662267 4954 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.204:6443: connect: connection refused" logger="UnhandledError" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.662517 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.662666 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.662762 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.665123 4954 cpu_manager.go:225] "Starting CPU manager" policy="none" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.665167 4954 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.665208 4954 state_mem.go:36] "Initialized new in-memory state store" Nov 27 16:38:08 crc kubenswrapper[4954]: E1127 16:38:08.690754 4954 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.692836 4954 policy_none.go:49] "None policy: Start" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.694042 4954 memory_manager.go:170] "Starting memorymanager" policy="None" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.694096 4954 state_mem.go:35] "Initializing new in-memory state store" Nov 27 16:38:08 crc kubenswrapper[4954]: E1127 16:38:08.761927 4954 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.771896 4954 manager.go:334] "Starting Device Plugin manager" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.771957 4954 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.771977 4954 server.go:79] "Starting device plugin registration server" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.772517 4954 eviction_manager.go:189] "Eviction manager: starting control loop" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.772547 4954 container_log_manager.go:189] "Initializing container log rotate workers" 
workers=1 monitorPeriod="10s" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.772818 4954 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.772994 4954 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.773017 4954 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Nov 27 16:38:08 crc kubenswrapper[4954]: E1127 16:38:08.781376 4954 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 27 16:38:08 crc kubenswrapper[4954]: E1127 16:38:08.797750 4954 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="400ms" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.873118 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.874678 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.874759 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.874784 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.874829 4954 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 27 16:38:08 crc kubenswrapper[4954]: E1127 16:38:08.875712 4954 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.204:6443: connect: connection refused" node="crc" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.962153 4954 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"] Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.962312 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.965567 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.965674 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.965700 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.965966 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.966481 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.966559 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.967293 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.967338 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.967354 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.967491 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.968000 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.968065 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.968102 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.968151 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.968170 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.968749 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.968804 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.968828 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.969087 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.969262 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.969319 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.969912 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.969955 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.969972 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.970783 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.970807 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.970870 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.970894 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.970833 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.971031 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.971204 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.971333 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.971386 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.972426 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.972458 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.972498 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.972524 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.972468 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.972598 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.972805 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.972842 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.973829 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.973864 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:08 crc kubenswrapper[4954]: I1127 16:38:08.973881 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.031745 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.031813 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.031853 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.031938 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.031972 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.032005 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.032038 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 27 16:38:09 
crc kubenswrapper[4954]: I1127 16:38:09.032068 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.032103 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.032178 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.032285 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.032331 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.032366 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.032423 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.032455 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.076649 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.078845 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.078911 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 
16:38:09.078935 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.078974 4954 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 27 16:38:09 crc kubenswrapper[4954]: E1127 16:38:09.079549 4954 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.204:6443: connect: connection refused" node="crc" Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.133693 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.133778 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.133822 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.133880 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.133930 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.133973 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.134042 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.134067 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 
27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.134153 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.134168 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.134103 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.134085 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.134176 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.134088 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.134278 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.134304 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.134404 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.134338 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 
Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.134379 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.134522 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.134389 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.134674 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.134863 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.134774 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.134980 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.135053 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.135079 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.135087 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.135141 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.135220 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Nov 27 16:38:09 crc kubenswrapper[4954]: E1127 16:38:09.199460 4954 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="800ms"
Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.312320 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.337804 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.347002 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Nov 27 16:38:09 crc kubenswrapper[4954]: W1127 16:38:09.364235 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-4edcbae9fd431a59a4e2adce4d0d640ef970a191459273d9d7717c2131881b7b WatchSource:0}: Error finding container 4edcbae9fd431a59a4e2adce4d0d640ef970a191459273d9d7717c2131881b7b: Status 404 returned error can't find the container with id 4edcbae9fd431a59a4e2adce4d0d640ef970a191459273d9d7717c2131881b7b
Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.369373 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.375735 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 27 16:38:09 crc kubenswrapper[4954]: W1127 16:38:09.388611 4954 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused
Nov 27 16:38:09 crc kubenswrapper[4954]: E1127 16:38:09.388781 4954 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.204:6443: connect: connection refused" logger="UnhandledError"
Nov 27 16:38:09 crc kubenswrapper[4954]: W1127 16:38:09.388855 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-64954bd155906738795199e4fbdcacfc1f9c21c14eafdf6a37089a5cf0be3b06 WatchSource:0}: Error finding container 64954bd155906738795199e4fbdcacfc1f9c21c14eafdf6a37089a5cf0be3b06: Status 404 returned error can't find the container with id 64954bd155906738795199e4fbdcacfc1f9c21c14eafdf6a37089a5cf0be3b06
Nov 27 16:38:09 crc kubenswrapper[4954]: W1127 16:38:09.394450 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-9e8f258b8f508a461f9ab8b72610aac04728b05ff400db78812792f7501e7007 WatchSource:0}: Error finding container 9e8f258b8f508a461f9ab8b72610aac04728b05ff400db78812792f7501e7007: Status 404 returned error can't find the container with id 9e8f258b8f508a461f9ab8b72610aac04728b05ff400db78812792f7501e7007
Nov 27 16:38:09 crc kubenswrapper[4954]: W1127 16:38:09.401632 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-c34db4501e3ae2ea08841bc5e8075a24614f71f381c107798baeae0c1cb1f14f WatchSource:0}: Error finding container c34db4501e3ae2ea08841bc5e8075a24614f71f381c107798baeae0c1cb1f14f: Status 404 returned error can't find the container with id c34db4501e3ae2ea08841bc5e8075a24614f71f381c107798baeae0c1cb1f14f
Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.480094 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.482058 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.482141 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.482163 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.482213 4954 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Nov 27 16:38:09 crc kubenswrapper[4954]: E1127 16:38:09.483105 4954 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.204:6443: connect: connection refused" node="crc"
Nov 27 16:38:09 crc kubenswrapper[4954]: W1127 16:38:09.496547 4954 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused
Nov 27 16:38:09 crc kubenswrapper[4954]: E1127 16:38:09.496696 4954 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.204:6443: connect: connection refused" logger="UnhandledError"
Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.587447 4954 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused
Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.668362 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9e8f258b8f508a461f9ab8b72610aac04728b05ff400db78812792f7501e7007"}
Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.669982 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"64954bd155906738795199e4fbdcacfc1f9c21c14eafdf6a37089a5cf0be3b06"}
Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.671603 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"57e10117f222f6539be9e0bf3e61fb6447d3505e8af90e87ebe128ed687e2faa"}
Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.673491 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"4edcbae9fd431a59a4e2adce4d0d640ef970a191459273d9d7717c2131881b7b"}
Nov 27 16:38:09 crc kubenswrapper[4954]: I1127 16:38:09.674941 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c34db4501e3ae2ea08841bc5e8075a24614f71f381c107798baeae0c1cb1f14f"}
Nov 27 16:38:09 crc kubenswrapper[4954]: W1127 16:38:09.760102 4954 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused
Nov 27 16:38:09 crc kubenswrapper[4954]: E1127 16:38:09.760228 4954 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.204:6443: connect: connection refused" logger="UnhandledError"
Nov 27 16:38:09 crc kubenswrapper[4954]: W1127 16:38:09.935380 4954 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused
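Each kubenswrapper message carries a klog header: the leading letter is severity (I/W/E/F), the four digits are the date (1127 = Nov 27), followed by time, PID, and source file:line. A short sketch that splits a line on that convention (the regexp is mine, written against the format seen here, not taken from klog):

package main

import (
	"fmt"
	"regexp"
)

func main() {
	// klog header: <severity><MMDD> <HH:MM:SS.micros> <pid> <file>:<line>] <msg>
	re := regexp.MustCompile(`^([IWEF])(\d{4}) (\S+)\s+(\d+) ([^\]]+)\] (.*)$`)
	line := `E1127 16:38:09.199460 4954 controller.go:145] "Failed to ensure lease exists, will retry" interval="800ms"`
	if m := re.FindStringSubmatch(line); m != nil {
		fmt.Printf("severity=%s date=%s time=%s pid=%s src=%s msg=%s\n",
			m[1], m[2], m[3], m[4], m[5], m[6])
	}
}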
"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused Nov 27 16:38:09 crc kubenswrapper[4954]: E1127 16:38:09.935481 4954 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.204:6443: connect: connection refused" logger="UnhandledError" Nov 27 16:38:10 crc kubenswrapper[4954]: E1127 16:38:10.001380 4954 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="1.6s" Nov 27 16:38:10 crc kubenswrapper[4954]: I1127 16:38:10.283733 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:38:10 crc kubenswrapper[4954]: I1127 16:38:10.286549 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:10 crc kubenswrapper[4954]: I1127 16:38:10.286650 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:10 crc kubenswrapper[4954]: I1127 16:38:10.286711 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:10 crc kubenswrapper[4954]: I1127 16:38:10.286758 4954 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 27 16:38:10 crc kubenswrapper[4954]: E1127 16:38:10.287257 4954 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.204:6443: connect: connection refused" node="crc" Nov 27 16:38:10 crc kubenswrapper[4954]: I1127 16:38:10.576251 4954 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Nov 27 16:38:10 crc kubenswrapper[4954]: E1127 16:38:10.578310 4954 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.204:6443: connect: connection refused" logger="UnhandledError" Nov 27 16:38:10 crc kubenswrapper[4954]: I1127 16:38:10.587349 4954 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused Nov 27 16:38:10 crc kubenswrapper[4954]: I1127 16:38:10.681385 4954 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71" exitCode=0 Nov 27 16:38:10 crc kubenswrapper[4954]: I1127 16:38:10.681515 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:38:10 crc kubenswrapper[4954]: I1127 16:38:10.681601 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71"} Nov 27 16:38:10 crc kubenswrapper[4954]: I1127 16:38:10.682756 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:10 crc kubenswrapper[4954]: I1127 16:38:10.682816 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:10 crc kubenswrapper[4954]: I1127 16:38:10.682837 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:10 crc kubenswrapper[4954]: I1127 16:38:10.685500 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:38:10 crc kubenswrapper[4954]: I1127 16:38:10.685677 4954 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="c6b0e9735b6fd4369b346da75538210db64d7c8a55caddc45caf8ee1afefdaed" exitCode=0 Nov 27 16:38:10 crc kubenswrapper[4954]: I1127 16:38:10.685805 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"c6b0e9735b6fd4369b346da75538210db64d7c8a55caddc45caf8ee1afefdaed"} Nov 27 16:38:10 crc kubenswrapper[4954]: I1127 16:38:10.685870 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:38:10 crc kubenswrapper[4954]: I1127 16:38:10.686773 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:10 crc kubenswrapper[4954]: I1127 16:38:10.686813 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:10 crc kubenswrapper[4954]: I1127 16:38:10.686831 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:10 crc kubenswrapper[4954]: I1127 16:38:10.688303 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:10 crc kubenswrapper[4954]: I1127 16:38:10.688374 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:10 crc kubenswrapper[4954]: I1127 16:38:10.688398 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:10 crc kubenswrapper[4954]: I1127 16:38:10.689270 4954 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="e2186713e39ca754bb90eb1f84bc523cef94288510d11244c45267085d2f9918" exitCode=0 Nov 27 16:38:10 crc kubenswrapper[4954]: I1127 16:38:10.689385 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"e2186713e39ca754bb90eb1f84bc523cef94288510d11244c45267085d2f9918"} Nov 27 16:38:10 crc kubenswrapper[4954]: I1127 16:38:10.689438 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:38:10 crc kubenswrapper[4954]: I1127 16:38:10.691298 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:10 crc kubenswrapper[4954]: I1127 16:38:10.691349 4954 
Nov 27 16:38:10 crc kubenswrapper[4954]: I1127 16:38:10.691371 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:38:10 crc kubenswrapper[4954]: I1127 16:38:10.694220 4954 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="11dd7fc77c9df494e9dac3fd605b1dc7a342fe3fe853a18260a68d29f82738e2" exitCode=0
Nov 27 16:38:10 crc kubenswrapper[4954]: I1127 16:38:10.694320 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 27 16:38:10 crc kubenswrapper[4954]: I1127 16:38:10.694354 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"11dd7fc77c9df494e9dac3fd605b1dc7a342fe3fe853a18260a68d29f82738e2"}
Nov 27 16:38:10 crc kubenswrapper[4954]: I1127 16:38:10.695284 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:38:10 crc kubenswrapper[4954]: I1127 16:38:10.695319 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:38:10 crc kubenswrapper[4954]: I1127 16:38:10.695337 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:38:10 crc kubenswrapper[4954]: I1127 16:38:10.698632 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b5f6e2fcbd93a30e7357a367e184a6f5c6c1af83f618e0fd0d724e51ba71ea08"}
Nov 27 16:38:10 crc kubenswrapper[4954]: I1127 16:38:10.698723 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6dbb0d73cb9bddb6148625592ed1aac95ead1e2349f92fb8aba36ec714ed618e"}
Nov 27 16:38:11 crc kubenswrapper[4954]: I1127 16:38:11.587181 4954 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused
Nov 27 16:38:11 crc kubenswrapper[4954]: E1127 16:38:11.602457 4954 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="3.2s"
Nov 27 16:38:11 crc kubenswrapper[4954]: I1127 16:38:11.705879 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fc6a464ca56934b2a1b4e31b921d34c3f57d9aacbd965746db957882d36527e1"}
Nov 27 16:38:11 crc kubenswrapper[4954]: I1127 16:38:11.705957 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7a1ddaf55a730a8e5a53ecff0eef2afd9786d3f249ac18b7b3e3e6649b65fe45"}
Nov 27 16:38:11 crc kubenswrapper[4954]: I1127 16:38:11.705992 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 27 16:38:11 crc kubenswrapper[4954]: I1127 16:38:11.707341 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:38:11 crc kubenswrapper[4954]: I1127 16:38:11.707409 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:38:11 crc kubenswrapper[4954]: I1127 16:38:11.707427 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:38:11 crc kubenswrapper[4954]: I1127 16:38:11.709492 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f"}
Nov 27 16:38:11 crc kubenswrapper[4954]: I1127 16:38:11.709539 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a"}
Nov 27 16:38:11 crc kubenswrapper[4954]: I1127 16:38:11.709551 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac"}
Nov 27 16:38:11 crc kubenswrapper[4954]: I1127 16:38:11.709564 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690"}
Nov 27 16:38:11 crc kubenswrapper[4954]: I1127 16:38:11.711978 4954 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="0efb6c01ffae194cdaefc2f530578ff62c518ac316d380ba140ed9022d16c4d2" exitCode=0
Nov 27 16:38:11 crc kubenswrapper[4954]: I1127 16:38:11.712042 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"0efb6c01ffae194cdaefc2f530578ff62c518ac316d380ba140ed9022d16c4d2"}
Nov 27 16:38:11 crc kubenswrapper[4954]: I1127 16:38:11.712115 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 27 16:38:11 crc kubenswrapper[4954]: I1127 16:38:11.712930 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:38:11 crc kubenswrapper[4954]: I1127 16:38:11.713031 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:38:11 crc kubenswrapper[4954]: I1127 16:38:11.713053 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:38:11 crc kubenswrapper[4954]: I1127 16:38:11.715384 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"589ee698e003ae1938fae963deb0288be15549fc6efd55fb72e0d40ee3ca325d"}
Nov 27 16:38:11 crc kubenswrapper[4954]: I1127 16:38:11.715444 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
"Setting node annotation to enable volume controller attach/detach" Nov 27 16:38:11 crc kubenswrapper[4954]: I1127 16:38:11.716317 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:11 crc kubenswrapper[4954]: I1127 16:38:11.716374 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:11 crc kubenswrapper[4954]: I1127 16:38:11.716390 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:11 crc kubenswrapper[4954]: I1127 16:38:11.723768 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7209ac0080d25aaf9cfaba43b4cb35e5c36f015b52469a211b65f4a53a2dcd23"} Nov 27 16:38:11 crc kubenswrapper[4954]: I1127 16:38:11.723821 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"28bc02faf2534dbf38fbc116fb6b51a528297719f7de0f40d1c9374199391eac"} Nov 27 16:38:11 crc kubenswrapper[4954]: I1127 16:38:11.723835 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c94982bc29f0ee44235509ce47bb0790994962a450b2e27e418f351a3643d885"} Nov 27 16:38:11 crc kubenswrapper[4954]: I1127 16:38:11.723944 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:38:11 crc kubenswrapper[4954]: I1127 16:38:11.725306 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:11 crc kubenswrapper[4954]: I1127 16:38:11.725352 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:11 crc kubenswrapper[4954]: I1127 16:38:11.725365 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:11 crc kubenswrapper[4954]: I1127 16:38:11.888224 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:38:11 crc kubenswrapper[4954]: I1127 16:38:11.889657 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:11 crc kubenswrapper[4954]: I1127 16:38:11.889701 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:11 crc kubenswrapper[4954]: I1127 16:38:11.889711 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:11 crc kubenswrapper[4954]: I1127 16:38:11.889739 4954 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 27 16:38:11 crc kubenswrapper[4954]: E1127 16:38:11.890145 4954 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.204:6443: connect: connection refused" node="crc" Nov 27 16:38:12 crc kubenswrapper[4954]: W1127 16:38:12.114245 4954 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused Nov 27 16:38:12 crc kubenswrapper[4954]: E1127 16:38:12.114352 4954 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.204:6443: connect: connection refused" logger="UnhandledError" Nov 27 16:38:12 crc kubenswrapper[4954]: I1127 16:38:12.732464 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a"} Nov 27 16:38:12 crc kubenswrapper[4954]: I1127 16:38:12.732665 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:38:12 crc kubenswrapper[4954]: I1127 16:38:12.734520 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:12 crc kubenswrapper[4954]: I1127 16:38:12.734604 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:12 crc kubenswrapper[4954]: I1127 16:38:12.734626 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:12 crc kubenswrapper[4954]: I1127 16:38:12.737144 4954 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="5fab823058e85eb94204e6fa329fc8e36c641692effe6b2f3b839603b20469d9" exitCode=0 Nov 27 16:38:12 crc kubenswrapper[4954]: I1127 16:38:12.737331 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:38:12 crc kubenswrapper[4954]: I1127 16:38:12.737410 4954 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 27 16:38:12 crc kubenswrapper[4954]: I1127 16:38:12.737457 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:38:12 crc kubenswrapper[4954]: I1127 16:38:12.737382 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"5fab823058e85eb94204e6fa329fc8e36c641692effe6b2f3b839603b20469d9"} Nov 27 16:38:12 crc kubenswrapper[4954]: I1127 16:38:12.737339 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:38:12 crc kubenswrapper[4954]: I1127 16:38:12.737339 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:38:12 crc kubenswrapper[4954]: I1127 16:38:12.739099 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:12 crc kubenswrapper[4954]: I1127 16:38:12.739166 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:12 crc kubenswrapper[4954]: I1127 16:38:12.739178 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:12 crc kubenswrapper[4954]: I1127 16:38:12.739787 4954 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:12 crc kubenswrapper[4954]: I1127 16:38:12.739871 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:12 crc kubenswrapper[4954]: I1127 16:38:12.739803 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:12 crc kubenswrapper[4954]: I1127 16:38:12.739948 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:12 crc kubenswrapper[4954]: I1127 16:38:12.739900 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:12 crc kubenswrapper[4954]: I1127 16:38:12.739802 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:12 crc kubenswrapper[4954]: I1127 16:38:12.740057 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:12 crc kubenswrapper[4954]: I1127 16:38:12.740094 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:12 crc kubenswrapper[4954]: I1127 16:38:12.739992 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:13 crc kubenswrapper[4954]: I1127 16:38:13.174929 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 16:38:13 crc kubenswrapper[4954]: I1127 16:38:13.184928 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 16:38:13 crc kubenswrapper[4954]: I1127 16:38:13.659017 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 16:38:13 crc kubenswrapper[4954]: I1127 16:38:13.747570 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c64e6b878f3b3fc07f3d1da583ab58275128f6245af3a210ceb1c26f1a1ff5be"} Nov 27 16:38:13 crc kubenswrapper[4954]: I1127 16:38:13.747635 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2203b7bdc1c2088da728c611b5ef1b09f941858d234c5ba743263ff28bba1163"} Nov 27 16:38:13 crc kubenswrapper[4954]: I1127 16:38:13.747647 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0760dbd24b54c98ec2ebbf49620cde29dd66ace993d3068fd0bfbb0ca756e5be"} Nov 27 16:38:13 crc kubenswrapper[4954]: I1127 16:38:13.747703 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:38:13 crc kubenswrapper[4954]: I1127 16:38:13.747764 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:38:13 crc kubenswrapper[4954]: I1127 16:38:13.748236 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:38:13 crc kubenswrapper[4954]: I1127 16:38:13.748272 4954 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 16:38:13 crc kubenswrapper[4954]: I1127 16:38:13.748558 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:13 crc kubenswrapper[4954]: I1127 16:38:13.748640 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:13 crc kubenswrapper[4954]: I1127 16:38:13.748664 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:13 crc kubenswrapper[4954]: I1127 16:38:13.749508 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:13 crc kubenswrapper[4954]: I1127 16:38:13.749537 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:13 crc kubenswrapper[4954]: I1127 16:38:13.749547 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:14 crc kubenswrapper[4954]: I1127 16:38:14.760224 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"490d3c30e50e33bb03150cd49a396d7390dbc82c84abe5d2f0df358bc049fa24"} Nov 27 16:38:14 crc kubenswrapper[4954]: I1127 16:38:14.760361 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:38:14 crc kubenswrapper[4954]: I1127 16:38:14.760380 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:38:14 crc kubenswrapper[4954]: I1127 16:38:14.760363 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7f757aaf68effbbd8832b09df680a327ee78bd981cde5f9fb8b718bd1c7875c2"} Nov 27 16:38:14 crc kubenswrapper[4954]: I1127 16:38:14.761096 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:38:14 crc kubenswrapper[4954]: I1127 16:38:14.762210 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:14 crc kubenswrapper[4954]: I1127 16:38:14.762250 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:14 crc kubenswrapper[4954]: I1127 16:38:14.762273 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:14 crc kubenswrapper[4954]: I1127 16:38:14.762286 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:14 crc kubenswrapper[4954]: I1127 16:38:14.762346 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:14 crc kubenswrapper[4954]: I1127 16:38:14.762294 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:14 crc kubenswrapper[4954]: I1127 16:38:14.762778 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:14 crc kubenswrapper[4954]: I1127 16:38:14.762820 4954 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:14 crc kubenswrapper[4954]: I1127 16:38:14.762840 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:14 crc kubenswrapper[4954]: I1127 16:38:14.973697 4954 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Nov 27 16:38:15 crc kubenswrapper[4954]: I1127 16:38:15.090352 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:38:15 crc kubenswrapper[4954]: I1127 16:38:15.092093 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:15 crc kubenswrapper[4954]: I1127 16:38:15.092149 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:15 crc kubenswrapper[4954]: I1127 16:38:15.092167 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:15 crc kubenswrapper[4954]: I1127 16:38:15.092203 4954 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 27 16:38:15 crc kubenswrapper[4954]: I1127 16:38:15.278967 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 16:38:15 crc kubenswrapper[4954]: I1127 16:38:15.764477 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:38:15 crc kubenswrapper[4954]: I1127 16:38:15.764689 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:38:15 crc kubenswrapper[4954]: I1127 16:38:15.766073 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:15 crc kubenswrapper[4954]: I1127 16:38:15.766116 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:15 crc kubenswrapper[4954]: I1127 16:38:15.766127 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:15 crc kubenswrapper[4954]: I1127 16:38:15.766162 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:15 crc kubenswrapper[4954]: I1127 16:38:15.766207 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:15 crc kubenswrapper[4954]: I1127 16:38:15.766220 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:16 crc kubenswrapper[4954]: I1127 16:38:16.300917 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:38:16 crc kubenswrapper[4954]: I1127 16:38:16.301208 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:38:16 crc kubenswrapper[4954]: I1127 16:38:16.303868 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:16 crc kubenswrapper[4954]: I1127 16:38:16.303939 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:16 crc kubenswrapper[4954]: I1127 16:38:16.303958 
4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:16 crc kubenswrapper[4954]: I1127 16:38:16.659367 4954 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 27 16:38:16 crc kubenswrapper[4954]: I1127 16:38:16.659480 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Nov 27 16:38:17 crc kubenswrapper[4954]: I1127 16:38:17.010287 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 27 16:38:17 crc kubenswrapper[4954]: I1127 16:38:17.010659 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:38:17 crc kubenswrapper[4954]: I1127 16:38:17.012531 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:17 crc kubenswrapper[4954]: I1127 16:38:17.012625 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:17 crc kubenswrapper[4954]: I1127 16:38:17.012645 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:17 crc kubenswrapper[4954]: I1127 16:38:17.309825 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:38:17 crc kubenswrapper[4954]: I1127 16:38:17.310095 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:38:17 crc kubenswrapper[4954]: I1127 16:38:17.312415 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:17 crc kubenswrapper[4954]: I1127 16:38:17.312475 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:17 crc kubenswrapper[4954]: I1127 16:38:17.312492 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:18 crc kubenswrapper[4954]: I1127 16:38:18.345566 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Nov 27 16:38:18 crc kubenswrapper[4954]: I1127 16:38:18.345846 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:38:18 crc kubenswrapper[4954]: I1127 16:38:18.347474 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:18 crc kubenswrapper[4954]: I1127 16:38:18.347540 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:18 crc kubenswrapper[4954]: I1127 16:38:18.347552 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
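The startup-probe failure text above, "net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)", is the exact error Go's net/http produces when an http.Client's Timeout elapses before response headers arrive: the cluster-policy-controller's healthz listener at 192.168.126.11:10357 is simply not answering yet. A reduced reproduction of that probe shape (the endpoint is from the log; the timeout value and InsecureSkipVerify are my illustrative assumptions, not kubelet's prober code):

package main

import (
	"crypto/tls"
	"fmt"
	"net/http"
	"time"
)

func main() {
	client := &http.Client{
		Timeout: 1 * time.Second, // probe timeouts are short; exact value assumed
		Transport: &http.Transport{
			TLSClientConfig: &tls.Config{InsecureSkipVerify: true}, // self-signed listener
		},
	}
	resp, err := client.Get("https://192.168.126.11:10357/healthz")
	if err != nil {
		// With nothing answering in time, err contains the same
		// "Client.Timeout exceeded while awaiting headers" text as the log.
		fmt.Println("probe failed:", err)
		return
	}
	defer resp.Body.Close()
	fmt.Println("healthz status:", resp.StatusCode)
}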
event="NodeHasSufficientPID" Nov 27 16:38:18 crc kubenswrapper[4954]: E1127 16:38:18.781826 4954 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 27 16:38:20 crc kubenswrapper[4954]: I1127 16:38:20.230365 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 16:38:20 crc kubenswrapper[4954]: I1127 16:38:20.230875 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:38:20 crc kubenswrapper[4954]: I1127 16:38:20.232624 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:20 crc kubenswrapper[4954]: I1127 16:38:20.232691 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:20 crc kubenswrapper[4954]: I1127 16:38:20.232716 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:22 crc kubenswrapper[4954]: W1127 16:38:22.259456 4954 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Nov 27 16:38:22 crc kubenswrapper[4954]: I1127 16:38:22.259654 4954 trace.go:236] Trace[882522407]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (27-Nov-2025 16:38:12.258) (total time: 10001ms): Nov 27 16:38:22 crc kubenswrapper[4954]: Trace[882522407]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (16:38:22.259) Nov 27 16:38:22 crc kubenswrapper[4954]: Trace[882522407]: [10.001462014s] [10.001462014s] END Nov 27 16:38:22 crc kubenswrapper[4954]: E1127 16:38:22.259702 4954 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Nov 27 16:38:22 crc kubenswrapper[4954]: I1127 16:38:22.482825 4954 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:43886->192.168.126.11:17697: read: connection reset by peer" start-of-body= Nov 27 16:38:22 crc kubenswrapper[4954]: I1127 16:38:22.482941 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:43886->192.168.126.11:17697: read: connection reset by peer" Nov 27 16:38:22 crc kubenswrapper[4954]: I1127 16:38:22.587462 4954 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Nov 27 16:38:22 crc kubenswrapper[4954]: W1127 16:38:22.715088 4954 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Nov 27 16:38:22 crc kubenswrapper[4954]: I1127 16:38:22.715228 4954 trace.go:236] Trace[1028518891]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (27-Nov-2025 16:38:12.713) (total time: 10001ms): Nov 27 16:38:22 crc kubenswrapper[4954]: Trace[1028518891]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (16:38:22.715) Nov 27 16:38:22 crc kubenswrapper[4954]: Trace[1028518891]: [10.001650448s] [10.001650448s] END Nov 27 16:38:22 crc kubenswrapper[4954]: E1127 16:38:22.715265 4954 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Nov 27 16:38:22 crc kubenswrapper[4954]: I1127 16:38:22.789388 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 27 16:38:22 crc kubenswrapper[4954]: I1127 16:38:22.791277 4954 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a" exitCode=255 Nov 27 16:38:22 crc kubenswrapper[4954]: I1127 16:38:22.791365 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a"} Nov 27 16:38:22 crc kubenswrapper[4954]: I1127 16:38:22.791647 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:38:22 crc kubenswrapper[4954]: I1127 16:38:22.792779 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:22 crc kubenswrapper[4954]: I1127 16:38:22.792828 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:22 crc kubenswrapper[4954]: I1127 16:38:22.792846 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:22 crc kubenswrapper[4954]: I1127 16:38:22.793556 4954 scope.go:117] "RemoveContainer" containerID="1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a" Nov 27 16:38:23 crc kubenswrapper[4954]: W1127 16:38:23.076473 4954 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Nov 27 16:38:23 crc kubenswrapper[4954]: I1127 16:38:23.076612 4954 trace.go:236] Trace[1019562173]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (27-Nov-2025 16:38:13.075) (total time: 10001ms): Nov 27 16:38:23 crc kubenswrapper[4954]: Trace[1019562173]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 
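The failure mode has shifted here: the earlier lists died instantly with "connection refused", while these run for just over 10s (the 10001ms trace totals are the client-side list elapsing) and end in "net/http: TLS handshake timeout". In other words, the apiserver's port now accepts TCP connections but cannot yet complete TLS handshakes while it starts up. A sketch showing that distinction at the TLS layer (timeout value is mine; certificate verification disabled purely for illustration):

package main

import (
	"crypto/tls"
	"fmt"
	"net"
	"time"
)

func main() {
	// The dialer timeout covers both TCP connect and the TLS handshake.
	dialer := &net.Dialer{Timeout: 10 * time.Second}
	conn, err := tls.DialWithDialer(dialer, "tcp", "api-int.crc.testing:6443",
		&tls.Config{InsecureSkipVerify: true})
	if err != nil {
		// "connect: connection refused" => nothing listening at all;
		// a handshake timeout => port open but TLS not completing yet.
		fmt.Println("TLS dial failed:", err)
		return
	}
	conn.Close()
	fmt.Println("TLS handshake completed")
}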
Nov 27 16:38:23 crc kubenswrapper[4954]: Trace[1019562173]: [10.001480176s] [10.001480176s] END
Nov 27 16:38:23 crc kubenswrapper[4954]: E1127 16:38:23.076643 4954 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Nov 27 16:38:23 crc kubenswrapper[4954]: I1127 16:38:23.135596 4954 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Nov 27 16:38:23 crc kubenswrapper[4954]: I1127 16:38:23.135677 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Nov 27 16:38:23 crc kubenswrapper[4954]: I1127 16:38:23.158481 4954 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Nov 27 16:38:23 crc kubenswrapper[4954]: I1127 16:38:23.158542 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Nov 27 16:38:23 crc kubenswrapper[4954]: I1127 16:38:23.797304 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Nov 27 16:38:23 crc kubenswrapper[4954]: I1127 16:38:23.800080 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583"}
Nov 27 16:38:23 crc kubenswrapper[4954]: I1127 16:38:23.800311 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 27 16:38:23 crc kubenswrapper[4954]: I1127 16:38:23.801624 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:38:23 crc kubenswrapper[4954]: I1127 16:38:23.801682 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:38:23 crc kubenswrapper[4954]: I1127 16:38:23.801696 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:38:23 crc kubenswrapper[4954]: I1127 16:38:23.964455 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Nov 27 16:38:23 crc kubenswrapper[4954]: I1127 16:38:23.964671 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
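The 403 on /livez is progress, not regression: the apiserver is now terminating TLS and evaluating authorization, it just has not yet set up the RBAC that permits system:anonymous to read /livez, and the kubelet's HTTP probe sends no credentials. Any non-2xx status counts as a probe failure. A sketch of that unauthenticated check (host taken from the log's API endpoint; verification disabled for illustration only):

package main

import (
	"crypto/tls"
	"fmt"
	"io"
	"net/http"
)

func main() {
	client := &http.Client{Transport: &http.Transport{
		TLSClientConfig: &tls.Config{InsecureSkipVerify: true},
	}}
	// No bearer token or client certificate attached, so the apiserver
	// attributes the request to system:anonymous.
	resp, err := client.Get("https://api-int.crc.testing:6443/livez")
	if err != nil {
		fmt.Println("request failed:", err)
		return
	}
	defer resp.Body.Close()
	body, _ := io.ReadAll(resp.Body)
	// Expect 403 with the Status JSON seen in the log until RBAC is in place.
	fmt.Println(resp.StatusCode, string(body))
}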
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:38:23 crc kubenswrapper[4954]: I1127 16:38:23.966049 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:23 crc kubenswrapper[4954]: I1127 16:38:23.966116 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:23 crc kubenswrapper[4954]: I1127 16:38:23.966135 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:24 crc kubenswrapper[4954]: I1127 16:38:24.019756 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Nov 27 16:38:24 crc kubenswrapper[4954]: I1127 16:38:24.803401 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:38:24 crc kubenswrapper[4954]: I1127 16:38:24.805638 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:24 crc kubenswrapper[4954]: I1127 16:38:24.805717 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:24 crc kubenswrapper[4954]: I1127 16:38:24.805741 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:24 crc kubenswrapper[4954]: I1127 16:38:24.824429 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Nov 27 16:38:25 crc kubenswrapper[4954]: I1127 16:38:25.807036 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:38:25 crc kubenswrapper[4954]: I1127 16:38:25.808722 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:25 crc kubenswrapper[4954]: I1127 16:38:25.808782 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:25 crc kubenswrapper[4954]: I1127 16:38:25.808804 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:26 crc kubenswrapper[4954]: I1127 16:38:26.308840 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:38:26 crc kubenswrapper[4954]: I1127 16:38:26.309064 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:38:26 crc kubenswrapper[4954]: I1127 16:38:26.309171 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:38:26 crc kubenswrapper[4954]: I1127 16:38:26.310656 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:26 crc kubenswrapper[4954]: I1127 16:38:26.310707 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:26 crc kubenswrapper[4954]: I1127 16:38:26.310718 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:26 crc kubenswrapper[4954]: I1127 16:38:26.315331 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 
16:38:26 crc kubenswrapper[4954]: I1127 16:38:26.660129 4954 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 27 16:38:26 crc kubenswrapper[4954]: I1127 16:38:26.660259 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Nov 27 16:38:26 crc kubenswrapper[4954]: I1127 16:38:26.810551 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:38:26 crc kubenswrapper[4954]: I1127 16:38:26.812307 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:26 crc kubenswrapper[4954]: I1127 16:38:26.812355 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:26 crc kubenswrapper[4954]: I1127 16:38:26.812386 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:27 crc kubenswrapper[4954]: I1127 16:38:27.319134 4954 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Nov 27 16:38:27 crc kubenswrapper[4954]: I1127 16:38:27.582416 4954 apiserver.go:52] "Watching apiserver" Nov 27 16:38:27 crc kubenswrapper[4954]: I1127 16:38:27.589301 4954 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Nov 27 16:38:27 crc kubenswrapper[4954]: I1127 16:38:27.589639 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Nov 27 16:38:27 crc kubenswrapper[4954]: I1127 16:38:27.590206 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 27 16:38:27 crc kubenswrapper[4954]: I1127 16:38:27.590329 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:38:27 crc kubenswrapper[4954]: E1127 16:38:27.590405 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:38:27 crc kubenswrapper[4954]: I1127 16:38:27.590419 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 27 16:38:27 crc kubenswrapper[4954]: I1127 16:38:27.590553 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 27 16:38:27 crc kubenswrapper[4954]: I1127 16:38:27.590691 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:38:27 crc kubenswrapper[4954]: E1127 16:38:27.590824 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:38:27 crc kubenswrapper[4954]: I1127 16:38:27.590957 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:38:27 crc kubenswrapper[4954]: E1127 16:38:27.591021 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:38:27 crc kubenswrapper[4954]: I1127 16:38:27.593219 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Nov 27 16:38:27 crc kubenswrapper[4954]: I1127 16:38:27.593803 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Nov 27 16:38:27 crc kubenswrapper[4954]: I1127 16:38:27.594196 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Nov 27 16:38:27 crc kubenswrapper[4954]: I1127 16:38:27.594505 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Nov 27 16:38:27 crc kubenswrapper[4954]: I1127 16:38:27.594637 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Nov 27 16:38:27 crc kubenswrapper[4954]: I1127 16:38:27.594865 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Nov 27 16:38:27 crc kubenswrapper[4954]: I1127 16:38:27.595073 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Nov 27 16:38:27 crc kubenswrapper[4954]: I1127 16:38:27.595100 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Nov 27 16:38:27 crc kubenswrapper[4954]: I1127 16:38:27.595760 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Nov 27 16:38:27 crc kubenswrapper[4954]: I1127 16:38:27.628421 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 16:38:27 crc kubenswrapper[4954]: I1127 16:38:27.641993 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 16:38:27 crc kubenswrapper[4954]: I1127 16:38:27.653338 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 16:38:27 crc kubenswrapper[4954]: I1127 16:38:27.666886 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 16:38:27 crc kubenswrapper[4954]: I1127 16:38:27.681244 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 16:38:27 crc kubenswrapper[4954]: I1127 16:38:27.689888 4954 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Nov 27 16:38:27 crc kubenswrapper[4954]: I1127 16:38:27.696071 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 16:38:27 crc kubenswrapper[4954]: I1127 16:38:27.707975 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 16:38:27 crc kubenswrapper[4954]: I1127 16:38:27.718102 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 16:38:27 crc kubenswrapper[4954]: I1127 16:38:27.734752 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 16:38:27 crc kubenswrapper[4954]: I1127 16:38:27.805410 4954 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Nov 27 16:38:27 crc kubenswrapper[4954]: I1127 16:38:27.833056 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 27 16:38:28 crc kubenswrapper[4954]: E1127 16:38:28.120799 4954 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.126065 4954 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.126351 4954 trace.go:236] Trace[378144284]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (27-Nov-2025 16:38:17.301) (total time: 10824ms): Nov 27 16:38:28 crc kubenswrapper[4954]: Trace[378144284]: ---"Objects listed" error: 10824ms (16:38:28.125) Nov 27 16:38:28 crc kubenswrapper[4954]: Trace[378144284]: [10.824627741s] [10.824627741s] END Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.126405 4954 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Nov 27 16:38:28 crc kubenswrapper[4954]: E1127 16:38:28.126352 4954 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.134472 4954 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.226914 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.226967 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: 
\"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.227111 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.227137 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.227156 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.227209 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.227236 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.227255 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.227273 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.227296 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.227314 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.227332 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.227379 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.227404 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.227391 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.227448 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.227467 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.227488 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.227515 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.227535 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.227552 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 27 16:38:28 crc 
kubenswrapper[4954]: I1127 16:38:28.227568 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.227629 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.227647 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.227667 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.227673 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.227697 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.227724 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.227744 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 27 16:38:28 crc kubenswrapper[4954]: E1127 16:38:28.227787 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:38:28.72775793 +0000 UTC m=+20.745198240 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.227821 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.227865 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.227891 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.227918 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.227947 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.227974 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.228001 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.228025 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.228049 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod 
\"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.228056 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.228078 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.228108 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.228132 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.228184 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.228210 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.228232 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.228275 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.228282 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.228300 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.228327 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.228351 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.228374 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.228396 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.228420 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.228445 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.228452 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.228468 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.228619 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.228675 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.228668 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.228743 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.228813 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.228847 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.228896 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.228911 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.228988 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.229028 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.229119 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.229141 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.229188 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.229216 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.229229 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.229300 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.229369 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.229406 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.229419 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.229427 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.229473 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.229541 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.229613 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.229652 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.229722 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.229822 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.229890 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.229932 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.230000 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.230073 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod 
\"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.230194 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.230265 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.230304 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.230549 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.230636 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.230706 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.230746 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.230826 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.230866 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.230934 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod 
\"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.231318 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.231403 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.231475 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.231515 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.231651 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.231697 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.231766 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.231832 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.231928 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.231969 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.232038 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.232116 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.232820 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.232988 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.233030 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.233066 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.233104 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.233144 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.233181 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.233216 4954 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.233253 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.233294 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.233334 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.233372 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.233406 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.233440 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.233476 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.233512 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.233547 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: 
\"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.233606 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.233644 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.233684 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.233719 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.233754 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.233787 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.233826 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.233864 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.233900 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.233935 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.233973 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.234011 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.242518 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.242962 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.243000 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.243062 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.243192 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.243226 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.243286 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.243320 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.243344 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.243369 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.243396 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.243420 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.243447 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.243615 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.243692 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.243749 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.243829 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.243896 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.243938 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.243963 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.244148 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.244181 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.244205 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.244275 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.244467 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.244498 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.244546 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.244592 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.244641 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.244689 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.244733 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.244859 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.244892 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.244916 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.245002 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.245070 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.245115 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.245184 4954 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.245234 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.245278 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.245314 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.245393 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.245892 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.245929 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.245967 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.245996 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.229479 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). 
InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.251785 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.229682 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.229725 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.229869 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.229934 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.230092 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.230124 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.230161 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.230307 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.230500 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.230712 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.230730 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.230895 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.231290 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.231425 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.231516 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.231507 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.231715 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.252481 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.233220 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.233549 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.233611 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.233746 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.233858 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.233956 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.234281 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.234389 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.234558 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.234725 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.234863 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.234884 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.252731 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.235132 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.235054 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.235684 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.235783 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.235998 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.236090 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.236171 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.236319 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.236351 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.236468 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.236620 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.236855 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.236938 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.237066 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.237119 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.237133 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.237343 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.237471 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.237682 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.237885 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.237984 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.238680 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.238713 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). 
InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.238814 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.238939 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.238963 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.238987 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.242793 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.243178 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.243260 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.243423 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). 
InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.243889 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.243990 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.244453 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.244514 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.244539 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.244725 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.244820 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.245213 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.245249 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.245478 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.245728 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.245796 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.246639 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.246653 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.246938 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.247059 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.247419 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.247449 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.247724 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.247819 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.247911 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.248354 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.248800 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.249084 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.249192 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.249284 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.249615 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.249651 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.249990 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.250058 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.250244 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.250374 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.250903 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.251255 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.251288 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.251444 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.251636 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.251750 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.251937 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.252358 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.252822 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.252733 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.252850 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.252907 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.252953 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.253113 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.253500 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.253556 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.253862 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.254160 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.254649 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.254706 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.254742 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.254775 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.254805 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.254828 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.254852 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.254884 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.254906 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.254967 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 27 
16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.255003 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.255015 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.255037 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.255066 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.255091 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.255173 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.255310 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.255417 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.255765 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.255907 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.256095 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.256140 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.256146 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.256196 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.256235 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.256278 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.256335 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.256370 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.256237 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.256398 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.256641 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.256711 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.256746 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.256781 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.256783 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.256810 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.256849 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.256874 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.256899 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.256926 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.256958 4954 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.256981 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257012 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257040 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257064 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257069 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257141 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257169 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257255 4954 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257268 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257280 4954 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257291 4954 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257301 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257311 4954 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257323 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257333 4954 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257344 4954 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257356 4954 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257367 4954 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257378 4954 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257388 4954 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257400 4954 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257409 4954 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257420 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257431 4954 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257441 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257451 4954 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257460 4954 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257469 4954 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc 
kubenswrapper[4954]: I1127 16:38:28.257479 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257492 4954 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257505 4954 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257517 4954 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257528 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257540 4954 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257551 4954 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257562 4954 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257590 4954 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257603 4954 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257615 4954 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257627 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257636 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257648 
4954 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257658 4954 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257667 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257677 4954 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257686 4954 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257697 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257706 4954 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257715 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257724 4954 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257736 4954 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257752 4954 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257763 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257775 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.258279 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.258389 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.258706 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.258962 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.257786 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.259194 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.259205 4954 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.259215 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.259224 4954 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.259233 4954 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.259241 4954 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.259251 4954 
reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.259260 4954 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.259268 4954 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.259277 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.259285 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.259294 4954 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.259302 4954 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.259311 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.259320 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.259328 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.259337 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.259347 4954 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.259357 4954 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 
16:38:28.259368 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.259377 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.259359 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.259390 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.259473 4954 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.259512 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.259562 4954 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.259635 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.259674 4954 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.259706 4954 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.259802 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.259822 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.259860 4954 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.259891 4954 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.259922 4954 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.259951 4954 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.259981 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.260010 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.260040 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.260033 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.260071 4954 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.260105 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.260135 4954 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.260166 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.260196 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.260227 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.260257 4954 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.260288 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.260318 4954 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.260354 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.260384 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.260413 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.260442 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: 
\"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.260473 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.260503 4954 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.260538 4954 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.260555 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.260570 4954 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.260642 4954 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.260671 4954 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.260701 4954 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.260730 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.260759 4954 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.260788 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.260817 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: 
\"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.260845 4954 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.260873 4954 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.260905 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.260934 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.260961 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.260990 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.261019 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.261046 4954 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.261078 4954 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.261109 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.261141 4954 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.261172 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.261202 4954 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.261231 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.261261 4954 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.261290 4954 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.261319 4954 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.261350 4954 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.261380 4954 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.261409 4954 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.261439 4954 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.261469 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.261498 4954 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.261512 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.261528 4954 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.261618 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.263449 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.263492 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.263519 4954 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.263693 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.264038 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: E1127 16:38:28.267184 4954 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.267290 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: E1127 16:38:28.267321 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 16:38:28.767283639 +0000 UTC m=+20.784723969 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.267735 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.267951 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.268611 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.268795 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.269013 4954 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.269320 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.269339 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.270957 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.270951 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.271266 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.270951 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.271069 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.271076 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: E1127 16:38:28.271437 4954 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 27 16:38:28 crc kubenswrapper[4954]: E1127 16:38:28.271531 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 16:38:28.771496228 +0000 UTC m=+20.788936538 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.271699 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.271959 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.272474 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.272574 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.276065 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.276060 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.277263 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.285375 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.285633 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.285727 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.286056 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: E1127 16:38:28.286138 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 27 16:38:28 crc kubenswrapper[4954]: E1127 16:38:28.286150 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 27 16:38:28 crc kubenswrapper[4954]: E1127 16:38:28.286165 4954 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:38:28 crc kubenswrapper[4954]: E1127 16:38:28.286228 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-27 16:38:28.786204684 +0000 UTC m=+20.803644984 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.287411 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.287507 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.287879 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.288967 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.289475 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.289673 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.290084 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.290338 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.290406 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.290446 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.290458 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.290509 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.291700 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.292100 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.292148 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 27 16:38:28 crc kubenswrapper[4954]: E1127 16:38:28.292373 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 27 16:38:28 crc kubenswrapper[4954]: E1127 16:38:28.292482 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 27 16:38:28 crc kubenswrapper[4954]: E1127 16:38:28.292571 4954 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:38:28 crc kubenswrapper[4954]: E1127 16:38:28.292745 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-27 16:38:28.792727687 +0000 UTC m=+20.810167987 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.292965 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.293218 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.293212 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.293267 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.293430 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.293553 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.295230 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.296468 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.293562 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.299903 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.300955 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.310082 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.315498 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.317016 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.334522 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.336527 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.362300 4954 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.364746 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.364787 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.364832 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.364843 4954 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.364854 4954 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.364864 4954 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.364874 4954 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.364884 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.364893 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.364902 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.364911 4954 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 27 
16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.364920 4954 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.364928 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.364937 4954 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.364945 4954 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.364955 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.364964 4954 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.364972 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.364981 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.364958 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.364992 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.365051 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.365090 4954 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.365116 4954 
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.365148 4954 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\""
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.365162 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\""
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.365175 4954 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\""
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.365185 4954 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\""
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.365195 4954 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\""
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.365207 4954 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.365232 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\""
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.365242 4954 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\""
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.365252 4954 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\""
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.365264 4954 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\""
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.365274 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\""
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.365302 4954 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.365313 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\""
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.365324 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\""
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.365333 4954 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\""
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.365343 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\""
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.365354 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.365363 4954 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.365387 4954 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\""
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.365397 4954 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\""
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.365407 4954 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\""
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.365417 4954 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\""
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.365427 4954 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.365437 4954 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\""
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.365462 4954 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\""
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.365471 4954 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.365481 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\""
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.365491 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\""
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.365500 4954 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.365512 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\""
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.365534 4954 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\""
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.365544 4954 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.365553 4954 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.365564 4954 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\""
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.365604 4954 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.365616 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\""
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.365627 4954 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.365638 4954 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.508099 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.517031 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.524406 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Nov 27 16:38:28 crc kubenswrapper[4954]: W1127 16:38:28.524789 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-220f0c4b09622e2f5f9131e6d346d91e8006f835e78325d06aac129973860fa5 WatchSource:0}: Error finding container 220f0c4b09622e2f5f9131e6d346d91e8006f835e78325d06aac129973860fa5: Status 404 returned error can't find the container with id 220f0c4b09622e2f5f9131e6d346d91e8006f835e78325d06aac129973860fa5
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.666637 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes"
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.667221 4954 csr.go:261] certificate signing request csr-vbn2k is approved, waiting to be issued
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.667794 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes"
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.669462 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes"
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.670381 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes"
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.671821 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes"
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.672573 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes"
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.673384 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes"
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.674747 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes"
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.675722 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes"
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.676510 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.676864 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes"
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.677424 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes"
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.679509 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes"
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.680076 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes"
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.680655 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes"
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.682160 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes"
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.682795 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes"
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.683766 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes"
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.684194 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes"
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.684818 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes"
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.686374 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes"
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.686883 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes"
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.688047 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes"
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.688548 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes"
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.689950 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes"
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.690389 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes"
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.692757 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes"
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.693989 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes"
Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.694513 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes"
path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.694551 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.695752 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.696226 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.696714 4954 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.696821 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.700472 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.701205 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.702232 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.704002 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.704757 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.705689 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" 
path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.706327 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.707451 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.707948 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.708609 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.710262 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.711393 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.711940 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.713034 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.713655 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.713738 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.715200 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.716025 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.716550 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.717042 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.718028 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.718720 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.719666 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.722655 4954 csr.go:257] certificate signing request csr-vbn2k is issued Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.748136 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.769042 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.769182 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:38:28 crc kubenswrapper[4954]: E1127 16:38:28.769342 4954 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 27 16:38:28 crc kubenswrapper[4954]: E1127 16:38:28.769415 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert 
podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 16:38:29.76939419 +0000 UTC m=+21.786834490 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 27 16:38:28 crc kubenswrapper[4954]: E1127 16:38:28.769533 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:38:29.769524633 +0000 UTC m=+21.786964933 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.781912 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b7cd63-bb9a-4c77-b67a-e72adc26393a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:38:22Z\\\",\\\"message\\\":\\\"W1127 16:38:11.939802 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 
16:38:11.940051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764261491 cert, and key in /tmp/serving-cert-2393175808/serving-signer.crt, /tmp/serving-cert-2393175808/serving-signer.key\\\\nI1127 16:38:12.073962 1 observer_polling.go:159] Starting file observer\\\\nW1127 16:38:12.077982 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 16:38:12.078373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:38:12.081926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2393175808/tls.crt::/tmp/serving-cert-2393175808/tls.key\\\\\\\"\\\\nF1127 16:38:22.478599 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.797292 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-27v67"] Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.797811 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-27v67" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.801357 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.801656 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.801803 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.826920 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.840271 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"220f0c4b09622e2f5f9131e6d346d91e8006f835e78325d06aac129973860fa5"} Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.846042 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"72cc2fd437541de22aaa3130acadd5bd1eacd2e45ef0e12d55ce1877ac1965bc"} Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.846108 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"21629670fbb8e5917d82c6833369d78fa7f37c24d201167bdafa189975cda51c"} Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.851355 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"12e66660a5cdd3e16bad0d8ecce0f788a34db9bdf217ea2936096c7bb5e14852"} Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.866031 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.870202 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5df79f3c-9df0-48a0-980f-10ecadf5efd5-hosts-file\") pod \"node-resolver-27v67\" (UID: \"5df79f3c-9df0-48a0-980f-10ecadf5efd5\") " pod="openshift-dns/node-resolver-27v67" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.870244 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.870270 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.870300 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn2f2\" (UniqueName: \"kubernetes.io/projected/5df79f3c-9df0-48a0-980f-10ecadf5efd5-kube-api-access-qn2f2\") pod \"node-resolver-27v67\" (UID: \"5df79f3c-9df0-48a0-980f-10ecadf5efd5\") " pod="openshift-dns/node-resolver-27v67" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.870319 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:38:28 crc kubenswrapper[4954]: E1127 16:38:28.870420 4954 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 27 16:38:28 crc kubenswrapper[4954]: E1127 16:38:28.870434 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 27 16:38:28 crc kubenswrapper[4954]: E1127 16:38:28.870447 4954 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:38:28 crc kubenswrapper[4954]: E1127 16:38:28.870488 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-27 16:38:29.870472535 +0000 UTC m=+21.887912835 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:38:28 crc kubenswrapper[4954]: E1127 16:38:28.870541 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 27 16:38:28 crc kubenswrapper[4954]: E1127 16:38:28.870551 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 27 16:38:28 crc kubenswrapper[4954]: E1127 16:38:28.870559 4954 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:38:28 crc kubenswrapper[4954]: E1127 16:38:28.870597 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-27 16:38:29.870591138 +0000 UTC m=+21.888031438 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:38:28 crc kubenswrapper[4954]: E1127 16:38:28.870631 4954 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 27 16:38:28 crc kubenswrapper[4954]: E1127 16:38:28.870652 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 16:38:29.870645889 +0000 UTC m=+21.888086189 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.910355 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.932018 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.956639 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-27v67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df79f3c-9df0-48a0-980f-10ecadf5efd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn2f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-27v67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.971187 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" 
(UniqueName: \"kubernetes.io/host-path/5df79f3c-9df0-48a0-980f-10ecadf5efd5-hosts-file\") pod \"node-resolver-27v67\" (UID: \"5df79f3c-9df0-48a0-980f-10ecadf5efd5\") " pod="openshift-dns/node-resolver-27v67" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.971337 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qn2f2\" (UniqueName: \"kubernetes.io/projected/5df79f3c-9df0-48a0-980f-10ecadf5efd5-kube-api-access-qn2f2\") pod \"node-resolver-27v67\" (UID: \"5df79f3c-9df0-48a0-980f-10ecadf5efd5\") " pod="openshift-dns/node-resolver-27v67" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.971281 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5df79f3c-9df0-48a0-980f-10ecadf5efd5-hosts-file\") pod \"node-resolver-27v67\" (UID: \"5df79f3c-9df0-48a0-980f-10ecadf5efd5\") " pod="openshift-dns/node-resolver-27v67" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.982761 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b7cd63-bb9a-4c77-b67a-e72adc26393a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:38:22Z\\\",\\\"message\\\":\\\"W1127 16:38:11.939802 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 
16:38:11.940051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764261491 cert, and key in /tmp/serving-cert-2393175808/serving-signer.crt, /tmp/serving-cert-2393175808/serving-signer.key\\\\nI1127 16:38:12.073962 1 observer_polling.go:159] Starting file observer\\\\nW1127 16:38:12.077982 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 16:38:12.078373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:38:12.081926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2393175808/tls.crt::/tmp/serving-cert-2393175808/tls.key\\\\\\\"\\\\nF1127 16:38:22.478599 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 16:38:28 crc kubenswrapper[4954]: I1127 16:38:28.995379 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn2f2\" (UniqueName: \"kubernetes.io/projected/5df79f3c-9df0-48a0-980f-10ecadf5efd5-kube-api-access-qn2f2\") pod \"node-resolver-27v67\" (UID: \"5df79f3c-9df0-48a0-980f-10ecadf5efd5\") " pod="openshift-dns/node-resolver-27v67" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.005119 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.017354 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.027549 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.037493 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.128245 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-27v67" Nov 27 16:38:29 crc kubenswrapper[4954]: W1127 16:38:29.141326 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5df79f3c_9df0_48a0_980f_10ecadf5efd5.slice/crio-92b83b63a7aca162deac2d7617c546640d515270531a37db63eb3a2f5f91663d WatchSource:0}: Error finding container 92b83b63a7aca162deac2d7617c546640d515270531a37db63eb3a2f5f91663d: Status 404 returned error can't find the container with id 92b83b63a7aca162deac2d7617c546640d515270531a37db63eb3a2f5f91663d Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.661239 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.661288 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:38:29 crc kubenswrapper[4954]: E1127 16:38:29.661488 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.661669 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:38:29 crc kubenswrapper[4954]: E1127 16:38:29.661680 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:38:29 crc kubenswrapper[4954]: E1127 16:38:29.661760 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.677682 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-699qq"] Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.678216 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-699qq" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.678750 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-9mb96"] Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.679195 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.680998 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.681129 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.680995 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.681343 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.682824 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.683417 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-cz8gx"] Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.683426 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.683563 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.683488 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.684254 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.685272 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.689275 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.689528 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.689644 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.706301 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:29Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.723458 4954 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-11-27 16:33:28 +0000 UTC, rotation deadline is 2026-09-05 10:56:19.727213446 +0000 UTC Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.723629 4954 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6762h17m50.003589991s for next certificate rotation Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.725203 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:29Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.742645 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:29Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.779140 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.779293 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-system-cni-dir\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: E1127 16:38:29.779392 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:38:31.779354555 +0000 UTC m=+23.796794855 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.779458 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-host-var-lib-kubelet\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.779517 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-multus-cni-dir\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.779572 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.779649 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-host-run-netns\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.779668 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-host-run-multus-certs\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.779694 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/33a80574-7c60-4f19-985b-3ee313cb7bcd-rootfs\") pod \"machine-config-daemon-699qq\" (UID: \"33a80574-7c60-4f19-985b-3ee313cb7bcd\") " pod="openshift-machine-config-operator/machine-config-daemon-699qq" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.779712 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-cni-binary-copy\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.779733 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-hostroot\") pod \"multus-9mb96\" 
(UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.779753 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/536fc833-8add-426d-9ed0-b63547d316e0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cz8gx\" (UID: \"536fc833-8add-426d-9ed0-b63547d316e0\") " pod="openshift-multus/multus-additional-cni-plugins-cz8gx" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.779787 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-host-run-k8s-cni-cncf-io\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.779805 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bzf4\" (UniqueName: \"kubernetes.io/projected/536fc833-8add-426d-9ed0-b63547d316e0-kube-api-access-5bzf4\") pod \"multus-additional-cni-plugins-cz8gx\" (UID: \"536fc833-8add-426d-9ed0-b63547d316e0\") " pod="openshift-multus/multus-additional-cni-plugins-cz8gx" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.779830 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/33a80574-7c60-4f19-985b-3ee313cb7bcd-proxy-tls\") pod \"machine-config-daemon-699qq\" (UID: \"33a80574-7c60-4f19-985b-3ee313cb7bcd\") " pod="openshift-machine-config-operator/machine-config-daemon-699qq" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.779850 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwzjp\" (UniqueName: \"kubernetes.io/projected/33a80574-7c60-4f19-985b-3ee313cb7bcd-kube-api-access-kwzjp\") pod \"machine-config-daemon-699qq\" (UID: \"33a80574-7c60-4f19-985b-3ee313cb7bcd\") " pod="openshift-machine-config-operator/machine-config-daemon-699qq" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.779871 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-multus-conf-dir\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.779909 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-multus-socket-dir-parent\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.779946 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-host-var-lib-cni-multus\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.779967 4954 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-multus-daemon-config\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.779986 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r96jj\" (UniqueName: \"kubernetes.io/projected/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-kube-api-access-r96jj\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.780005 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-cnibin\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.780027 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/536fc833-8add-426d-9ed0-b63547d316e0-os-release\") pod \"multus-additional-cni-plugins-cz8gx\" (UID: \"536fc833-8add-426d-9ed0-b63547d316e0\") " pod="openshift-multus/multus-additional-cni-plugins-cz8gx" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.780051 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-etc-kubernetes\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.780073 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-host-var-lib-cni-bin\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.780091 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/536fc833-8add-426d-9ed0-b63547d316e0-cnibin\") pod \"multus-additional-cni-plugins-cz8gx\" (UID: \"536fc833-8add-426d-9ed0-b63547d316e0\") " pod="openshift-multus/multus-additional-cni-plugins-cz8gx" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.780113 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/536fc833-8add-426d-9ed0-b63547d316e0-system-cni-dir\") pod \"multus-additional-cni-plugins-cz8gx\" (UID: \"536fc833-8add-426d-9ed0-b63547d316e0\") " pod="openshift-multus/multus-additional-cni-plugins-cz8gx" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.780134 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-os-release\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.780152 4954 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/536fc833-8add-426d-9ed0-b63547d316e0-cni-binary-copy\") pod \"multus-additional-cni-plugins-cz8gx\" (UID: \"536fc833-8add-426d-9ed0-b63547d316e0\") " pod="openshift-multus/multus-additional-cni-plugins-cz8gx" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.780170 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/536fc833-8add-426d-9ed0-b63547d316e0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cz8gx\" (UID: \"536fc833-8add-426d-9ed0-b63547d316e0\") " pod="openshift-multus/multus-additional-cni-plugins-cz8gx" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.780200 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/33a80574-7c60-4f19-985b-3ee313cb7bcd-mcd-auth-proxy-config\") pod \"machine-config-daemon-699qq\" (UID: \"33a80574-7c60-4f19-985b-3ee313cb7bcd\") " pod="openshift-machine-config-operator/machine-config-daemon-699qq" Nov 27 16:38:29 crc kubenswrapper[4954]: E1127 16:38:29.780384 4954 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 27 16:38:29 crc kubenswrapper[4954]: E1127 16:38:29.780448 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 16:38:31.780436911 +0000 UTC m=+23.797877211 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.787798 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:29Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.812937 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b7cd63-bb9a-4c77-b67a-e72adc26393a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:38:22Z\\\",\\\"message\\\":\\\"W1127 16:38:11.939802 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 
16:38:11.940051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764261491 cert, and key in /tmp/serving-cert-2393175808/serving-signer.crt, /tmp/serving-cert-2393175808/serving-signer.key\\\\nI1127 16:38:12.073962 1 observer_polling.go:159] Starting file observer\\\\nW1127 16:38:12.077982 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 16:38:12.078373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:38:12.081926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2393175808/tls.crt::/tmp/serving-cert-2393175808/tls.key\\\\\\\"\\\\nF1127 16:38:22.478599 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:29Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.838227 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:29Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.856135 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"9e4633bf6a24c281dffedb23b6efec6dff41b512ca353a31a32c3988b523b716"} Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.857622 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-27v67" event={"ID":"5df79f3c-9df0-48a0-980f-10ecadf5efd5","Type":"ContainerStarted","Data":"80589bef6eb84e30399c60ede88844c7917afc5bc0a051e33ac307de7670ddfd"} Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.857686 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-27v67" event={"ID":"5df79f3c-9df0-48a0-980f-10ecadf5efd5","Type":"ContainerStarted","Data":"92b83b63a7aca162deac2d7617c546640d515270531a37db63eb3a2f5f91663d"} Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.859472 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"7bd6ec80896ba1c7117ea88193af1f3b9aec353ab889d6864e0b221e4efdf428"} Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.864690 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:29Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.877943 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-27v67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df79f3c-9df0-48a0-980f-10ecadf5efd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn2f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-27v67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:29Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.881029 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-host-run-multus-certs\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.881083 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-host-run-netns\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.881110 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.881139 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/33a80574-7c60-4f19-985b-3ee313cb7bcd-rootfs\") pod \"machine-config-daemon-699qq\" (UID: \"33a80574-7c60-4f19-985b-3ee313cb7bcd\") " pod="openshift-machine-config-operator/machine-config-daemon-699qq" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.881163 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-cni-binary-copy\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.881173 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-host-run-multus-certs\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.881189 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-hostroot\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.881231 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-hostroot\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.881230 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/536fc833-8add-426d-9ed0-b63547d316e0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cz8gx\" (UID: \"536fc833-8add-426d-9ed0-b63547d316e0\") " pod="openshift-multus/multus-additional-cni-plugins-cz8gx" Nov 27 16:38:29 crc kubenswrapper[4954]: E1127 16:38:29.881382 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 27 16:38:29 crc kubenswrapper[4954]: E1127 16:38:29.881430 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 27 16:38:29 crc kubenswrapper[4954]: E1127 16:38:29.881448 4954 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.881391 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-host-run-netns\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: E1127 16:38:29.881513 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-27 16:38:31.881492406 +0000 UTC m=+23.898932706 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.881728 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/33a80574-7c60-4f19-985b-3ee313cb7bcd-rootfs\") pod \"machine-config-daemon-699qq\" (UID: \"33a80574-7c60-4f19-985b-3ee313cb7bcd\") " pod="openshift-machine-config-operator/machine-config-daemon-699qq" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.881771 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/33a80574-7c60-4f19-985b-3ee313cb7bcd-proxy-tls\") pod \"machine-config-daemon-699qq\" (UID: \"33a80574-7c60-4f19-985b-3ee313cb7bcd\") " pod="openshift-machine-config-operator/machine-config-daemon-699qq" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.881937 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-host-run-k8s-cni-cncf-io\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.882015 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bzf4\" (UniqueName: \"kubernetes.io/projected/536fc833-8add-426d-9ed0-b63547d316e0-kube-api-access-5bzf4\") pod \"multus-additional-cni-plugins-cz8gx\" (UID: \"536fc833-8add-426d-9ed0-b63547d316e0\") " pod="openshift-multus/multus-additional-cni-plugins-cz8gx" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.882100 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwzjp\" (UniqueName: \"kubernetes.io/projected/33a80574-7c60-4f19-985b-3ee313cb7bcd-kube-api-access-kwzjp\") pod \"machine-config-daemon-699qq\" (UID: \"33a80574-7c60-4f19-985b-3ee313cb7bcd\") " pod="openshift-machine-config-operator/machine-config-daemon-699qq" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.882168 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-multus-conf-dir\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.882239 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.882334 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-multus-socket-dir-parent\") pod \"multus-9mb96\" (UID: 
\"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.882404 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-host-var-lib-cni-multus\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.882469 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-cnibin\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.881787 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/536fc833-8add-426d-9ed0-b63547d316e0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cz8gx\" (UID: \"536fc833-8add-426d-9ed0-b63547d316e0\") " pod="openshift-multus/multus-additional-cni-plugins-cz8gx" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.882537 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-multus-daemon-config\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.882629 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r96jj\" (UniqueName: \"kubernetes.io/projected/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-kube-api-access-r96jj\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.882658 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/536fc833-8add-426d-9ed0-b63547d316e0-os-release\") pod \"multus-additional-cni-plugins-cz8gx\" (UID: \"536fc833-8add-426d-9ed0-b63547d316e0\") " pod="openshift-multus/multus-additional-cni-plugins-cz8gx" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.882686 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-host-var-lib-cni-bin\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.882711 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-etc-kubernetes\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.882738 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/536fc833-8add-426d-9ed0-b63547d316e0-system-cni-dir\") pod \"multus-additional-cni-plugins-cz8gx\" (UID: \"536fc833-8add-426d-9ed0-b63547d316e0\") " pod="openshift-multus/multus-additional-cni-plugins-cz8gx" Nov 27 16:38:29 crc 
kubenswrapper[4954]: I1127 16:38:29.882761 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/536fc833-8add-426d-9ed0-b63547d316e0-cnibin\") pod \"multus-additional-cni-plugins-cz8gx\" (UID: \"536fc833-8add-426d-9ed0-b63547d316e0\") " pod="openshift-multus/multus-additional-cni-plugins-cz8gx" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.882796 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.882822 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/33a80574-7c60-4f19-985b-3ee313cb7bcd-mcd-auth-proxy-config\") pod \"machine-config-daemon-699qq\" (UID: \"33a80574-7c60-4f19-985b-3ee313cb7bcd\") " pod="openshift-machine-config-operator/machine-config-daemon-699qq" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.882846 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-os-release\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.882870 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/536fc833-8add-426d-9ed0-b63547d316e0-cni-binary-copy\") pod \"multus-additional-cni-plugins-cz8gx\" (UID: \"536fc833-8add-426d-9ed0-b63547d316e0\") " pod="openshift-multus/multus-additional-cni-plugins-cz8gx" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.882898 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/536fc833-8add-426d-9ed0-b63547d316e0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cz8gx\" (UID: \"536fc833-8add-426d-9ed0-b63547d316e0\") " pod="openshift-multus/multus-additional-cni-plugins-cz8gx" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.882925 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-system-cni-dir\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.882949 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-multus-cni-dir\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.882977 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-host-var-lib-kubelet\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc 
kubenswrapper[4954]: I1127 16:38:29.882985 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-host-run-k8s-cni-cncf-io\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.883049 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-host-var-lib-kubelet\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.883090 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-etc-kubernetes\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.883125 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/536fc833-8add-426d-9ed0-b63547d316e0-system-cni-dir\") pod \"multus-additional-cni-plugins-cz8gx\" (UID: \"536fc833-8add-426d-9ed0-b63547d316e0\") " pod="openshift-multus/multus-additional-cni-plugins-cz8gx" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.883158 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/536fc833-8add-426d-9ed0-b63547d316e0-cnibin\") pod \"multus-additional-cni-plugins-cz8gx\" (UID: \"536fc833-8add-426d-9ed0-b63547d316e0\") " pod="openshift-multus/multus-additional-cni-plugins-cz8gx" Nov 27 16:38:29 crc kubenswrapper[4954]: E1127 16:38:29.883209 4954 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 27 16:38:29 crc kubenswrapper[4954]: E1127 16:38:29.883263 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 16:38:31.883248307 +0000 UTC m=+23.900688627 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.883933 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-multus-daemon-config\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.883997 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/536fc833-8add-426d-9ed0-b63547d316e0-os-release\") pod \"multus-additional-cni-plugins-cz8gx\" (UID: \"536fc833-8add-426d-9ed0-b63547d316e0\") " pod="openshift-multus/multus-additional-cni-plugins-cz8gx" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.884041 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/33a80574-7c60-4f19-985b-3ee313cb7bcd-mcd-auth-proxy-config\") pod \"machine-config-daemon-699qq\" (UID: \"33a80574-7c60-4f19-985b-3ee313cb7bcd\") " pod="openshift-machine-config-operator/machine-config-daemon-699qq" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.882940 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-host-var-lib-cni-bin\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.884144 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-multus-cni-dir\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.884242 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-os-release\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.884090 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-system-cni-dir\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.882247 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-cni-binary-copy\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.884320 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-multus-conf-dir\") pod 
\"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: E1127 16:38:29.884395 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 27 16:38:29 crc kubenswrapper[4954]: E1127 16:38:29.884420 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 27 16:38:29 crc kubenswrapper[4954]: E1127 16:38:29.884437 4954 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:38:29 crc kubenswrapper[4954]: E1127 16:38:29.884475 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-27 16:38:31.884464145 +0000 UTC m=+23.901904465 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.884751 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/536fc833-8add-426d-9ed0-b63547d316e0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cz8gx\" (UID: \"536fc833-8add-426d-9ed0-b63547d316e0\") " pod="openshift-multus/multus-additional-cni-plugins-cz8gx" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.884812 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-host-var-lib-cni-multus\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.884858 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-multus-socket-dir-parent\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.884900 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-cnibin\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.885137 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/536fc833-8add-426d-9ed0-b63547d316e0-cni-binary-copy\") pod 
\"multus-additional-cni-plugins-cz8gx\" (UID: \"536fc833-8add-426d-9ed0-b63547d316e0\") " pod="openshift-multus/multus-additional-cni-plugins-cz8gx" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.888157 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/33a80574-7c60-4f19-985b-3ee313cb7bcd-proxy-tls\") pod \"machine-config-daemon-699qq\" (UID: \"33a80574-7c60-4f19-985b-3ee313cb7bcd\") " pod="openshift-machine-config-operator/machine-config-daemon-699qq" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.892797 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a80574-7c60-4f19-985b-3ee313cb7bcd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-699qq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:29Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.899886 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r96jj\" (UniqueName: \"kubernetes.io/projected/c5bda3ef-ba2c-424a-ba4a-432053d1c40d-kube-api-access-r96jj\") pod \"multus-9mb96\" (UID: \"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\") " pod="openshift-multus/multus-9mb96" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.902304 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bzf4\" (UniqueName: \"kubernetes.io/projected/536fc833-8add-426d-9ed0-b63547d316e0-kube-api-access-5bzf4\") pod \"multus-additional-cni-plugins-cz8gx\" (UID: \"536fc833-8add-426d-9ed0-b63547d316e0\") " pod="openshift-multus/multus-additional-cni-plugins-cz8gx" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.906036 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwzjp\" (UniqueName: \"kubernetes.io/projected/33a80574-7c60-4f19-985b-3ee313cb7bcd-kube-api-access-kwzjp\") pod \"machine-config-daemon-699qq\" (UID: \"33a80574-7c60-4f19-985b-3ee313cb7bcd\") " pod="openshift-machine-config-operator/machine-config-daemon-699qq" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.908091 4954 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:29Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.930561 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b7cd63-bb9a-4c77-b67a-e72adc26393a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:38:22Z\\\",\\\"message\\\":\\\"W1127 16:38:11.939802 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 
16:38:11.940051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764261491 cert, and key in /tmp/serving-cert-2393175808/serving-signer.crt, /tmp/serving-cert-2393175808/serving-signer.key\\\\nI1127 16:38:12.073962 1 observer_polling.go:159] Starting file observer\\\\nW1127 16:38:12.077982 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 16:38:12.078373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:38:12.081926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2393175808/tls.crt::/tmp/serving-cert-2393175808/tls.key\\\\\\\"\\\\nF1127 16:38:22.478599 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:29Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.947045 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4633bf6a24c281dffedb23b6efec6dff41b512ca353a31a32c3988b523b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:29Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.960604 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-27v67" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df79f3c-9df0-48a0-980f-10ecadf5efd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80589bef6eb84e30399c60ede88844c7917afc5bc0a051e33ac307de7670ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn2f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-27v67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:29Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.979708 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9mb96" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r96jj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9mb96\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:29Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.995453 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:29Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:29 crc kubenswrapper[4954]: I1127 16:38:29.996702 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-699qq" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.005387 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-9mb96" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.013811 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bd6ec80896ba1c7117ea88193af1f3b9aec353ab889d6864e0b221e4efdf428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72cc2fd437541de22aaa3130acadd5bd1eacd2e45ef0e12d55ce1877ac1965bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:30Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.013850 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.031816 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:30Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.051219 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"536fc833-8add-426d-9ed0-b63547d316e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plu
gin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cz8gx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-27T16:38:30Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.077750 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:30Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.097560 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a80574-7c60-4f19-985b-3ee313cb7bcd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-699qq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:30Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.106295 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-d5zbp"] Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.107420 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.110717 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.116748 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.116865 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.117347 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.117388 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.117512 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.119247 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.133416 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b7cd63-bb9a-4c77-b67a-e72adc26393a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:38:22Z\\\",\\\"message\\\":\\\"W1127 16:38:11.939802 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 
16:38:11.940051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764261491 cert, and key in /tmp/serving-cert-2393175808/serving-signer.crt, /tmp/serving-cert-2393175808/serving-signer.key\\\\nI1127 16:38:12.073962 1 observer_polling.go:159] Starting file observer\\\\nW1127 16:38:12.077982 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 16:38:12.078373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:38:12.081926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2393175808/tls.crt::/tmp/serving-cert-2393175808/tls.key\\\\\\\"\\\\nF1127 16:38:22.478599 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:30Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.154104 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4633bf6a24c281dffedb23b6efec6dff41b512ca353a31a32c3988b523b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:30Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.168635 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-27v67" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df79f3c-9df0-48a0-980f-10ecadf5efd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80589bef6eb84e30399c60ede88844c7917afc5bc0a051e33ac307de7670ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn2f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-27v67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:30Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.185192 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-node-log\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.185265 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-systemd-units\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.185289 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c9c365fc-0cba-4fcf-b721-30de2b908a56-ovnkube-script-lib\") pod 
\"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.185327 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-host-cni-netd\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.185353 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c9c365fc-0cba-4fcf-b721-30de2b908a56-ovn-node-metrics-cert\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.185374 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-host-cni-bin\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.185397 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-run-openvswitch\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.185647 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-etc-openvswitch\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.185671 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.185702 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-run-ovn\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.185726 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-host-kubelet\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.185749 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-27hxv\" (UniqueName: \"kubernetes.io/projected/c9c365fc-0cba-4fcf-b721-30de2b908a56-kube-api-access-27hxv\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.185773 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-var-lib-openvswitch\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.185795 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-host-run-netns\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.185819 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-log-socket\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.185840 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c9c365fc-0cba-4fcf-b721-30de2b908a56-ovnkube-config\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.185868 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c9c365fc-0cba-4fcf-b721-30de2b908a56-env-overrides\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.185903 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-run-systemd\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.185929 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-host-slash\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.185953 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-host-run-ovn-kubernetes\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.188991 4954 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9mb96" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r96jj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-9mb96\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:30Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.206569 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:30Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.224694 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bd6ec80896ba1c7117ea88193af1f3b9aec353ab889d6864e0b221e4efdf428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72cc2fd437541de22aaa3130acadd5bd1eacd2e45ef0e12d55ce1877ac1965bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:30Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.248163 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:30Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.268847 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"536fc833-8add-426d-9ed0-b63547d316e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cz8gx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:30Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.287282 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-node-log\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.287327 4954 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-systemd-units\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.287347 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c9c365fc-0cba-4fcf-b721-30de2b908a56-ovnkube-script-lib\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.287378 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-host-cni-netd\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.287397 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c9c365fc-0cba-4fcf-b721-30de2b908a56-ovn-node-metrics-cert\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.287416 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-run-openvswitch\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.287434 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-host-cni-bin\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.287458 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-etc-openvswitch\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.287475 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.287494 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-run-ovn\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.287510 4954 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-host-kubelet\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.287528 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27hxv\" (UniqueName: \"kubernetes.io/projected/c9c365fc-0cba-4fcf-b721-30de2b908a56-kube-api-access-27hxv\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.287548 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-var-lib-openvswitch\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.287568 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-log-socket\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.287642 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c9c365fc-0cba-4fcf-b721-30de2b908a56-ovnkube-config\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.287661 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-host-run-netns\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.287676 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c9c365fc-0cba-4fcf-b721-30de2b908a56-env-overrides\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.287698 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-run-systemd\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.287722 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-host-slash\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.287739 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-host-run-ovn-kubernetes\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.287810 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-host-run-ovn-kubernetes\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.287852 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-node-log\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.287877 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-host-kubelet\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.287952 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-run-ovn\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.288038 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-systemd-units\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.288178 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-var-lib-openvswitch\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.288207 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-log-socket\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.288493 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-host-cni-netd\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.289010 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-etc-openvswitch\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.289069 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.289103 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-run-systemd\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.289132 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-run-openvswitch\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.289140 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c9c365fc-0cba-4fcf-b721-30de2b908a56-ovnkube-script-lib\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.289158 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-host-run-netns\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.289186 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-host-slash\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.289564 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-host-cni-bin\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.289788 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a80574-7c60-4f19-985b-3ee313cb7bcd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-699qq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:30Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.290160 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c9c365fc-0cba-4fcf-b721-30de2b908a56-ovnkube-config\") pod \"ovnkube-node-d5zbp\" (UID: 
\"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.293510 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c9c365fc-0cba-4fcf-b721-30de2b908a56-env-overrides\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.297334 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c9c365fc-0cba-4fcf-b721-30de2b908a56-ovn-node-metrics-cert\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.309700 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27hxv\" (UniqueName: \"kubernetes.io/projected/c9c365fc-0cba-4fcf-b721-30de2b908a56-kube-api-access-27hxv\") pod \"ovnkube-node-d5zbp\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.319792 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9c365fc-0cba-4fcf-b721-30de2b908a56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5zbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:30Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.335906 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:30Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.354267 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:30Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.481649 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:30 crc kubenswrapper[4954]: W1127 16:38:30.503081 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9c365fc_0cba_4fcf_b721_30de2b908a56.slice/crio-afa426086e9bfd2dbd7ad9acc345a36a4cbd56b5b0ee0a2397298f86ce0d7d69 WatchSource:0}: Error finding container afa426086e9bfd2dbd7ad9acc345a36a4cbd56b5b0ee0a2397298f86ce0d7d69: Status 404 returned error can't find the container with id afa426086e9bfd2dbd7ad9acc345a36a4cbd56b5b0ee0a2397298f86ce0d7d69 Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.863679 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9mb96" event={"ID":"c5bda3ef-ba2c-424a-ba4a-432053d1c40d","Type":"ContainerStarted","Data":"3d5aabb55ded9f58e618e465b5ef892a9098df73cc03b0d2de615dbcb754cd4d"} Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.863739 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9mb96" event={"ID":"c5bda3ef-ba2c-424a-ba4a-432053d1c40d","Type":"ContainerStarted","Data":"5eb7b26ad3883f298eff86cd403ae002a3f96f115821be80cccf0de38fdcaaea"} Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.869542 4954 generic.go:334] "Generic (PLEG): container finished" podID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerID="7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e" exitCode=0 Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.869615 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" event={"ID":"c9c365fc-0cba-4fcf-b721-30de2b908a56","Type":"ContainerDied","Data":"7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e"} Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.869670 4954 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" event={"ID":"c9c365fc-0cba-4fcf-b721-30de2b908a56","Type":"ContainerStarted","Data":"afa426086e9bfd2dbd7ad9acc345a36a4cbd56b5b0ee0a2397298f86ce0d7d69"} Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.873062 4954 generic.go:334] "Generic (PLEG): container finished" podID="536fc833-8add-426d-9ed0-b63547d316e0" containerID="93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c" exitCode=0 Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.873126 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" event={"ID":"536fc833-8add-426d-9ed0-b63547d316e0","Type":"ContainerDied","Data":"93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c"} Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.873146 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" event={"ID":"536fc833-8add-426d-9ed0-b63547d316e0","Type":"ContainerStarted","Data":"124ca2b4f8718a71716dacf93c99f01f9bd90ec3568837f15ebfedff318157e1"} Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.876291 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" event={"ID":"33a80574-7c60-4f19-985b-3ee313cb7bcd","Type":"ContainerStarted","Data":"e3bfedfcafb3316fee81a8d1a6d9e4d8c530b7bbb10193341d5021a5acbbfe4a"} Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.876318 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" event={"ID":"33a80574-7c60-4f19-985b-3ee313cb7bcd","Type":"ContainerStarted","Data":"abf93a27d369fc02df1a4508748705f9bbad044d52db659f35896e60e7a8bdf9"} Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.876329 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" event={"ID":"33a80574-7c60-4f19-985b-3ee313cb7bcd","Type":"ContainerStarted","Data":"a06306a936dd1a0b72ec41a2e5ea2b6eab992d2354fb5b777bb3584ad83dcd9a"} Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.880543 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:30Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.897528 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a80574-7c60-4f19-985b-3ee313cb7bcd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-699qq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:30Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.922062 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9c365fc-0cba-4fcf-b721-30de2b908a56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5zbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:30Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:30 crc kubenswrapper[4954]: 
I1127 16:38:30.938600 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:30Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:30 crc kubenswrapper[4954]: I1127 16:38:30.982230 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b7cd63-bb9a-4c77-b67a-e72adc26393a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:38:22Z\\\",\\\"message\\\":\\\"W1127 16:38:11.939802 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 
16:38:11.940051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764261491 cert, and key in /tmp/serving-cert-2393175808/serving-signer.crt, /tmp/serving-cert-2393175808/serving-signer.key\\\\nI1127 16:38:12.073962 1 observer_polling.go:159] Starting file observer\\\\nW1127 16:38:12.077982 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 16:38:12.078373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:38:12.081926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2393175808/tls.crt::/tmp/serving-cert-2393175808/tls.key\\\\\\\"\\\\nF1127 16:38:22.478599 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:30Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.014821 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4633bf6a24c281dffedb23b6efec6dff41b512ca353a31a32c3988b523b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:31Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.048219 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-27v67" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df79f3c-9df0-48a0-980f-10ecadf5efd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80589bef6eb84e30399c60ede88844c7917afc5bc0a051e33ac307de7670ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn2f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-27v67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:31Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.083537 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9mb96" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5aabb55ded9f58e618e465b5ef892a9098df73cc03b0d2de615dbcb754cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r96jj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9mb96\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:31Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.104549 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:31Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.126819 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bd6ec80896ba1c7117ea88193af1f3b9aec353ab889d6864e0b221e4efdf428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72cc2fd437541de22aaa3130acadd5bd1eacd2e45ef0e12d55ce1877ac1965bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:31Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.139106 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:31Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.154753 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"536fc833-8add-426d-9ed0-b63547d316e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"
/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cz8gx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:31Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.170137 4954 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-699qq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a80574-7c60-4f19-985b-3ee313cb7bcd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3bfedfcafb3316fee81a8d1a6d9e4d8c530b7bbb10193341d5021a5acbbfe4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abf93a27d369fc02df1a4508748705f9bbad044d52db659f35896e60e7a8bdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-699qq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:31Z is after 2025-08-24T17:21:41Z" Nov 27 
16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.192128 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9c365fc-0cba-4fcf-b721-30de2b908a56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reas
on\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"image\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5zbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:31Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.211785 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:31Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.242953 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:31Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.262857 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b7cd63-bb9a-4c77-b67a-e72adc26393a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:38:22Z\\\",\\\"message\\\":\\\"W1127 16:38:11.939802 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 
16:38:11.940051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764261491 cert, and key in /tmp/serving-cert-2393175808/serving-signer.crt, /tmp/serving-cert-2393175808/serving-signer.key\\\\nI1127 16:38:12.073962 1 observer_polling.go:159] Starting file observer\\\\nW1127 16:38:12.077982 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 16:38:12.078373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:38:12.081926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2393175808/tls.crt::/tmp/serving-cert-2393175808/tls.key\\\\\\\"\\\\nF1127 16:38:22.478599 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:31Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.285355 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4633bf6a24c281dffedb23b6efec6dff41b512ca353a31a32c3988b523b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:31Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.297151 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-27v67" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df79f3c-9df0-48a0-980f-10ecadf5efd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80589bef6eb84e30399c60ede88844c7917afc5bc0a051e33ac307de7670ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn2f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-27v67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:31Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.312325 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9mb96" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5aabb55ded9f58e618e465b5ef892a9098df73cc03b0d2de615dbcb754cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r96jj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9mb96\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:31Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.334008 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:31Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.351025 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bd6ec80896ba1c7117ea88193af1f3b9aec353ab889d6864e0b221e4efdf428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72cc2fd437541de22aaa3130acadd5bd1eacd2e45ef0e12d55ce1877ac1965bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:31Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.369464 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:31Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.394599 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"536fc833-8add-426d-9ed0-b63547d316e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cz8gx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:31Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.662053 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:38:31 crc kubenswrapper[4954]: E1127 16:38:31.662757 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.662174 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:38:31 crc kubenswrapper[4954]: E1127 16:38:31.662845 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.662099 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:38:31 crc kubenswrapper[4954]: E1127 16:38:31.662906 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.674073 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-lt9bl"] Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.674592 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-lt9bl" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.676628 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.676771 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.677206 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.683823 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.702172 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:31Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.717998 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bd6ec80896ba1c7117ea88193af1f3b9aec353ab889d6864e0b221e4efdf428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72cc2fd437541de22aaa3130acadd5bd1eacd2e45ef0e12d55ce1877ac1965bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:31Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.733036 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:31Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.749867 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"536fc833-8add-426d-9ed0-b63547d316e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cz8gx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:31Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:31 crc 
kubenswrapper[4954]: I1127 16:38:31.766307 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:31Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.778052 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a80574-7c60-4f19-985b-3ee313cb7bcd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3bfedfcafb3316fee81a8d1a6d9e4d8c530b7bbb10193341d5021a5acbbfe4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abf93a27d369fc02df1a4508748705f9bbad044d52db659f35896e60e7a8bdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-699qq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:31Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.796989 4954 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9c365fc-0cba-4fcf-b721-30de2b908a56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",
\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5zbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:31Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.805549 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.805697 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b56lz\" (UniqueName: \"kubernetes.io/projected/8f164460-f6b2-4383-9e5e-f4d0045d9690-kube-api-access-b56lz\") pod \"node-ca-lt9bl\" (UID: \"8f164460-f6b2-4383-9e5e-f4d0045d9690\") " pod="openshift-image-registry/node-ca-lt9bl" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.805759 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8f164460-f6b2-4383-9e5e-f4d0045d9690-host\") pod \"node-ca-lt9bl\" (UID: \"8f164460-f6b2-4383-9e5e-f4d0045d9690\") " pod="openshift-image-registry/node-ca-lt9bl" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.805787 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:38:31 crc kubenswrapper[4954]: E1127 16:38:31.805854 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:38:35.80579603 +0000 UTC m=+27.823236330 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:38:31 crc kubenswrapper[4954]: E1127 16:38:31.805909 4954 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 27 16:38:31 crc kubenswrapper[4954]: E1127 16:38:31.805965 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 16:38:35.805951063 +0000 UTC m=+27.823391363 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.805985 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8f164460-f6b2-4383-9e5e-f4d0045d9690-serviceca\") pod \"node-ca-lt9bl\" (UID: \"8f164460-f6b2-4383-9e5e-f4d0045d9690\") " pod="openshift-image-registry/node-ca-lt9bl" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.811322 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:31Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.821933 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lt9bl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f164460-f6b2-4383-9e5e-f4d0045d9690\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b56lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lt9bl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:31Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.839598 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b7cd63-bb9a-4c77-b67a-e72adc26393a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator
@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:38:22Z\\\",\\\"message\\\":\\\"W1127 16:38:11.939802 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 16:38:11.940051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764261491 cert, and key in /tmp/serving-cert-2393175808/serving-signer.crt, /tmp/serving-cert-2393175808/serving-signer.key\\\\nI1127 16:38:12.073962 1 observer_polling.go:159] Starting file observer\\\\nW1127 16:38:12.077982 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 16:38:12.078373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:38:12.081926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2393175808/tls.crt::/tmp/serving-cert-2393175808/tls.key\\\\\\\"\\\\nF1127 16:38:22.478599 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:31Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.854365 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4633bf6a24c281dffedb23b6efec6dff41b512ca353a31a32c3988b523b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:31Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.865237 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-27v67" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df79f3c-9df0-48a0-980f-10ecadf5efd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80589bef6eb84e30399c60ede88844c7917afc5bc0a051e33ac307de7670ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn2f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-27v67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:31Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.881165 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9mb96" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5aabb55ded9f58e618e465b5ef892a9098df73cc03b0d2de615dbcb754cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r96jj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9mb96\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:31Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.884144 4954 generic.go:334] "Generic (PLEG): container finished" podID="536fc833-8add-426d-9ed0-b63547d316e0" containerID="de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f" exitCode=0 Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.884215 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" event={"ID":"536fc833-8add-426d-9ed0-b63547d316e0","Type":"ContainerDied","Data":"de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f"} Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.886566 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"2e11dee9902e47c6d0e972a3b8f86123252f000b875f7dff8af31db48e69503d"} Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.893264 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" event={"ID":"c9c365fc-0cba-4fcf-b721-30de2b908a56","Type":"ContainerStarted","Data":"f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5"} Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.893322 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" event={"ID":"c9c365fc-0cba-4fcf-b721-30de2b908a56","Type":"ContainerStarted","Data":"625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b"} Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.893335 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" event={"ID":"c9c365fc-0cba-4fcf-b721-30de2b908a56","Type":"ContainerStarted","Data":"7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7"} Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.893346 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" event={"ID":"c9c365fc-0cba-4fcf-b721-30de2b908a56","Type":"ContainerStarted","Data":"edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3"} Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.899725 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:31Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.906826 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.906876 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8f164460-f6b2-4383-9e5e-f4d0045d9690-host\") pod \"node-ca-lt9bl\" (UID: \"8f164460-f6b2-4383-9e5e-f4d0045d9690\") " pod="openshift-image-registry/node-ca-lt9bl" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.906912 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.906933 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8f164460-f6b2-4383-9e5e-f4d0045d9690-serviceca\") pod \"node-ca-lt9bl\" (UID: \"8f164460-f6b2-4383-9e5e-f4d0045d9690\") " pod="openshift-image-registry/node-ca-lt9bl" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.906956 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b56lz\" (UniqueName: \"kubernetes.io/projected/8f164460-f6b2-4383-9e5e-f4d0045d9690-kube-api-access-b56lz\") pod \"node-ca-lt9bl\" (UID: \"8f164460-f6b2-4383-9e5e-f4d0045d9690\") " pod="openshift-image-registry/node-ca-lt9bl" Nov 27 16:38:31 crc 
kubenswrapper[4954]: I1127 16:38:31.906978 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:38:31 crc kubenswrapper[4954]: E1127 16:38:31.907103 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 27 16:38:31 crc kubenswrapper[4954]: E1127 16:38:31.907129 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 27 16:38:31 crc kubenswrapper[4954]: E1127 16:38:31.907144 4954 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.907142 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8f164460-f6b2-4383-9e5e-f4d0045d9690-host\") pod \"node-ca-lt9bl\" (UID: \"8f164460-f6b2-4383-9e5e-f4d0045d9690\") " pod="openshift-image-registry/node-ca-lt9bl" Nov 27 16:38:31 crc kubenswrapper[4954]: E1127 16:38:31.907209 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-27 16:38:35.907187803 +0000 UTC m=+27.924628103 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:38:31 crc kubenswrapper[4954]: E1127 16:38:31.907340 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 27 16:38:31 crc kubenswrapper[4954]: E1127 16:38:31.907371 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 27 16:38:31 crc kubenswrapper[4954]: E1127 16:38:31.907383 4954 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:38:31 crc kubenswrapper[4954]: E1127 16:38:31.907432 4954 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 27 16:38:31 crc kubenswrapper[4954]: E1127 16:38:31.907444 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-27 16:38:35.907427539 +0000 UTC m=+27.924867839 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:38:31 crc kubenswrapper[4954]: E1127 16:38:31.907466 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 16:38:35.907455359 +0000 UTC m=+27.924895659 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.909060 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8f164460-f6b2-4383-9e5e-f4d0045d9690-serviceca\") pod \"node-ca-lt9bl\" (UID: \"8f164460-f6b2-4383-9e5e-f4d0045d9690\") " pod="openshift-image-registry/node-ca-lt9bl" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.913431 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lt9bl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f164460-f6b2-4383-9e5e-f4d0045d9690\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b56lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lt9bl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:31Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.929053 4954 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4633bf6a24c281dffedb23b6efec6dff41b512ca353a31a32c3988b523b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:31Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.939000 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b56lz\" (UniqueName: \"kubernetes.io/projected/8f164460-f6b2-4383-9e5e-f4d0045d9690-kube-api-access-b56lz\") pod \"node-ca-lt9bl\" (UID: \"8f164460-f6b2-4383-9e5e-f4d0045d9690\") " pod="openshift-image-registry/node-ca-lt9bl" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.944966 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-27v67" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df79f3c-9df0-48a0-980f-10ecadf5efd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80589bef6eb84e30399c60ede88844c7917afc5bc0a051e33ac307de7670ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn2f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-27v67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:31Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.964493 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9mb96" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5aabb55ded9f58e618e465b5ef892a9098df73cc03b0d2de615dbcb754cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r96jj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9mb96\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:31Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.980412 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b7cd63-bb9a-4c77-b67a-e72adc26393a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:38:22Z\\\",\\\"message\\\":\\\"W1127 16:38:11.939802 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 16:38:11.940051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764261491 cert, and key in /tmp/serving-cert-2393175808/serving-signer.crt, /tmp/serving-cert-2393175808/serving-signer.key\\\\nI1127 16:38:12.073962 1 observer_polling.go:159] Starting file observer\\\\nW1127 16:38:12.077982 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 16:38:12.078373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:38:12.081926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2393175808/tls.crt::/tmp/serving-cert-2393175808/tls.key\\\\\\\"\\\\nF1127 16:38:22.478599 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:31Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.993426 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-lt9bl" Nov 27 16:38:31 crc kubenswrapper[4954]: I1127 16:38:31.998042 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:31Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:32 crc kubenswrapper[4954]: I1127 16:38:32.013706 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"536fc833-8add-426d-9ed0-b63547d316e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":
{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cz8gx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:32Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:32 crc kubenswrapper[4954]: W1127 16:38:32.017165 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f164460_f6b2_4383_9e5e_f4d0045d9690.slice/crio-9945c38466f2b2588e9cd85c84c1bea886ecda9e6dea184e219945823e12061d WatchSource:0}: Error finding container 9945c38466f2b2588e9cd85c84c1bea886ecda9e6dea184e219945823e12061d: Status 404 returned error can't find the container with id 9945c38466f2b2588e9cd85c84c1bea886ecda9e6dea184e219945823e12061d Nov 27 16:38:32 crc kubenswrapper[4954]: I1127 16:38:32.029525 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:32Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:32 crc kubenswrapper[4954]: I1127 16:38:32.045405 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bd6ec80896ba1c7117ea88193af1f3b9aec353ab889d6864e0b221e4efdf428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72cc2fd437541de22aaa3130acadd5bd1eacd2e45ef0e12d55ce1877ac1965bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:32Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:32 crc kubenswrapper[4954]: I1127 16:38:32.059240 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:32Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:32 crc kubenswrapper[4954]: I1127 16:38:32.071791 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a80574-7c60-4f19-985b-3ee313cb7bcd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3bfedfcafb3316fee81a8d1a6d9e4d8c530b7bbb10193341d5021a5acbbfe4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abf93a27d369fc02df1a4508748705f9bbad044d52db659f35896e60e7a8bdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-699qq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:32Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:32 crc kubenswrapper[4954]: I1127 16:38:32.092296 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9c365fc-0cba-4fcf-b721-30de2b908a56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5zbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:32Z 
is after 2025-08-24T17:21:41Z" Nov 27 16:38:32 crc kubenswrapper[4954]: I1127 16:38:32.115755 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e11dee9902e47c6d0e972a3b8f86123252f000b875f7dff8af31db48e69503d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:32Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:32 crc kubenswrapper[4954]: I1127 16:38:32.134513 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bd6ec80896ba1c7117ea88193af1f3b9aec353ab889d6864e0b221e4efdf428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72cc2fd437541de22aaa3130acadd5bd1eacd2e45ef0e12d55ce1877ac1965bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:32Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:32 crc kubenswrapper[4954]: I1127 16:38:32.149628 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:32Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:32 crc kubenswrapper[4954]: I1127 16:38:32.164454 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"536fc833-8add-426d-9ed0-b63547d316e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/r
un/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cz8gx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:32Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:32 crc kubenswrapper[4954]: I1127 16:38:32.176832 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:32Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:32 crc kubenswrapper[4954]: I1127 16:38:32.189174 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a80574-7c60-4f19-985b-3ee313cb7bcd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3bfedfcafb3316fee81a8d1a6d9e4d8c530b7bbb10193341d5021a5acbbfe4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abf93a27d369fc02df1a4508748705f9bbad044d52db659f35896e60e7a8bdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-699qq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:32Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:32 crc kubenswrapper[4954]: I1127 16:38:32.208746 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9c365fc-0cba-4fcf-b721-30de2b908a56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5zbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:32Z 
is after 2025-08-24T17:21:41Z" Nov 27 16:38:32 crc kubenswrapper[4954]: I1127 16:38:32.223658 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:32Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:32 crc kubenswrapper[4954]: I1127 16:38:32.235759 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lt9bl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f164460-f6b2-4383-9e5e-f4d0045d9690\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b56lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lt9bl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:32Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:32 crc kubenswrapper[4954]: I1127 16:38:32.250605 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9mb96" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5aabb55ded9f58e618e465b5ef892a9098df73cc03b0d2de615dbcb754cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r96jj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9mb96\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:32Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:32 crc kubenswrapper[4954]: I1127 16:38:32.268451 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b7cd63-bb9a-4c77-b67a-e72adc26393a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:38:22Z\\\",\\\"message\\\":\\\"W1127 16:38:11.939802 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 16:38:11.940051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764261491 cert, and key in /tmp/serving-cert-2393175808/serving-signer.crt, /tmp/serving-cert-2393175808/serving-signer.key\\\\nI1127 16:38:12.073962 1 observer_polling.go:159] Starting file observer\\\\nW1127 16:38:12.077982 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 16:38:12.078373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:38:12.081926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2393175808/tls.crt::/tmp/serving-cert-2393175808/tls.key\\\\\\\"\\\\nF1127 16:38:22.478599 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:32Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:32 crc kubenswrapper[4954]: I1127 16:38:32.287462 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4633bf6a24c281dffedb23b6efec6dff41b512ca353a31a32c3988b523b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:32Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:32 crc kubenswrapper[4954]: I1127 16:38:32.303145 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-27v67" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df79f3c-9df0-48a0-980f-10ecadf5efd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80589bef6eb84e30399c60ede88844c7917afc5bc0a051e33ac307de7670ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn2f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-27v67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:32Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:32 crc kubenswrapper[4954]: I1127 16:38:32.898140 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-lt9bl" event={"ID":"8f164460-f6b2-4383-9e5e-f4d0045d9690","Type":"ContainerStarted","Data":"cc3d4b30c41f8bbff3623b037109b7faca9e2438dfe7240a4fbf3c8fb8c27bd5"} Nov 27 16:38:32 crc kubenswrapper[4954]: I1127 16:38:32.898559 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-lt9bl" event={"ID":"8f164460-f6b2-4383-9e5e-f4d0045d9690","Type":"ContainerStarted","Data":"9945c38466f2b2588e9cd85c84c1bea886ecda9e6dea184e219945823e12061d"} Nov 27 16:38:32 crc kubenswrapper[4954]: I1127 16:38:32.903231 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" event={"ID":"c9c365fc-0cba-4fcf-b721-30de2b908a56","Type":"ContainerStarted","Data":"19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2"} Nov 27 16:38:32 crc kubenswrapper[4954]: I1127 16:38:32.903274 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" event={"ID":"c9c365fc-0cba-4fcf-b721-30de2b908a56","Type":"ContainerStarted","Data":"87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5"} Nov 27 16:38:32 crc kubenswrapper[4954]: I1127 16:38:32.906983 4954 generic.go:334] "Generic (PLEG): container finished" podID="536fc833-8add-426d-9ed0-b63547d316e0" containerID="112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623" exitCode=0 Nov 27 16:38:32 crc kubenswrapper[4954]: I1127 16:38:32.907064 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" event={"ID":"536fc833-8add-426d-9ed0-b63547d316e0","Type":"ContainerDied","Data":"112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623"} Nov 27 16:38:32 crc kubenswrapper[4954]: I1127 16:38:32.924795 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b7cd63-bb9a-4c77-b67a-e72adc26393a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:38:22Z\\\",\\\"message\\\":\\\"W1127 16:38:11.939802 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 
16:38:11.940051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764261491 cert, and key in /tmp/serving-cert-2393175808/serving-signer.crt, /tmp/serving-cert-2393175808/serving-signer.key\\\\nI1127 16:38:12.073962 1 observer_polling.go:159] Starting file observer\\\\nW1127 16:38:12.077982 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 16:38:12.078373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:38:12.081926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2393175808/tls.crt::/tmp/serving-cert-2393175808/tls.key\\\\\\\"\\\\nF1127 16:38:22.478599 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:32Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:32 crc kubenswrapper[4954]: I1127 16:38:32.950697 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4633bf6a24c281dffedb23b6efec6dff41b512ca353a31a32c3988b523b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:32Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:32 crc kubenswrapper[4954]: I1127 16:38:32.970992 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-27v67" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df79f3c-9df0-48a0-980f-10ecadf5efd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80589bef6eb84e30399c60ede88844c7917afc5bc0a051e33ac307de7670ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn2f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-27v67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:32Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:32 crc kubenswrapper[4954]: I1127 16:38:32.986751 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9mb96" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5aabb55ded9f58e618e465b5ef892a9098df73cc03b0d2de615dbcb754cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r96jj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9mb96\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:32Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.001367 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bd6ec80896ba1c7117ea88193af1f3b9aec353ab889d6864e0b221e4efdf428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72cc2fd437541de22aaa3130acadd5bd1eacd2e45ef0e12d55ce1877ac1965bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:32Z is after 
2025-08-24T17:21:41Z" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.015664 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.033755 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"536fc833-8add-426d-9ed0-b63547d316e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":
{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cz8gx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.047324 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e11dee9902e47c6d0e972a3b8f86123252f000b875f7dff8af31db48e69503d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.073273 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9c365fc-0cba-4fcf-b721-30de2b908a56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5zbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.087683 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.100689 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a80574-7c60-4f19-985b-3ee313cb7bcd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3bfedfcafb3316fee81a8d1a6d9e4d8c530b7bbb10193341d5021a5acbbfe4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abf93a27d369fc02df1a4508748705f9bbad044d52db659f35896e60e7a8bdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-699qq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.111377 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lt9bl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f164460-f6b2-4383-9e5e-f4d0045d9690\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc3d4b30c41f8bbff3623b037109b7faca9e2438dfe7240a4fbf3c8fb8c27bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b56lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lt9bl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.124877 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.141667 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b7cd63-bb9a-4c77-b67a-e72adc26393a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:38:22Z\\\",\\\"message\\\":\\\"W1127 16:38:11.939802 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 16:38:11.940051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764261491 cert, and key in /tmp/serving-cert-2393175808/serving-signer.crt, /tmp/serving-cert-2393175808/serving-signer.key\\\\nI1127 16:38:12.073962 1 observer_polling.go:159] Starting file observer\\\\nW1127 16:38:12.077982 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 16:38:12.078373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:38:12.081926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2393175808/tls.crt::/tmp/serving-cert-2393175808/tls.key\\\\\\\"\\\\nF1127 16:38:22.478599 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.159809 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4633bf6a24c281dffedb23b6efec6dff41b512ca353a31a32c3988b523b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.172790 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-27v67" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df79f3c-9df0-48a0-980f-10ecadf5efd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80589bef6eb84e30399c60ede88844c7917afc5bc0a051e33ac307de7670ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn2f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-27v67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.185934 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9mb96" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5aabb55ded9f58e618e465b5ef892a9098df73cc03b0d2de615dbcb754cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r96jj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9mb96\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.199806 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e11dee9902e47c6d0e972a3b8f86123252f000b875f7dff8af31db48e69503d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.213287 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bd6ec80896ba1c7117ea88193af1f3b9aec353ab889d6864e0b221e4efdf428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72cc2fd437541de22aaa3130acadd5bd1eacd2e45ef0e12d55ce1877ac1965bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.228536 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.242299 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"536fc833-8add-426d-9ed0-b63547d316e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",
\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cz8gx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.254619 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a80574-7c60-4f19-985b-3ee313cb7bcd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3bfedfcafb3316fee81a8d1a6d9e4d8c530b7bbb10193341d5021a5acbbfe4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abf93a27d369fc02df1a4508748705f9bbad044d52db659f35896e60e7a8bdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-699qq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.284074 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9c365fc-0cba-4fcf-b721-30de2b908a56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5zbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:33Z 
is after 2025-08-24T17:21:41Z" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.297259 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.309743 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.321278 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lt9bl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f164460-f6b2-4383-9e5e-f4d0045d9690\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc3d4b30c41f8bbff3623b037109b7faca9e2438dfe7240a4fbf3c8fb8c27bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-b56lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lt9bl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.661603 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.661687 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:38:33 crc kubenswrapper[4954]: E1127 16:38:33.661786 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:38:33 crc kubenswrapper[4954]: E1127 16:38:33.661960 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.661629 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:38:33 crc kubenswrapper[4954]: E1127 16:38:33.662262 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.665105 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.669118 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.674100 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.682312 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.693962 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lt9bl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f164460-f6b2-4383-9e5e-f4d0045d9690\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc3d4b30c41f8bbff3623b037109b7faca9e2438dfe7240a4fbf3c8fb8c27bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b56lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lt9bl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.708917 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b7cd63-bb9a-4c77-b67a-e72adc26393a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:38:22Z\\\",\\\"message\\\":\\\"W1127 16:38:11.939802 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 16:38:11.940051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764261491 cert, and key in /tmp/serving-cert-2393175808/serving-signer.crt, /tmp/serving-cert-2393175808/serving-signer.key\\\\nI1127 16:38:12.073962 1 observer_polling.go:159] Starting file observer\\\\nW1127 16:38:12.077982 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 16:38:12.078373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:38:12.081926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2393175808/tls.crt::/tmp/serving-cert-2393175808/tls.key\\\\\\\"\\\\nF1127 16:38:22.478599 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.723511 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4633bf6a24c281dffedb23b6efec6dff41b512ca353a31a32c3988b523b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.735613 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-27v67" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df79f3c-9df0-48a0-980f-10ecadf5efd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80589bef6eb84e30399c60ede88844c7917afc5bc0a051e33ac307de7670ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn2f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-27v67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.752807 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9mb96" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5aabb55ded9f58e618e465b5ef892a9098df73cc03b0d2de615dbcb754cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r96jj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9mb96\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.769605 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e11dee9902e47c6d0e972a3b8f86123252f000b875f7dff8af31db48e69503d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.783123 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bd6ec80896ba1c7117ea88193af1f3b9aec353ab889d6864e0b221e4efdf428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72cc2fd437541de22aaa3130acadd5bd1eacd2e45ef0e12d55ce1877ac1965bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.796970 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.813486 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"536fc833-8add-426d-9ed0-b63547d316e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",
\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cz8gx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.829413 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.843501 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a80574-7c60-4f19-985b-3ee313cb7bcd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3bfedfcafb3316fee81a8d1a6d9e4d8c530b7bbb10193341d5021a5acbbfe4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abf93a27d369fc02df1a4508748705f9bbad044d52db659f35896e60e7a8bdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-699qq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.863313 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9c365fc-0cba-4fcf-b721-30de2b908a56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5zbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:33Z 
is after 2025-08-24T17:21:41Z" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.876391 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.886501 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lt9bl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f164460-f6b2-4383-9e5e-f4d0045d9690\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc3d4b30c41f8bbff3623b037109b7faca9e2438dfe7240a4fbf3c8fb8c27bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b56lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lt9bl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.897475 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-27v67" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df79f3c-9df0-48a0-980f-10ecadf5efd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80589bef6eb84e30399c60ede88844c7917afc5bc0a051e33ac307de7670ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn2f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-27v67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.909683 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9mb96" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5aabb55ded9f58e618e465b5ef892a9098df73cc03b0d2de615dbcb754cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r96jj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9mb96\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.914429 4954 generic.go:334] "Generic (PLEG): container finished" podID="536fc833-8add-426d-9ed0-b63547d316e0" containerID="b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988" exitCode=0 Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.914496 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" event={"ID":"536fc833-8add-426d-9ed0-b63547d316e0","Type":"ContainerDied","Data":"b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988"} Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.929814 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b7cd63-bb9a-4c77-b67a-e72adc26393a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:38:22Z\\\",\\\"message\\\":\\\"W1127 16:38:11.939802 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 
16:38:11.940051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764261491 cert, and key in /tmp/serving-cert-2393175808/serving-signer.crt, /tmp/serving-cert-2393175808/serving-signer.key\\\\nI1127 16:38:12.073962 1 observer_polling.go:159] Starting file observer\\\\nW1127 16:38:12.077982 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 16:38:12.078373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:38:12.081926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2393175808/tls.crt::/tmp/serving-cert-2393175808/tls.key\\\\\\\"\\\\nF1127 16:38:22.478599 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.943731 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4633bf6a24c281dffedb23b6efec6dff41b512ca353a31a32c3988b523b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.965419 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"536fc833-8add-426d-9ed0-b63547d316e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27
T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\
\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cz8gx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.976951 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed7ac545-28d1-4c54-9952-4b7845b4a475\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5f6e2fcbd93a30e7357a367e184a6f5c6c1af83f618e0fd0d724e51ba71ea08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dbb0d73cb9bddb6148625592ed1aac95ead1e2349f92fb8aba36ec714ed618e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8
b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a1ddaf55a730a8e5a53ecff0eef2afd9786d3f249ac18b7b3e3e6649b65fe45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6a464ca56934b2a1b4e31b921d34c3f57d9aacbd965746db957882d36527e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:33 crc kubenswrapper[4954]: I1127 16:38:33.993612 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e11dee9902e47c6d0e972a3b8f86123252f000b875f7dff8af31db48e69503d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.029055 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bd6ec80896ba1c7117ea88193af1f3b9aec353ab889d6864e0b221e4efdf428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72cc2fd437541de22aaa3130acadd5bd1eacd2e45ef0e12d55ce1877ac1965bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:34Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.073492 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:34Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.109979 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:34Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.148484 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a80574-7c60-4f19-985b-3ee313cb7bcd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3bfedfcafb3316fee81a8d1a6d9e4d8c530b7bbb10193341d5021a5acbbfe4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abf93a27d369fc02df1a4508748705f9bbad044d52db659f35896e60e7a8bdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-699qq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:34Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.200778 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9c365fc-0cba-4fcf-b721-30de2b908a56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5zbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:34Z 
is after 2025-08-24T17:21:41Z" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.236643 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:34Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.267677 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a80574-7c60-4f19-985b-3ee313cb7bcd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3bfedfcafb3316fee81a8d1a6d9e4d8c530b7bbb10193341d5021a5acbbfe4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abf93a27d369fc02df1a4508748705f9bbad044d52db659f35896e60e7a8bdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-699qq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:34Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.315656 4954 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9c365fc-0cba-4fcf-b721-30de2b908a56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",
\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5zbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:34Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.350404 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:34Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.389045 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lt9bl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f164460-f6b2-4383-9e5e-f4d0045d9690\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc3d4b30c41f8bbff3623b037109b7faca9e2438dfe7240a4fbf3c8fb8c27bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b56lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lt9bl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:34Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.429062 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4633bf6a24c281dffedb23b6efec6dff41b512ca353a31a32c3988b523b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:34Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.467893 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-27v67" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df79f3c-9df0-48a0-980f-10ecadf5efd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80589bef6eb84e30399c60ede88844c7917afc5bc0a051e33ac307de7670ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn2f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-27v67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:34Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.509517 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9mb96" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5aabb55ded9f58e618e465b5ef892a9098df73cc03b0d2de615dbcb754cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r96jj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9mb96\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:34Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.526772 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.529254 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.529297 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.529308 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.529446 4954 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.552432 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b7cd63-bb9a-4c77-b67a-e72adc26393a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:38:22Z\\\",\\\"message\\\":\\\"W1127 16:38:11.939802 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 
16:38:11.940051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764261491 cert, and key in /tmp/serving-cert-2393175808/serving-signer.crt, /tmp/serving-cert-2393175808/serving-signer.key\\\\nI1127 16:38:12.073962 1 observer_polling.go:159] Starting file observer\\\\nW1127 16:38:12.077982 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 16:38:12.078373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:38:12.081926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2393175808/tls.crt::/tmp/serving-cert-2393175808/tls.key\\\\\\\"\\\\nF1127 16:38:22.478599 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:34Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.602156 4954 kubelet_node_status.go:115] "Node was previously registered" node="crc" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.602546 4954 kubelet_node_status.go:79] "Successfully registered node" node="crc" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.603936 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.603959 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.603968 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.603983 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.603993 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:34Z","lastTransitionTime":"2025-11-27T16:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:34 crc kubenswrapper[4954]: E1127 16:38:34.617247 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"070a8e98-7cab-4ad3-b09c-67172438041d\\\",\\\"systemUUID\\\":\\\"03003ca2-7417-4e94-98d9-1cf03e475029\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:34Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.622337 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.622387 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.622403 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.622424 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.622437 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:34Z","lastTransitionTime":"2025-11-27T16:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.632010 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:34Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:34 crc kubenswrapper[4954]: E1127 16:38:34.640556 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"070a8e98-7cab-4ad3-b09c-67172438041d\\\",\\\"systemUUID\\\":\\\"03003ca2-7417-4e94-98d9-1cf03e475029\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:34Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.645186 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.645220 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.645233 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.645251 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.645264 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:34Z","lastTransitionTime":"2025-11-27T16:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:34 crc kubenswrapper[4954]: E1127 16:38:34.660176 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"070a8e98-7cab-4ad3-b09c-67172438041d\\\",\\\"systemUUID\\\":\\\"03003ca2-7417-4e94-98d9-1cf03e475029\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:34Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.673021 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.673101 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.673122 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.673147 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.673162 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:34Z","lastTransitionTime":"2025-11-27T16:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.679376 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"536fc833-8add-426d-9ed0-b63547d316e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cz8gx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:34Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:34 crc kubenswrapper[4954]: E1127 16:38:34.694534 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"070a8e98-7cab-4ad3-b09c-67172438041d\\\",\\\"systemUUID\\\":\\\"03003ca2-7417-4e94-98d9-1cf03e475029\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:34Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.698274 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.698318 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.698328 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.698349 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.698361 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:34Z","lastTransitionTime":"2025-11-27T16:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.710381 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed7ac545-28d1-4c54-9952-4b7845b4a475\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5f6e2fcbd93a30e7357a367e184a6f5c6c1af83f618e0fd0d724e51ba71ea08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dbb0d73cb9bddb6148625592ed1aac95ead1e2349f92fb8aba36ec714ed618e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-c
erts\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a1ddaf55a730a8e5a53ecff0eef2afd9786d3f249ac18b7b3e3e6649b65fe45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6a464ca56934b2a1b4e31b921d34c3f57d9aacbd965746db957882d36527e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:34Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:34 crc kubenswrapper[4954]: E1127 16:38:34.715032 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"070a8e98-7cab-4ad3-b09c-67172438041d\\\",\\\"systemUUID\\\":\\\"0
3003ca2-7417-4e94-98d9-1cf03e475029\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:34Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:34 crc kubenswrapper[4954]: E1127 16:38:34.715166 4954 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.717210 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.717254 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.717266 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.717287 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.717301 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:34Z","lastTransitionTime":"2025-11-27T16:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.753161 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e11dee9902e47c6d0e972a3b8f86123252f000b875f7dff8af31db48e69503d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:34Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.792042 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bd6ec80896ba1c7117ea88193af1f3b9aec353ab889d6864e0b221e4efdf428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72cc2fd437541de22aaa3130acadd5bd1eacd2e45ef0e12d55ce1877ac1965bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:34Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.820548 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.820657 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.820677 4954 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.820704 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.820724 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:34Z","lastTransitionTime":"2025-11-27T16:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.923632 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.923734 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.923760 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.923797 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.923822 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:34Z","lastTransitionTime":"2025-11-27T16:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.924141 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" event={"ID":"c9c365fc-0cba-4fcf-b721-30de2b908a56","Type":"ContainerStarted","Data":"ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8"} Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.928197 4954 generic.go:334] "Generic (PLEG): container finished" podID="536fc833-8add-426d-9ed0-b63547d316e0" containerID="26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d" exitCode=0 Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.928248 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" event={"ID":"536fc833-8add-426d-9ed0-b63547d316e0","Type":"ContainerDied","Data":"26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d"} Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.944747 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e11dee9902e47c6d0e972a3b8f86123252f000b875f7dff8af31db48e69503d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:34Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.961482 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bd6ec80896ba1c7117ea88193af1f3b9aec353ab889d6864e0b221e4efdf428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72cc2fd437541de22aaa3130acadd5bd1eacd2e45ef0e12d55ce1877ac1965bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:34Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:34 crc kubenswrapper[4954]: I1127 16:38:34.977072 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:34Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:34.999916 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"536fc833-8add-426d-9ed0-b63547d316e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cz8gx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:34Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.022734 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed7ac545-28d1-4c54-9952-4b7845b4a475\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5f6e2fcbd93a30e7357a367e184a6f5c6c1af83f618e0fd0d724e51ba71ea08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dbb0d73cb9bddb6148625592ed1aac95ead1e2349f92fb8aba36ec714ed618e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a1ddaf55a730a8e5a53ecff0eef2afd9786d3f249ac18b7b3e3e6649b65fe45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6a464ca56934b2a1b4e31b921d34c3f57d9aacbd965746db957882d36527e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.027621 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.027684 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.027701 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.027724 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.027781 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:35Z","lastTransitionTime":"2025-11-27T16:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.038971 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a80574-7c60-4f19-985b-3ee313cb7bcd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3bfedfcafb3316fee81a8d1a6d9e4d8c530b7bbb10193341d5021a5acbbfe4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abf93a27d369fc02df1a4508748705f9bbad044d52db659f35896e60e7a8bdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-699qq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.077941 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9c365fc-0cba-4fcf-b721-30de2b908a56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"n
ame\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5zbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.112267 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.130717 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.130761 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.130770 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.130793 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.130807 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:35Z","lastTransitionTime":"2025-11-27T16:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.153281 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.188943 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lt9bl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f164460-f6b2-4383-9e5e-f4d0045d9690\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc3d4b30c41f8bbff3623b037109b7faca9e2438dfe7240a4fbf3c8fb8c27bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b56lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lt9bl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.233912 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.233964 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.233981 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.234001 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.234016 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:35Z","lastTransitionTime":"2025-11-27T16:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.235729 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b7cd63-bb9a-4c77-b67a-e72adc26393a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f
7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:38:22Z\\\",\\\"message\\\":\\\"W1127 16:38:11.939802 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 16:38:11.940051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764261491 cert, and key in /tmp/serving-cert-2393175808/serving-signer.crt, /tmp/serving-cert-2393175808/serving-signer.key\\\\nI1127 16:38:12.073962 1 observer_polling.go:159] Starting file observer\\\\nW1127 16:38:12.077982 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 16:38:12.078373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:38:12.081926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2393175808/tls.crt::/tmp/serving-cert-2393175808/tls.key\\\\\\\"\\\\nF1127 16:38:22.478599 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.272597 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4633bf6a24c281dffedb23b6efec6dff41b512ca353a31a32c3988b523b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.311553 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-27v67" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df79f3c-9df0-48a0-980f-10ecadf5efd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80589bef6eb84e30399c60ede88844c7917afc5bc0a051e33ac307de7670ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn2f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-27v67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.337365 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.337452 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.337477 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.337504 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.337529 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:35Z","lastTransitionTime":"2025-11-27T16:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.355489 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9mb96" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5aabb55ded9f58e618e465b5ef892a9098df73cc03b0d2de615dbcb754cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r96jj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9mb96\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.441332 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.441388 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.441407 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.441428 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.441440 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:35Z","lastTransitionTime":"2025-11-27T16:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.544879 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.544940 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.544959 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.544985 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.545005 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:35Z","lastTransitionTime":"2025-11-27T16:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.647793 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.647859 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.647878 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.647917 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.647936 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:35Z","lastTransitionTime":"2025-11-27T16:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.661775 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.661951 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:38:35 crc kubenswrapper[4954]: E1127 16:38:35.662071 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.662794 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:38:35 crc kubenswrapper[4954]: E1127 16:38:35.664559 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:38:35 crc kubenswrapper[4954]: E1127 16:38:35.664981 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.751845 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.751918 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.751937 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.751967 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.751988 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:35Z","lastTransitionTime":"2025-11-27T16:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.855244 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.855282 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.855294 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.855310 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.855321 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:35Z","lastTransitionTime":"2025-11-27T16:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.867796 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.867983 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:38:35 crc kubenswrapper[4954]: E1127 16:38:35.868151 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:38:43.868103311 +0000 UTC m=+35.885543631 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:38:35 crc kubenswrapper[4954]: E1127 16:38:35.868191 4954 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 27 16:38:35 crc kubenswrapper[4954]: E1127 16:38:35.868346 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 16:38:43.868277315 +0000 UTC m=+35.885717775 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.938655 4954 generic.go:334] "Generic (PLEG): container finished" podID="536fc833-8add-426d-9ed0-b63547d316e0" containerID="f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685" exitCode=0 Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.938896 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" event={"ID":"536fc833-8add-426d-9ed0-b63547d316e0","Type":"ContainerDied","Data":"f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685"} Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.959763 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.960085 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.960113 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.960123 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.960143 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.960157 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:35Z","lastTransitionTime":"2025-11-27T16:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.969897 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.969997 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.970068 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:38:35 crc kubenswrapper[4954]: E1127 16:38:35.970220 4954 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 27 16:38:35 crc kubenswrapper[4954]: E1127 16:38:35.970289 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 16:38:43.970271692 +0000 UTC m=+35.987712002 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 27 16:38:35 crc kubenswrapper[4954]: E1127 16:38:35.970360 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 27 16:38:35 crc kubenswrapper[4954]: E1127 16:38:35.970401 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 27 16:38:35 crc kubenswrapper[4954]: E1127 16:38:35.970431 4954 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:38:35 crc kubenswrapper[4954]: E1127 16:38:35.970479 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-27 16:38:43.970464337 +0000 UTC m=+35.987904647 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:38:35 crc kubenswrapper[4954]: E1127 16:38:35.970610 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 27 16:38:35 crc kubenswrapper[4954]: E1127 16:38:35.970638 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 27 16:38:35 crc kubenswrapper[4954]: E1127 16:38:35.970660 4954 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:38:35 crc kubenswrapper[4954]: E1127 16:38:35.970716 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-27 16:38:43.970689762 +0000 UTC m=+35.988130072 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:38:35 crc kubenswrapper[4954]: I1127 16:38:35.979676 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a80574-7c60-4f19-985b-3ee313cb7bcd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3bfedfcafb3316fee81a8d1a6d9e4d8c530b7bbb10193341d5021a5acbbfe4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abf93a27d369fc02df1a4508748705f9bbad044d52db659f35896e60e7a8bdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-699qq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.004000 4954 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9c365fc-0cba-4fcf-b721-30de2b908a56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",
\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5zbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.021076 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:36Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.034867 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lt9bl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f164460-f6b2-4383-9e5e-f4d0045d9690\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc3d4b30c41f8bbff3623b037109b7faca9e2438dfe7240a4fbf3c8fb8c27bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b56lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lt9bl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:36Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.048640 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4633bf6a24c281dffedb23b6efec6dff41b512ca353a31a32c3988b523b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:36Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.060181 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-27v67" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df79f3c-9df0-48a0-980f-10ecadf5efd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80589bef6eb84e30399c60ede88844c7917afc5bc0a051e33ac307de7670ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn2f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-27v67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:36Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.062815 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.062901 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.062920 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.062947 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.062969 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:36Z","lastTransitionTime":"2025-11-27T16:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.076036 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9mb96" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5aabb55ded9f58e618e465b5ef892a9098df73cc03b0d2de615dbcb754cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r96jj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9mb96\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:36Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.092912 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b7cd63-bb9a-4c77-b67a-e72adc26393a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\
":\\\"cri-o://6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:38:22Z\\\",\\\"message\\\":\\\"W1127 16:38:11.939802 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 16:38:11.940051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764261491 cert, and key in /tmp/serving-cert-2393175808/serving-signer.crt, /tmp/serving-cert-2393175808/serving-signer.key\\\\nI1127 16:38:12.073962 1 observer_polling.go:159] Starting file observer\\\\nW1127 16:38:12.077982 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 16:38:12.078373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:38:12.081926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2393175808/tls.crt::/tmp/serving-cert-2393175808/tls.key\\\\\\\"\\\\nF1127 16:38:22.478599 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:36Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.107683 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:36Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.124740 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"536fc833-8add-426d-9ed0-b63547d316e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cz8gx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:36Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.142742 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed7ac545-28d1-4c54-9952-4b7845b4a475\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5f6e2fcbd93a30e7357a367e184a6f5c6c1af83f618e0fd0d724e51ba71ea08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dbb0d73cb9bddb6148625592ed1aac95ead1e2349f92fb8aba36ec714ed618e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a1ddaf55a730a8e5a53ecff0eef2afd9786d3f249ac18b7b3e3e6649b65fe45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6a464ca56934b2a1b4e31b921d34c3f57d9aacbd965746db957882d36527e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:36Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.153980 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e11dee9902e47c6d0e972a3b8f86123252f000b875f7dff8af31db48e69503d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-11-27T16:38:36Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.166978 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.167017 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.167025 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.167040 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.167049 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:36Z","lastTransitionTime":"2025-11-27T16:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.169485 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bd6ec80896ba1c7117ea88193af1f3b9aec353ab889d6864e0b221e4efdf428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72cc2fd437541de22aaa3130acadd5bd1eacd2e45ef0e12d55ce1877ac1965bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name
\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:36Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.275157 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.275216 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.275234 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.275257 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.275272 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:36Z","lastTransitionTime":"2025-11-27T16:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.377715 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.377750 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.377757 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.377771 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.377781 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:36Z","lastTransitionTime":"2025-11-27T16:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.480778 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.480822 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.480833 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.480851 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.480865 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:36Z","lastTransitionTime":"2025-11-27T16:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.583639 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.584110 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.584130 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.584159 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.584182 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:36Z","lastTransitionTime":"2025-11-27T16:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.687671 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.687740 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.687757 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.687781 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.687799 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:36Z","lastTransitionTime":"2025-11-27T16:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
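
Note: the identical NodeHasSufficientMemory / NodeHasNoDiskPressure / NodeHasSufficientPID / NodeNotReady blocks repeat roughly every 100 ms (16:38:36.275, .377, .480, .584, ...) because the kubelet keeps re-evaluating node readiness and keeps hitting the same blocker: no CNI configuration file in /etc/kubernetes/cni/net.d/. Nothing will appear there until the ovnkube-node pod, still PodInitializing above, writes one. A small watcher sketch, assuming the standard .conf/.conflist naming:

    import glob
    import time

    CNI_DIR = "/etc/kubernetes/cni/net.d"

    # Poll until a CNI config shows up -- the same condition the kubelet
    # is waiting on in the NodeNotReady messages in this log.
    while True:
        confs = sorted(glob.glob(f"{CNI_DIR}/*.conf") + glob.glob(f"{CNI_DIR}/*.conflist"))
        if confs:
            print("CNI configuration present:", confs)
            break
        print("no CNI configuration file yet")
        time.sleep(2)
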
Has your network provider started?"} Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.791030 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.791076 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.791089 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.791141 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.791156 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:36Z","lastTransitionTime":"2025-11-27T16:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.893411 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.893506 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.893525 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.893550 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.893569 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:36Z","lastTransitionTime":"2025-11-27T16:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
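
Note: to gauge how many pods are blocked by the expired webhook certificate, the "Failed to update status for pod" entries can be tallied from a saved journal excerpt. A sketch, where the kubelet.log path is hypothetical (e.g. captured with journalctl -u kubelet):

    import re
    from collections import Counter

    pat = re.compile(r'Failed to update status for pod" pod="([^"]+)"')
    hits = Counter()

    with open("kubelet.log") as f:
        for line in f:
            hits.update(m.group(1) for m in pat.finditer(line))

    # Highest-churn pods first; every namespace/pod listed here is stuck
    # behind the same pod.network-node-identity.openshift.io webhook.
    for pod, n in hits.most_common():
        print(f"{n:4d}  {pod}")
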
Has your network provider started?"} Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.955263 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" event={"ID":"c9c365fc-0cba-4fcf-b721-30de2b908a56","Type":"ContainerStarted","Data":"aa1bdaf80e2a227dff494aee8bbc23b34b6db52159b3dd3554473722c8b43e1b"} Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.955775 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.955804 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.968733 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" event={"ID":"536fc833-8add-426d-9ed0-b63547d316e0","Type":"ContainerStarted","Data":"c35f962fb1464be093f6b3cc62d79b47d06468ed4c1885c42c1f3f49b911458b"} Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.976803 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b7cd63-bb9a-4c77-b67a-e72adc26393a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:38:22Z\\\",\\\"message\\\":\\\"W1127 16:38:11.939802 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 
16:38:11.940051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764261491 cert, and key in /tmp/serving-cert-2393175808/serving-signer.crt, /tmp/serving-cert-2393175808/serving-signer.key\\\\nI1127 16:38:12.073962 1 observer_polling.go:159] Starting file observer\\\\nW1127 16:38:12.077982 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 16:38:12.078373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:38:12.081926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2393175808/tls.crt::/tmp/serving-cert-2393175808/tls.key\\\\\\\"\\\\nF1127 16:38:22.478599 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:36Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.998344 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.998412 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.998427 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.998448 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.998482 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:36Z","lastTransitionTime":"2025-11-27T16:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:36 crc kubenswrapper[4954]: I1127 16:38:36.998492 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4633bf6a24c281dffedb23b6efec6dff41b512ca353a31a32c3988b523b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:36Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.045480 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-27v67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df79f3c-9df0-48a0-980f-10ecadf5efd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80589bef6eb84e30399c60ede88844c7917afc5bc0a051e33ac307de7670ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn2f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-27v67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:37Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.050484 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.051467 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.061196 4954 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-9mb96" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5aabb55ded9f58e618e465b5ef892a9098df73cc03b0d2de615dbcb754cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r96jj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9mb96\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:37Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.077437 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e11dee9902e47c6d0e972a3b8f86123252f000b875f7dff8af31db48e69503d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:37Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.096703 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bd6ec80896ba1c7117ea88193af1f3b9aec353ab889d6864e0b221e4efdf428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72cc2fd437541de22aaa3130acadd5bd1eacd2e45ef0e12d55ce1877ac1965bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:37Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.102513 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.102604 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.102621 4954 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.102646 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.102660 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:37Z","lastTransitionTime":"2025-11-27T16:38:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.115852 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:37Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.130784 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"536fc833-8add-426d-9ed0-b63547d316e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cz8gx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:37Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.146766 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed7ac545-28d1-4c54-9952-4b7845b4a475\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5f6e2fcbd93a30e7357a367e184a6f5c6c1af83f618e0fd0d724e51ba71ea08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dbb0d73cb9bddb6148625592ed1aac95ead1e2349f92fb8aba36ec714ed618e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a1ddaf55a730a8e5a53ecff0eef2afd9786d3f249ac18b7b3e3e6649b65fe45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6a464ca56934b2a1b4e31b921d34c3f57d9aacbd965746db957882d36527e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:37Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.161291 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a80574-7c60-4f19-985b-3ee313cb7bcd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3bfedfcafb3316fee81a8d1a6d9e4d8c530b7bbb10193341d5021a5acbbfe4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abf93a27d369fc02df
1a4508748705f9bbad044d52db659f35896e60e7a8bdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-699qq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:37Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.182099 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9c365fc-0cba-4fcf-b721-30de2b908a56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa1bdaf80e2a227dff494aee8bbc23b34b6db521
59b3dd3554473722c8b43e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccou
nt\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5zbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:37Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.198756 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:37Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.207893 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.207925 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.207934 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.207949 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.207960 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:37Z","lastTransitionTime":"2025-11-27T16:38:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.216197 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:37Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.227982 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lt9bl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f164460-f6b2-4383-9e5e-f4d0045d9690\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc3d4b30c41f8bbff3623b037109b7faca9e2438dfe7240a4fbf3c8fb8c27bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b56lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lt9bl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:37Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.239751 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lt9bl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f164460-f6b2-4383-9e5e-f4d0045d9690\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc3d4b30c41f8bbff3623b037109b7faca9e2438dfe7240a4fbf3c8fb8c27bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b56lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lt9bl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:37Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.254085 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:37Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.273393 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b7cd63-bb9a-4c77-b67a-e72adc26393a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:38:22Z\\\",\\\"message\\\":\\\"W1127 16:38:11.939802 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 
16:38:11.940051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764261491 cert, and key in /tmp/serving-cert-2393175808/serving-signer.crt, /tmp/serving-cert-2393175808/serving-signer.key\\\\nI1127 16:38:12.073962 1 observer_polling.go:159] Starting file observer\\\\nW1127 16:38:12.077982 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 16:38:12.078373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:38:12.081926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2393175808/tls.crt::/tmp/serving-cert-2393175808/tls.key\\\\\\\"\\\\nF1127 16:38:22.478599 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:37Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.290431 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4633bf6a24c281dffedb23b6efec6dff41b512ca353a31a32c3988b523b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:37Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.302169 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-27v67" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df79f3c-9df0-48a0-980f-10ecadf5efd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80589bef6eb84e30399c60ede88844c7917afc5bc0a051e33ac307de7670ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn2f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-27v67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:37Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.310947 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.310998 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.311009 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.311031 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.311046 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:37Z","lastTransitionTime":"2025-11-27T16:38:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.317641 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9mb96" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5aabb55ded9f58e618e465b5ef892a9098df73cc03b0d2de615dbcb754cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r96jj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9mb96\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:37Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.338595 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bd6ec80896ba1c7117ea88193af1f3b9aec353ab889d6864e0b221e4efdf428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72cc2fd437541de22aaa3130acadd5bd1eacd2e45ef0e12d55ce1877ac1965bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:37Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.357275 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:37Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.375344 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"536fc833-8add-426d-9ed0-b63547d316e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c35f962fb1464be093f6b3cc62d79b47d06468ed4c1885c42c1f3f49b911458b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cz8gx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:37Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.414542 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.414606 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:37 crc 
kubenswrapper[4954]: I1127 16:38:37.414624 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.414648 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.414663 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:37Z","lastTransitionTime":"2025-11-27T16:38:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.415435 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed7ac545-28d1-4c54-9952-4b7845b4a475\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5f6e2fcbd93a30e7357a367e184a6f5c6c1af83f618e0fd0d724e51ba71ea08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dbb0d73cb9bddb6148625592ed1aac95ead1e2349f92fb8aba36ec714ed618e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"con
tainerID\\\":\\\"cri-o://7a1ddaf55a730a8e5a53ecff0eef2afd9786d3f249ac18b7b3e3e6649b65fe45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6a464ca56934b2a1b4e31b921d34c3f57d9aacbd965746db957882d36527e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:37Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.467682 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e11dee9902e47c6d0e972a3b8f86123252f000b875f7dff8af31db48e69503d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:37Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.498245 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9c365fc-0cba-4fcf-b721-30de2b908a56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa1bdaf80e2a227dff494aee8bbc23b34b6db521
59b3dd3554473722c8b43e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5zbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:37Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.511270 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:37Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.517559 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.517632 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.517646 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.517669 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.517683 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:37Z","lastTransitionTime":"2025-11-27T16:38:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.525618 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a80574-7c60-4f19-985b-3ee313cb7bcd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3bfedfcafb3316fee81a8d1a6d9e4d8c530b7bbb10193341d5021a5acbbfe4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abf93a27d369fc02df1a4508748705f9bbad044d52db659f35896e60e7a8bdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-699qq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:37Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.621348 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.621395 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.621407 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.621428 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.621442 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:37Z","lastTransitionTime":"2025-11-27T16:38:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.661941 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.661965 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.662116 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:38:37 crc kubenswrapper[4954]: E1127 16:38:37.662215 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:38:37 crc kubenswrapper[4954]: E1127 16:38:37.662349 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:38:37 crc kubenswrapper[4954]: E1127 16:38:37.662677 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.900814 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.900900 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.900917 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.900942 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.900957 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:37Z","lastTransitionTime":"2025-11-27T16:38:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:37 crc kubenswrapper[4954]: I1127 16:38:37.972234 4954 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.004897 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.004959 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.004972 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.004993 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.005366 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:38Z","lastTransitionTime":"2025-11-27T16:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.108328 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.108384 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.108401 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.108419 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.108432 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:38Z","lastTransitionTime":"2025-11-27T16:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.211726 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.211796 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.211820 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.211856 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.211879 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:38Z","lastTransitionTime":"2025-11-27T16:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.313745 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.313806 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.313818 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.313836 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.313849 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:38Z","lastTransitionTime":"2025-11-27T16:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.416646 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.416730 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.416748 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.416780 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.416799 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:38Z","lastTransitionTime":"2025-11-27T16:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.429800 4954 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.520329 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.520395 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.520413 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.520443 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.520466 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:38Z","lastTransitionTime":"2025-11-27T16:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.623510 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.623568 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.623599 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.623621 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.623635 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:38Z","lastTransitionTime":"2025-11-27T16:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.682453 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b7cd63-bb9a-4c77-b67a-e72adc26393a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:38:22Z\\\",\\\"message\\\":\\\"W1127 16:38:11.939802 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 
16:38:11.940051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764261491 cert, and key in /tmp/serving-cert-2393175808/serving-signer.crt, /tmp/serving-cert-2393175808/serving-signer.key\\\\nI1127 16:38:12.073962 1 observer_polling.go:159] Starting file observer\\\\nW1127 16:38:12.077982 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 16:38:12.078373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:38:12.081926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2393175808/tls.crt::/tmp/serving-cert-2393175808/tls.key\\\\\\\"\\\\nF1127 16:38:22.478599 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:38Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.702925 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4633bf6a24c281dffedb23b6efec6dff41b512ca353a31a32c3988b523b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:38Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.720617 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-27v67" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df79f3c-9df0-48a0-980f-10ecadf5efd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80589bef6eb84e30399c60ede88844c7917afc5bc0a051e33ac307de7670ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn2f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-27v67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:38Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.731019 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.731096 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.731115 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.731143 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.731162 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:38Z","lastTransitionTime":"2025-11-27T16:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.731162 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:38Z","lastTransitionTime":"2025-11-27T16:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.752849 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9mb96" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5aabb55ded9f58e618e465b5ef892a9098df73cc03b0d2de615dbcb754cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r96jj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9mb96\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:38Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.774342 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed7ac545-28d1-4c54-9952-4b7845b4a475\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5f6e2fcbd93a30e7357a367e184a6f5c6c1af83f618e0fd0d724e51ba71ea08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dbb0d73cb9bddb6148625592ed1aac95ead1e2349f92fb8aba36ec714ed618e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a1ddaf55a730a8e5a53ecff0eef2afd9786d3f249ac18b7b3e3e6649b65fe45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/
crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6a464ca56934b2a1b4e31b921d34c3f57d9aacbd965746db957882d36527e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:38Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.796527 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e11dee9902e47c6d0e972a3b8f86123252f000b875f7dff8af31db48e69503d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:38Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.817002 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bd6ec80896ba1c7117ea88193af1f3b9aec353ab889d6864e0b221e4efdf428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72cc2fd437541de22aaa3130acadd5bd1eacd2e45ef0e12d55ce1877ac1965bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:38Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.837953 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:38Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.862741 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.862796 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.862806 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.862826 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.862837 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:38Z","lastTransitionTime":"2025-11-27T16:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.872713 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"536fc833-8add-426d-9ed0-b63547d316e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c35f962fb1464be093f6b3cc62d79b47d06468ed4c1885c42c1f3f49b911458b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cz8gx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:38Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.894490 4954 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:38Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.911670 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a80574-7c60-4f19-985b-3ee313cb7bcd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3bfedfcafb3316fee81a8d1a6d9e4d8c530b7bbb10193341d5021a5acbbfe4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abf93a27d369fc02df1a4508748705f9bbad044d52db659f35896e60e7a8bdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-699qq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:38Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.937122 4954 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9c365fc-0cba-4fcf-b721-30de2b908a56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa1bdaf80e2a227dff494aee8bbc23b34b6db52159b3dd3554473722c8b43e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5zbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:38Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.955144 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:38Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.972034 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.972120 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.972370 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.972406 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.972436 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:38Z","lastTransitionTime":"2025-11-27T16:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.976031 4954 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 27 16:38:38 crc kubenswrapper[4954]: I1127 16:38:38.977749 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lt9bl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f164460-f6b2-4383-9e5e-f4d0045d9690\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc3d4b30c41f8bbff3623b037109b7faca9e2438dfe7240a4fbf3c8fb8c27bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b56lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lt9bl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:38Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.075997 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.076087 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.076115 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.076157 4954 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeNotReady" Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.076185 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:39Z","lastTransitionTime":"2025-11-27T16:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.179238 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.179280 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.179293 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.179311 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.179322 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:39Z","lastTransitionTime":"2025-11-27T16:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.283301 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.283347 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.283358 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.283378 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.283391 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:39Z","lastTransitionTime":"2025-11-27T16:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.386987 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.387042 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.387052 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.387076 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.387097 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:39Z","lastTransitionTime":"2025-11-27T16:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.489423 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.489471 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.489482 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.489533 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.489546 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:39Z","lastTransitionTime":"2025-11-27T16:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.592782 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.592828 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.592838 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.592859 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.592870 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:39Z","lastTransitionTime":"2025-11-27T16:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.661840 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.661977 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:38:39 crc kubenswrapper[4954]: E1127 16:38:39.662059 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:38:39 crc kubenswrapper[4954]: E1127 16:38:39.662431 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.662658 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:38:39 crc kubenswrapper[4954]: E1127 16:38:39.662816 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.695272 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.695309 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.695318 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.695336 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.695346 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:39Z","lastTransitionTime":"2025-11-27T16:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.798696 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.798755 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.798770 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.798790 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.798806 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:39Z","lastTransitionTime":"2025-11-27T16:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.902646 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.902689 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.902700 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.902719 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.902730 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:39Z","lastTransitionTime":"2025-11-27T16:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.983371 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5zbp_c9c365fc-0cba-4fcf-b721-30de2b908a56/ovnkube-controller/0.log" Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.988629 4954 generic.go:334] "Generic (PLEG): container finished" podID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerID="aa1bdaf80e2a227dff494aee8bbc23b34b6db52159b3dd3554473722c8b43e1b" exitCode=1 Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.988680 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" event={"ID":"c9c365fc-0cba-4fcf-b721-30de2b908a56","Type":"ContainerDied","Data":"aa1bdaf80e2a227dff494aee8bbc23b34b6db52159b3dd3554473722c8b43e1b"} Nov 27 16:38:39 crc kubenswrapper[4954]: I1127 16:38:39.990026 4954 scope.go:117] "RemoveContainer" containerID="aa1bdaf80e2a227dff494aee8bbc23b34b6db52159b3dd3554473722c8b43e1b" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.005244 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.005298 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.005319 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.005347 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.005372 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:40Z","lastTransitionTime":"2025-11-27T16:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.011094 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lt9bl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f164460-f6b2-4383-9e5e-f4d0045d9690\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc3d4b30c41f8bbff3623b037109b7faca9e2438dfe7240a4fbf3c8fb8c27bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b56lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lt9bl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:40Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.037547 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:40Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.066129 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b7cd63-bb9a-4c77-b67a-e72adc26393a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:38:22Z\\\",\\\"message\\\":\\\"W1127 16:38:11.939802 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 
16:38:11.940051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764261491 cert, and key in /tmp/serving-cert-2393175808/serving-signer.crt, /tmp/serving-cert-2393175808/serving-signer.key\\\\nI1127 16:38:12.073962 1 observer_polling.go:159] Starting file observer\\\\nW1127 16:38:12.077982 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 16:38:12.078373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:38:12.081926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2393175808/tls.crt::/tmp/serving-cert-2393175808/tls.key\\\\\\\"\\\\nF1127 16:38:22.478599 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:40Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.088820 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4633bf6a24c281dffedb23b6efec6dff41b512ca353a31a32c3988b523b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:40Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.108031 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-27v67" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df79f3c-9df0-48a0-980f-10ecadf5efd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80589bef6eb84e30399c60ede88844c7917afc5bc0a051e33ac307de7670ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn2f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-27v67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:40Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.109824 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.109890 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.109911 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.109956 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.109975 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:40Z","lastTransitionTime":"2025-11-27T16:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.136864 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9mb96" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5aabb55ded9f58e618e465b5ef892a9098df73cc03b0d2de615dbcb754cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r96jj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9mb96\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:40Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.163084 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bd6ec80896ba1c7117ea88193af1f3b9aec353ab889d6864e0b221e4efdf428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72cc2fd437541de22aaa3130acadd5bd1eacd2e45ef0e12d55ce1877ac1965bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:40Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.184386 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:40Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.209657 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"536fc833-8add-426d-9ed0-b63547d316e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c35f962fb1464be093f6b3cc62d79b47d06468ed4c1885c42c1f3f49b911458b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cz8gx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:40Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.216126 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.216198 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:40 crc 
kubenswrapper[4954]: I1127 16:38:40.216218 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.216247 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.216266 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:40Z","lastTransitionTime":"2025-11-27T16:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.237926 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed7ac545-28d1-4c54-9952-4b7845b4a475\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5f6e2fcbd93a30e7357a367e184a6f5c6c1af83f618e0fd0d724e51ba71ea08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dbb0d73cb9bddb6148625592ed1aac95ead1e2349f92fb8aba36ec714ed618e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"con
tainerID\\\":\\\"cri-o://7a1ddaf55a730a8e5a53ecff0eef2afd9786d3f249ac18b7b3e3e6649b65fe45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6a464ca56934b2a1b4e31b921d34c3f57d9aacbd965746db957882d36527e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:40Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.258366 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e11dee9902e47c6d0e972a3b8f86123252f000b875f7dff8af31db48e69503d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:40Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.292423 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9c365fc-0cba-4fcf-b721-30de2b908a56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa1bdaf80e2a227dff494aee8bbc23b34b6db521
59b3dd3554473722c8b43e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa1bdaf80e2a227dff494aee8bbc23b34b6db52159b3dd3554473722c8b43e1b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:38:39Z\\\",\\\"message\\\":\\\"pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1127 16:38:39.789646 6248 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1127 16:38:39.790072 6248 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1127 16:38:39.790100 6248 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1127 16:38:39.790139 6248 handler.go:208] Removed *v1.Node event handler 7\\\\nI1127 16:38:39.790153 6248 handler.go:208] Removed *v1.Node event handler 2\\\\nI1127 16:38:39.790668 6248 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1127 16:38:39.790710 6248 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1127 16:38:39.790767 6248 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1127 16:38:39.790822 6248 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1127 16:38:39.790831 6248 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1127 16:38:39.790855 6248 factory.go:656] Stopping watch factory\\\\nI1127 16:38:39.790879 6248 ovnkube.go:599] Stopped ovnkube\\\\nI1127 16:38:39.790916 6248 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1127 16:38:39.790945 6248 handler.go:208] Removed *v1.Pod event handler 
3\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef
0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5zbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:40Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.313373 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:40Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.319426 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.319482 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.319493 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.319514 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.319527 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:40Z","lastTransitionTime":"2025-11-27T16:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.331039 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a80574-7c60-4f19-985b-3ee313cb7bcd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3bfedfcafb3316fee81a8d1a6d9e4d8c530b7bbb10193341d5021a5acbbfe4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abf93a27d369fc02df1a4508748705f9bbad044d52db659f35896e60e7a8bdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-699qq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:40Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.423094 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.423161 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.423177 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.423204 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.423221 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:40Z","lastTransitionTime":"2025-11-27T16:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.526614 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.526679 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.526698 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.526726 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.526745 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:40Z","lastTransitionTime":"2025-11-27T16:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.580199 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.629384 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.629434 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.629447 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.629468 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.629510 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:40Z","lastTransitionTime":"2025-11-27T16:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.732233 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.732310 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.732325 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.732343 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.732389 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:40Z","lastTransitionTime":"2025-11-27T16:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.835330 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.835389 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.835401 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.835423 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.835437 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:40Z","lastTransitionTime":"2025-11-27T16:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.938641 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.938710 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.938729 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.938755 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.938776 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:40Z","lastTransitionTime":"2025-11-27T16:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.994133 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5zbp_c9c365fc-0cba-4fcf-b721-30de2b908a56/ovnkube-controller/0.log" Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.998892 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" event={"ID":"c9c365fc-0cba-4fcf-b721-30de2b908a56","Type":"ContainerStarted","Data":"aa73b49f13468ea65ed5e0a36611f95071dadbbe2e7c2c1205d6bd4ae166da9c"} Nov 27 16:38:40 crc kubenswrapper[4954]: I1127 16:38:40.999389 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.022559 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:41Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.036892 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a80574-7c60-4f19-985b-3ee313cb7bcd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3bfedfcafb3316fee81a8d1a6d9e4d8c530b7bbb10193341d5021a5acbbfe4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abf93a27d369fc02df1a4508748705f9bbad044d52db659f35896e60e7a8bdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-699qq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:41Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.041375 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.041423 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.041435 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.041453 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.041464 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:41Z","lastTransitionTime":"2025-11-27T16:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.061690 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9c365fc-0cba-4fcf-b721-30de2b908a56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa73b49f13468ea65ed5e0a36611f95071dadbbe2e7c2c1205d6bd4ae166da9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa1bdaf80e2a227dff494aee8bbc23b34b6db52159b3dd3554473722c8b43e1b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:38:39Z\\\",\\\"message\\\":\\\"pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1127 16:38:39.789646 6248 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1127 16:38:39.790072 6248 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1127 16:38:39.790100 6248 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1127 16:38:39.790139 6248 handler.go:208] Removed *v1.Node event handler 7\\\\nI1127 16:38:39.790153 6248 handler.go:208] Removed *v1.Node event handler 2\\\\nI1127 16:38:39.790668 6248 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1127 16:38:39.790710 6248 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1127 16:38:39.790767 6248 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1127 16:38:39.790822 6248 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1127 16:38:39.790831 6248 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1127 16:38:39.790855 6248 factory.go:656] Stopping watch factory\\\\nI1127 16:38:39.790879 6248 ovnkube.go:599] Stopped ovnkube\\\\nI1127 16:38:39.790916 6248 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1127 16:38:39.790945 6248 handler.go:208] Removed *v1.Pod event handler 
3\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\
\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5zbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:41Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.073825 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:41Z is after 2025-08-24T17:21:41Z"
Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.077956 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.084974 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lt9bl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f164460-f6b2-4383-9e5e-f4d0045d9690\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc3d4b30c41f8bbff3623b037109b7faca9e2438dfe7240a4fbf3c8fb8c27bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b56lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lt9bl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:41Z is after 2025-08-24T17:21:41Z"
Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.102645 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9mb96" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5aabb55ded9f58e618e465b5ef892a9098df73cc03b0d2de615dbcb754cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r96jj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9mb96\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:41Z is after 2025-08-24T17:21:41Z"
Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.125330 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b7cd63-bb9a-4c77-b67a-e72adc26393a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:38:22Z\\\",\\\"message\\\":\\\"W1127 16:38:11.939802 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 16:38:11.940051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764261491 cert, and key in /tmp/serving-cert-2393175808/serving-signer.crt, /tmp/serving-cert-2393175808/serving-signer.key\\\\nI1127 16:38:12.073962 1 observer_polling.go:159] Starting file observer\\\\nW1127 16:38:12.077982 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 16:38:12.078373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:38:12.081926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2393175808/tls.crt::/tmp/serving-cert-2393175808/tls.key\\\\\\\"\\\\nF1127 16:38:22.478599 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:41Z is after 2025-08-24T17:21:41Z"
Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.145304 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.145351 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.145362 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.145380 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.145393 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:41Z","lastTransitionTime":"2025-11-27T16:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.149092 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4633bf6a24c281dffedb23b6efec6dff41b512ca353a31a32c3988b523b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:41Z is after 2025-08-24T17:21:41Z"
Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.169132 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-27v67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df79f3c-9df0-48a0-980f-10ecadf5efd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80589bef6eb84e30399c60ede88844c7917afc5bc0a051e33ac307de7670ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn2f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-27v67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:41Z is after 2025-08-24T17:21:41Z"
Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.187367 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed7ac545-28d1-4c54-9952-4b7845b4a475\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5f6e2fcbd93a30e7357a367e184a6f5c6c1af83f618e0fd0d724e51ba71ea08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dbb0d73cb9bddb6148625592ed1aac95ead1e2349f92fb8aba36ec714ed618e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a1ddaf55a730a8e5a53ecff0eef2afd9786d3f249ac18b7b3e3e6649b65fe45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6a464ca56934b2a1b4e31b921d34c3f57d9aacbd965746db957882d36527e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:41Z is after 2025-08-24T17:21:41Z"
Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.203400 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e11dee9902e47c6d0e972a3b8f86123252f000b875f7dff8af31db48e69503d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:41Z is after 2025-08-24T17:21:41Z"
Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.226892 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bd6ec80896ba1c7117ea88193af1f3b9aec353ab889d6864e0b221e4efdf428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72cc2fd437541de22aaa3130acadd5bd1eacd2e45ef0e12d55ce1877ac1965bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:41Z is after 2025-08-24T17:21:41Z"
Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.243986 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:41Z is after 2025-08-24T17:21:41Z"
Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.248551 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.248870 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.248944 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.249017 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.249085 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:41Z","lastTransitionTime":"2025-11-27T16:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.268246 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"536fc833-8add-426d-9ed0-b63547d316e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c35f962fb1464be093f6b3cc62d79b47d06468ed4c1885c42c1f3f49b911458b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cz8gx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:41Z is after 2025-08-24T17:21:41Z"
Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.282416 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed7ac545-28d1-4c54-9952-4b7845b4a475\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5f6e2fcbd93a30e7357a367e184a6f5c6c1af83f618e0fd0d724e51ba71ea08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dbb0d73cb9bddb6148625592ed1aac95ead1e2349f92fb8aba36ec714ed618e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a1ddaf55a730a8e5a53ecff0eef2afd9786d3f249ac18b7b3e3e6649b65fe45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6a464ca56934b2a1b4e31b921d34c3f57d9aacbd965746db957882d36527e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:41Z is after 2025-08-24T17:21:41Z"
Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.297381 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e11dee9902e47c6d0e972a3b8f86123252f000b875f7dff8af31db48e69503d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:41Z is after 2025-08-24T17:21:41Z"
Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.311009 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bd6ec80896ba1c7117ea88193af1f3b9aec353ab889d6864e0b221e4efdf428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72cc2fd437541de22aaa3130acadd5bd1eacd2e45ef0e12d55ce1877ac1965bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:41Z is after 2025-08-24T17:21:41Z"
Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.329394 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:41Z is after 2025-08-24T17:21:41Z"
Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.352044 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.352122 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.352137 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.352161 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.352176 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:41Z","lastTransitionTime":"2025-11-27T16:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.352012 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"536fc833-8add-426d-9ed0-b63547d316e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c35f962fb1464be093f6b3cc62d79b47d06468ed4c1885c42c1f3f49b911458b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cz8gx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:41Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.366055 4954 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:41Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.387208 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a80574-7c60-4f19-985b-3ee313cb7bcd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3bfedfcafb3316fee81a8d1a6d9e4d8c530b7bbb10193341d5021a5acbbfe4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abf93a27d369fc02df1a4508748705f9bbad044d52db659f35896e60e7a8bdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-699qq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:41Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.410175 4954 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9c365fc-0cba-4fcf-b721-30de2b908a56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa73b49f13468ea65ed5e0a36611f95071dadbbe2e7c2c1205d6bd4ae166da9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa1bdaf80e2a227dff494aee8bbc23b34b6db52159b3dd3554473722c8b43e1b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:38:39Z\\\",\\\"message\\\":\\\"pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1127 16:38:39.789646 6248 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1127 16:38:39.790072 6248 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1127 16:38:39.790100 6248 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1127 16:38:39.790139 6248 handler.go:208] Removed *v1.Node event handler 7\\\\nI1127 16:38:39.790153 6248 handler.go:208] Removed *v1.Node event handler 2\\\\nI1127 16:38:39.790668 6248 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1127 16:38:39.790710 6248 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1127 16:38:39.790767 6248 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1127 16:38:39.790822 6248 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1127 16:38:39.790831 6248 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1127 16:38:39.790855 6248 factory.go:656] Stopping watch factory\\\\nI1127 16:38:39.790879 6248 ovnkube.go:599] Stopped ovnkube\\\\nI1127 16:38:39.790916 6248 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1127 16:38:39.790945 6248 handler.go:208] Removed *v1.Pod event handler 
3\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\
\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5zbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:41Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.429671 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:41Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.442704 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lt9bl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f164460-f6b2-4383-9e5e-f4d0045d9690\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc3d4b30c41f8bbff3623b037109b7faca9e2438dfe7240a4fbf3c8fb8c27bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b56lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lt9bl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:41Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.454523 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.454603 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.454621 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.454668 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.454689 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:41Z","lastTransitionTime":"2025-11-27T16:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.460036 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9mb96" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5aabb55ded9f58e618e465b5ef892a9098df73cc03b0d2de615dbcb754cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mount
Path\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r96jj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9mb96\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:41Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.474863 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b7cd63-bb9a-4c77-b67a-e72adc26393a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:38:22Z\\\",\\\"message\\\":\\\"W1127 16:38:11.939802 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 16:38:11.940051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764261491 cert, and key in /tmp/serving-cert-2393175808/serving-signer.crt, /tmp/serving-cert-2393175808/serving-signer.key\\\\nI1127 16:38:12.073962 1 observer_polling.go:159] Starting file observer\\\\nW1127 16:38:12.077982 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 16:38:12.078373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:38:12.081926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2393175808/tls.crt::/tmp/serving-cert-2393175808/tls.key\\\\\\\"\\\\nF1127 16:38:22.478599 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:41Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.493095 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4633bf6a24c281dffedb23b6efec6dff41b512ca353a31a32c3988b523b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:41Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.508647 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-27v67" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df79f3c-9df0-48a0-980f-10ecadf5efd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80589bef6eb84e30399c60ede88844c7917afc5bc0a051e33ac307de7670ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn2f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-27v67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:41Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.557652 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.557702 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.557716 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.557736 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.557751 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:41Z","lastTransitionTime":"2025-11-27T16:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.661095 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:38:41 crc kubenswrapper[4954]: E1127 16:38:41.661549 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.661147 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.662130 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.662334 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.661154 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.661090 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.662529 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.662757 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:41Z","lastTransitionTime":"2025-11-27T16:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:41 crc kubenswrapper[4954]: E1127 16:38:41.662820 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:38:41 crc kubenswrapper[4954]: E1127 16:38:41.662998 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.766344 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.766406 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.766422 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.766445 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.766465 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:41Z","lastTransitionTime":"2025-11-27T16:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.870219 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.870290 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.870310 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.870336 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.870355 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:41Z","lastTransitionTime":"2025-11-27T16:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.973915 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.974013 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.974035 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.974065 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:41 crc kubenswrapper[4954]: I1127 16:38:41.974085 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:41Z","lastTransitionTime":"2025-11-27T16:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.006026 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5zbp_c9c365fc-0cba-4fcf-b721-30de2b908a56/ovnkube-controller/1.log" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.007016 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5zbp_c9c365fc-0cba-4fcf-b721-30de2b908a56/ovnkube-controller/0.log" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.010971 4954 generic.go:334] "Generic (PLEG): container finished" podID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerID="aa73b49f13468ea65ed5e0a36611f95071dadbbe2e7c2c1205d6bd4ae166da9c" exitCode=1 Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.011030 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" event={"ID":"c9c365fc-0cba-4fcf-b721-30de2b908a56","Type":"ContainerDied","Data":"aa73b49f13468ea65ed5e0a36611f95071dadbbe2e7c2c1205d6bd4ae166da9c"} Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.011113 4954 scope.go:117] "RemoveContainer" containerID="aa1bdaf80e2a227dff494aee8bbc23b34b6db52159b3dd3554473722c8b43e1b" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.012266 4954 scope.go:117] "RemoveContainer" containerID="aa73b49f13468ea65ed5e0a36611f95071dadbbe2e7c2c1205d6bd4ae166da9c" Nov 27 16:38:42 crc kubenswrapper[4954]: E1127 16:38:42.012555 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-d5zbp_openshift-ovn-kubernetes(c9c365fc-0cba-4fcf-b721-30de2b908a56)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.034919 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9mb96" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5aabb55ded9f58e618e465b5ef892a9098df73cc03b0d2de615dbcb754cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r96jj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9mb96\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:42Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.059055 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b7cd63-bb9a-4c77-b67a-e72adc26393a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:38:22Z\\\",\\\"message\\\":\\\"W1127 16:38:11.939802 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 16:38:11.940051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764261491 cert, and key in /tmp/serving-cert-2393175808/serving-signer.crt, /tmp/serving-cert-2393175808/serving-signer.key\\\\nI1127 16:38:12.073962 1 observer_polling.go:159] Starting file observer\\\\nW1127 16:38:12.077982 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 16:38:12.078373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:38:12.081926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2393175808/tls.crt::/tmp/serving-cert-2393175808/tls.key\\\\\\\"\\\\nF1127 16:38:22.478599 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:42Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.079501 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.079623 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.079657 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.079690 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.079713 4954 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:42Z","lastTransitionTime":"2025-11-27T16:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.087009 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4633bf6a24c281dffedb23b6efec6dff41b512ca353a31a32c3988b523b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:42Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.104162 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-27v67" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df79f3c-9df0-48a0-980f-10ecadf5efd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80589bef6eb84e30399c60ede88844c7917afc5bc0a051e33ac307de7670ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn2f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-27v67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:42Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.127156 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed7ac545-28d1-4c54-9952-4b7845b4a475\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5f6e2fcbd93a30e7357a367e184a6f5c6c1af83f618e0fd0d724e51ba71ea08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dbb0d73cb9bddb6148625592ed1aac95ead1e2349f92fb8aba36ec714ed618e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a1ddaf55a730a8e5a53ecff0eef2afd9786d3f249ac18b7b3e3e6649b65fe45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6a464ca56934b2a1b4e31b921d34c3f57d9aacbd965746db957882d36527e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:42Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.149277 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e11dee9902e47c6d0e972a3b8f86123252f000b875f7dff8af31db48e69503d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-11-27T16:38:42Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.175260 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bd6ec80896ba1c7117ea88193af1f3b9aec353ab889d6864e0b221e4efdf428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72cc2fd437541de22aaa3130acadd5bd1eacd2e45ef0e12d55ce1877ac1965bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:42Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.182486 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.182717 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.182788 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.182877 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.182947 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:42Z","lastTransitionTime":"2025-11-27T16:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.193493 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:42Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.214214 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"536fc833-8add-426d-9ed0-b63547d316e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c35f962fb1464be093f6b3cc62d79b47d06468ed4c1885c42c1f3f49b911458b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"
system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cz8gx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:42Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.230854 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:42Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.253351 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a80574-7c60-4f19-985b-3ee313cb7bcd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3bfedfcafb3316fee81a8d1a6d9e4d8c530b7bbb10193341d5021a5acbbfe4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abf93a27d369fc02df1a4508748705f9bbad044d52db659f35896e60e7a8bdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-699qq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:42Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.286277 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.286386 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.286411 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.286445 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.286470 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:42Z","lastTransitionTime":"2025-11-27T16:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.286663 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9c365fc-0cba-4fcf-b721-30de2b908a56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa73b49f13468ea65ed5e0a36611f95071dadbbe2e7c2c1205d6bd4ae166da9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa1bdaf80e2a227dff494aee8bbc23b34b6db52159b3dd3554473722c8b43e1b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:38:39Z\\\",\\\"message\\\":\\\"pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1127 16:38:39.789646 6248 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1127 16:38:39.790072 6248 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1127 16:38:39.790100 6248 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1127 16:38:39.790139 6248 handler.go:208] Removed *v1.Node event handler 7\\\\nI1127 16:38:39.790153 6248 handler.go:208] Removed *v1.Node event handler 2\\\\nI1127 16:38:39.790668 6248 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1127 16:38:39.790710 6248 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1127 16:38:39.790767 6248 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1127 16:38:39.790822 6248 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1127 16:38:39.790831 6248 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1127 16:38:39.790855 6248 factory.go:656] Stopping watch factory\\\\nI1127 16:38:39.790879 6248 ovnkube.go:599] Stopped ovnkube\\\\nI1127 16:38:39.790916 6248 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1127 16:38:39.790945 6248 handler.go:208] Removed *v1.Pod event handler 
3\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa73b49f13468ea65ed5e0a36611f95071dadbbe2e7c2c1205d6bd4ae166da9c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"message\\\":\\\"cluster-version-operator\\\\\\\"}\\\\nI1127 16:38:41.013837 6368 services_controller.go:360] Finished syncing service oauth-openshift on namespace openshift-authentication for network=default : 2.851487ms\\\\nI1127 16:38:41.013842 6368 services_controller.go:360] Finished syncing service cluster-version-operator on namespace openshift-cluster-version for network=default : 2.53716ms\\\\nI1127 16:38:41.013889 6368 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1127 16:38:41.014033 6368 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1127 16:38:41.014077 6368 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1127 16:38:41.014084 6368 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1127 16:38:41.014136 6368 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1127 16:38:41.014195 6368 handler.go:208] Removed *v1.Node event handler 7\\\\nI1127 16:38:41.014243 6368 handler.go:208] Removed *v1.Node event handler 2\\\\nI1127 16:38:41.014298 6368 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1127 16:38:41.014308 6368 factory.go:656] Stopping watch factory\\\\nI1127 16:38:41.014397 6368 ovnkube.go:599] Stopped ovnkube\\\\nI1127 16:38:41.014471 6368 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1127 16:38:41.014653 6368 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd4
7ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5zbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:42Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.305762 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:42Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.329468 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lt9bl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f164460-f6b2-4383-9e5e-f4d0045d9690\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc3d4b30c41f8bbff3623b037109b7faca9e2438dfe7240a4fbf3c8fb8c27bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b56lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lt9bl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:42Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.389989 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.390040 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.390051 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.390071 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.390084 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:42Z","lastTransitionTime":"2025-11-27T16:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.465278 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j2bxm"] Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.466095 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j2bxm" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.469316 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.469435 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.493797 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.493845 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.493861 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.493883 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.493900 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:42Z","lastTransitionTime":"2025-11-27T16:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.496663 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b7cd63-bb9a-4c77-b67a-e72adc26393a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:38:22Z\\\",\\\"message\\\":\\\"W1127 16:38:11.939802 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 16:38:11.940051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764261491 cert, and key in /tmp/serving-cert-2393175808/serving-signer.crt, /tmp/serving-cert-2393175808/serving-signer.key\\\\nI1127 16:38:12.073962 1 observer_polling.go:159] Starting file observer\\\\nW1127 16:38:12.077982 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 16:38:12.078373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:38:12.081926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2393175808/tls.crt::/tmp/serving-cert-2393175808/tls.key\\\\\\\"\\\\nF1127 16:38:22.478599 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:42Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.518486 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4633bf6a24c281dffedb23b6efec6dff41b512ca353a31a32c3988b523b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:42Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.536855 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-27v67" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df79f3c-9df0-48a0-980f-10ecadf5efd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80589bef6eb84e30399c60ede88844c7917afc5bc0a051e33ac307de7670ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn2f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-27v67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:42Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.559809 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/474d40a8-ea36-4785-8818-6beb58074208-env-overrides\") pod \"ovnkube-control-plane-749d76644c-j2bxm\" (UID: \"474d40a8-ea36-4785-8818-6beb58074208\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j2bxm" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.559883 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/474d40a8-ea36-4785-8818-6beb58074208-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-j2bxm\" (UID: \"474d40a8-ea36-4785-8818-6beb58074208\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j2bxm" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.559920 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/474d40a8-ea36-4785-8818-6beb58074208-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-j2bxm\" (UID: \"474d40a8-ea36-4785-8818-6beb58074208\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j2bxm" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.559962 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcvbt\" (UniqueName: \"kubernetes.io/projected/474d40a8-ea36-4785-8818-6beb58074208-kube-api-access-rcvbt\") pod \"ovnkube-control-plane-749d76644c-j2bxm\" (UID: \"474d40a8-ea36-4785-8818-6beb58074208\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j2bxm" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.561095 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9mb96" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5aabb55ded9f58e618e465b5ef892a9098df73cc03b0d2de615dbcb754cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\"
:\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r96jj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9mb96\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:42Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.580049 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j2bxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"474d40a8-ea36-4785-8818-6beb58074208\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rcvbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rcvbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j2bxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:42Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.597364 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.597419 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.597438 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.597468 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.597492 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:42Z","lastTransitionTime":"2025-11-27T16:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.603720 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed7ac545-28d1-4c54-9952-4b7845b4a475\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5f6e2fcbd93a30e7357a367e184a6f5c6c1af83f618e0fd0d724e51ba71ea08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dbb0d73cb9bddb6148625592ed1aac95ead1e2349f92fb8aba36ec714ed618e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a1ddaf55a730a8e5a53ecff0eef2afd9786d3f249ac18b7b3e3e6649b65fe45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6a464ca56934b2a1b4e31b921d34c3f57d9aacbd965746db957882d36527e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:42Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.623914 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e11dee9902e47c6d0e972a3b8f86123252f000b875f7dff8af31db48e69503d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:42Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.646141 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bd6ec80896ba1c7117ea88193af1f3b9aec353ab889d6864e0b221e4efdf428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72cc2fd437541de22aaa3130acadd5bd1eacd2e45ef0e12d55ce1877ac1965bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:42Z is after 2025-08-24T17:21:41Z"
Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.661464 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/474d40a8-ea36-4785-8818-6beb58074208-env-overrides\") pod \"ovnkube-control-plane-749d76644c-j2bxm\" (UID: \"474d40a8-ea36-4785-8818-6beb58074208\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j2bxm"
Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.661550 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/474d40a8-ea36-4785-8818-6beb58074208-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-j2bxm\" (UID: \"474d40a8-ea36-4785-8818-6beb58074208\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j2bxm"
Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.661645 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/474d40a8-ea36-4785-8818-6beb58074208-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-j2bxm\" (UID: \"474d40a8-ea36-4785-8818-6beb58074208\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j2bxm"
Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.661707 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcvbt\" (UniqueName: \"kubernetes.io/projected/474d40a8-ea36-4785-8818-6beb58074208-kube-api-access-rcvbt\") pod \"ovnkube-control-plane-749d76644c-j2bxm\" (UID: \"474d40a8-ea36-4785-8818-6beb58074208\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j2bxm"
Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.662423 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/474d40a8-ea36-4785-8818-6beb58074208-env-overrides\") pod \"ovnkube-control-plane-749d76644c-j2bxm\" (UID: \"474d40a8-ea36-4785-8818-6beb58074208\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j2bxm"
Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.666454 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/474d40a8-ea36-4785-8818-6beb58074208-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-j2bxm\" (UID: \"474d40a8-ea36-4785-8818-6beb58074208\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j2bxm"
Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.671819 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:42Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.674417 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/474d40a8-ea36-4785-8818-6beb58074208-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-j2bxm\" (UID: \"474d40a8-ea36-4785-8818-6beb58074208\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j2bxm" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.693553 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcvbt\" (UniqueName: \"kubernetes.io/projected/474d40a8-ea36-4785-8818-6beb58074208-kube-api-access-rcvbt\") pod \"ovnkube-control-plane-749d76644c-j2bxm\" (UID: \"474d40a8-ea36-4785-8818-6beb58074208\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j2bxm" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.699410 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"536fc833-8add-426d-9ed0-b63547d316e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c35f962fb1464be093f6b3cc62d79b47d06468ed4c1885c42c1f3f49b911458b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cz8gx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:42Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.699918 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.699971 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:42 crc 
kubenswrapper[4954]: I1127 16:38:42.699991 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.700015 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.700034 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:42Z","lastTransitionTime":"2025-11-27T16:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.723322 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:42Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.743039 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a80574-7c60-4f19-985b-3ee313cb7bcd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3bfedfcafb3316fee81a8d1a6d9e4d8c530b7bbb10193341d5021a5acbbfe4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abf93a27d369fc02df1a4508748705f9bbad044d52db659f35896e60e7a8bdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-699qq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:42Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.776132 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9c365fc-0cba-4fcf-b721-30de2b908a56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa73b49f13468ea65ed5e0a36611f95071dadbbe
2e7c2c1205d6bd4ae166da9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa1bdaf80e2a227dff494aee8bbc23b34b6db52159b3dd3554473722c8b43e1b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:38:39Z\\\",\\\"message\\\":\\\"pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1127 16:38:39.789646 6248 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1127 16:38:39.790072 6248 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1127 16:38:39.790100 6248 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1127 16:38:39.790139 6248 handler.go:208] Removed *v1.Node event handler 7\\\\nI1127 16:38:39.790153 6248 handler.go:208] Removed *v1.Node event handler 2\\\\nI1127 16:38:39.790668 6248 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1127 16:38:39.790710 6248 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1127 16:38:39.790767 6248 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1127 16:38:39.790822 6248 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1127 16:38:39.790831 6248 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1127 16:38:39.790855 6248 factory.go:656] Stopping watch factory\\\\nI1127 16:38:39.790879 6248 ovnkube.go:599] Stopped ovnkube\\\\nI1127 16:38:39.790916 6248 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1127 16:38:39.790945 6248 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa73b49f13468ea65ed5e0a36611f95071dadbbe2e7c2c1205d6bd4ae166da9c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"message\\\":\\\"cluster-version-operator\\\\\\\"}\\\\nI1127 16:38:41.013837 6368 services_controller.go:360] Finished syncing service oauth-openshift on namespace openshift-authentication for network=default : 2.851487ms\\\\nI1127 16:38:41.013842 6368 services_controller.go:360] Finished syncing service cluster-version-operator on namespace openshift-cluster-version for network=default : 2.53716ms\\\\nI1127 16:38:41.013889 6368 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1127 16:38:41.014033 6368 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1127 16:38:41.014077 6368 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1127 16:38:41.014084 6368 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1127 16:38:41.014136 6368 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1127 16:38:41.014195 6368 handler.go:208] Removed *v1.Node event handler 7\\\\nI1127 16:38:41.014243 6368 handler.go:208] Removed *v1.Node event handler 2\\\\nI1127 16:38:41.014298 6368 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1127 16:38:41.014308 6368 factory.go:656] Stopping watch factory\\\\nI1127 16:38:41.014397 6368 
ovnkube.go:599] Stopped ovnkube\\\\nI1127 16:38:41.014471 6368 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1127 16:38:41.014653 6368 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"co
ntainerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5zbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:42Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.795060 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j2bxm" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.798318 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:42Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.803021 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.803106 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.803134 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.803163 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.803185 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:42Z","lastTransitionTime":"2025-11-27T16:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.819099 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lt9bl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f164460-f6b2-4383-9e5e-f4d0045d9690\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc3d4b30c41f8bbff3623b037109b7faca9e2438dfe7240a4fbf3c8fb8c27bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b56lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lt9bl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:42Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.910739 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.910820 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.910839 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.910876 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:42 crc kubenswrapper[4954]: I1127 16:38:42.910896 4954 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:42Z","lastTransitionTime":"2025-11-27T16:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.013949 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.014019 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.014045 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.014076 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.014101 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:43Z","lastTransitionTime":"2025-11-27T16:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.027844 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5zbp_c9c365fc-0cba-4fcf-b721-30de2b908a56/ovnkube-controller/1.log" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.034915 4954 scope.go:117] "RemoveContainer" containerID="aa73b49f13468ea65ed5e0a36611f95071dadbbe2e7c2c1205d6bd4ae166da9c" Nov 27 16:38:43 crc kubenswrapper[4954]: E1127 16:38:43.035818 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-d5zbp_openshift-ovn-kubernetes(c9c365fc-0cba-4fcf-b721-30de2b908a56)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.039575 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j2bxm" event={"ID":"474d40a8-ea36-4785-8818-6beb58074208","Type":"ContainerStarted","Data":"af1b91bab16680022b46b38c98e2b3957264b60b7acacea93e9b8f289f4997b1"} Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.058384 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b7cd63-bb9a-4c77-b67a-e72adc26393a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:38:22Z\\\",\\\"message\\\":\\\"W1127 16:38:11.939802 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 16:38:11.940051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764261491 cert, and key in /tmp/serving-cert-2393175808/serving-signer.crt, /tmp/serving-cert-2393175808/serving-signer.key\\\\nI1127 16:38:12.073962 1 observer_polling.go:159] Starting file observer\\\\nW1127 16:38:12.077982 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 16:38:12.078373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:38:12.081926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2393175808/tls.crt::/tmp/serving-cert-2393175808/tls.key\\\\\\\"\\\\nF1127 16:38:22.478599 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:43Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.078019 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4633bf6a24c281dffedb23b6efec6dff41b512ca353a31a32c3988b523b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:43Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.091675 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-27v67" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df79f3c-9df0-48a0-980f-10ecadf5efd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80589bef6eb84e30399c60ede88844c7917afc5bc0a051e33ac307de7670ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn2f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-27v67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:43Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.109720 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9mb96" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5aabb55ded9f58e618e465b5ef892a9098df73cc03b0d2de615dbcb754cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r96jj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9mb96\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:43Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.125256 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.125326 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.125355 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.125386 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.125411 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:43Z","lastTransitionTime":"2025-11-27T16:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.132383 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j2bxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"474d40a8-ea36-4785-8818-6beb58074208\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rcvbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rcvbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j2bxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:43Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.154885 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bd6ec80896ba1c7117ea88193af1f3b9aec353ab889d6864e0b221e4efdf428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72cc2fd437541de22aaa3130acadd5bd1eacd2e45ef0e12d55ce1877ac1965bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:43Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.180416 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:43Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.200019 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"536fc833-8add-426d-9ed0-b63547d316e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c35f962fb1464be093f6b3cc62d79b47d06468ed4c1885c42c1f3f49b911458b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cz8gx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:43Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.223826 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed7ac545-28d1-4c54-9952-4b7845b4a475\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5f6e2fcbd93a30e7357a367e184a6f5c6c1af83f618e0fd0d724e51ba71ea08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dbb0d73cb9bddb6148625592ed1aac95ead1e2349f92fb8aba36ec714ed618e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a1ddaf55a730a8e5a53ecff0eef2afd9786d3f249ac18b7b3e3e6649b65fe45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6a464ca56934b2a1b4e31b921d34c3f57d9aacbd965746db957882d36527e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:43Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.229872 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.229943 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.230184 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.230229 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.230258 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:43Z","lastTransitionTime":"2025-11-27T16:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.241552 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e11dee9902e47c6d0e972a3b8f86123252f000b875f7dff8af31db48e69503d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:43Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.271278 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9c365fc-0cba-4fcf-b721-30de2b908a56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa73b49f13468ea65ed5e0a36611f95071dadbbe2e7c2c1205d6bd4ae166da9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa73b49f13468ea65ed5e0a36611f95071dadbbe2e7c2c1205d6bd4ae166da9c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"message\\\":\\\"cluster-version-operator\\\\\\\"}\\\\nI1127 16:38:41.013837 6368 services_controller.go:360] Finished syncing service oauth-openshift on namespace openshift-authentication for network=default : 2.851487ms\\\\nI1127 16:38:41.013842 6368 services_controller.go:360] Finished syncing service cluster-version-operator on namespace openshift-cluster-version for network=default : 2.53716ms\\\\nI1127 16:38:41.013889 6368 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1127 16:38:41.014033 6368 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1127 16:38:41.014077 6368 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1127 16:38:41.014084 6368 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1127 16:38:41.014136 6368 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1127 16:38:41.014195 6368 handler.go:208] Removed *v1.Node event handler 7\\\\nI1127 16:38:41.014243 6368 handler.go:208] Removed *v1.Node event handler 2\\\\nI1127 16:38:41.014298 6368 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1127 16:38:41.014308 6368 factory.go:656] Stopping watch factory\\\\nI1127 16:38:41.014397 6368 ovnkube.go:599] Stopped ovnkube\\\\nI1127 16:38:41.014471 6368 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1127 16:38:41.014653 6368 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d5zbp_openshift-ovn-kubernetes(c9c365fc-0cba-4fcf-b721-30de2b908a56)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5zbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:43Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.285085 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:43Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.301128 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a80574-7c60-4f19-985b-3ee313cb7bcd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3bfedfcafb3316fee81a8d1a6d9e4d8c530b7bbb10193341d5021a5acbbfe4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abf93a27d369fc02df1a4508748705f9bbad044d52db659f35896e60e7a8bdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-699qq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:43Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.312730 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lt9bl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f164460-f6b2-4383-9e5e-f4d0045d9690\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc3d4b30c41f8bbff3623b037109b7faca9e2438dfe7240a4fbf3c8fb8c27bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b56lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lt9bl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:43Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.333114 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.333164 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.333174 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.333190 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.333203 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:43Z","lastTransitionTime":"2025-11-27T16:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.337207 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:43Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.436086 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.436155 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.436174 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.436202 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.436221 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:43Z","lastTransitionTime":"2025-11-27T16:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.540068 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.540134 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.540153 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.540181 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.540201 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:43Z","lastTransitionTime":"2025-11-27T16:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.643557 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.644118 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.644137 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.644164 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.644183 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:43Z","lastTransitionTime":"2025-11-27T16:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.661553 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.661607 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.661689 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:38:43 crc kubenswrapper[4954]: E1127 16:38:43.661800 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:38:43 crc kubenswrapper[4954]: E1127 16:38:43.662001 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:38:43 crc kubenswrapper[4954]: E1127 16:38:43.662194 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.747550 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.747661 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.747686 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.747717 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.747740 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:43Z","lastTransitionTime":"2025-11-27T16:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.851375 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.851443 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.851461 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.851488 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.851508 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:43Z","lastTransitionTime":"2025-11-27T16:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.876223 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:38:43 crc kubenswrapper[4954]: E1127 16:38:43.876442 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:38:59.876397519 +0000 UTC m=+51.893837849 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.876573 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:38:43 crc kubenswrapper[4954]: E1127 16:38:43.876773 4954 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 27 16:38:43 crc kubenswrapper[4954]: E1127 16:38:43.876867 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 16:38:59.876847181 +0000 UTC m=+51.894287521 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.955174 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.955229 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.955246 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.955271 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.955291 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:43Z","lastTransitionTime":"2025-11-27T16:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.978541 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.978724 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:38:43 crc kubenswrapper[4954]: E1127 16:38:43.978773 4954 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 27 16:38:43 crc kubenswrapper[4954]: E1127 16:38:43.978887 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 16:38:59.978854898 +0000 UTC m=+51.996295438 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 27 16:38:43 crc kubenswrapper[4954]: I1127 16:38:43.978779 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:38:43 crc kubenswrapper[4954]: E1127 16:38:43.978972 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 27 16:38:43 crc kubenswrapper[4954]: E1127 16:38:43.979014 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 27 16:38:43 crc kubenswrapper[4954]: E1127 16:38:43.979040 4954 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:38:43 crc kubenswrapper[4954]: E1127 16:38:43.979087 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 27 16:38:43 crc kubenswrapper[4954]: E1127 16:38:43.979154 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 27 16:38:43 crc kubenswrapper[4954]: E1127 16:38:43.979182 4954 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:38:43 crc kubenswrapper[4954]: E1127 16:38:43.979130 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-27 16:38:59.979097693 +0000 UTC m=+51.996538023 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:38:43 crc kubenswrapper[4954]: E1127 16:38:43.979322 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-27 16:38:59.979292028 +0000 UTC m=+51.996732558 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.008967 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-hgsvh"] Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.009933 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:38:44 crc kubenswrapper[4954]: E1127 16:38:44.010050 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.033260 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4633bf6a24c281dffedb23b6efec6dff41b512ca353a31a32c3988b523b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:44Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.047629 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j2bxm" event={"ID":"474d40a8-ea36-4785-8818-6beb58074208","Type":"ContainerStarted","Data":"75f0d7911572bda6bd48f347e24cddeea563f23cf84a4abd69f961b576999119"} Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.047707 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j2bxm" event={"ID":"474d40a8-ea36-4785-8818-6beb58074208","Type":"ContainerStarted","Data":"711fd0edfdc1fc0465c22fd73cdce98005c371cb4a4662314c051add365cc3fd"} Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.053688 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-27v67" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df79f3c-9df0-48a0-980f-10ecadf5efd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80589bef6eb84e30399c60ede88844c7917afc5bc0a051e33ac307de7670ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn2f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-27v67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:44Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.057902 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.057956 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.057974 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.057999 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.058019 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:44Z","lastTransitionTime":"2025-11-27T16:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.076574 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9mb96" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5aabb55ded9f58e618e465b5ef892a9098df73cc03b0d2de615dbcb754cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r96jj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9mb96\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:44Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.080221 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/af5183f4-5f46-4d64-8ec4-c7b71530cad6-metrics-certs\") pod \"network-metrics-daemon-hgsvh\" (UID: \"af5183f4-5f46-4d64-8ec4-c7b71530cad6\") " pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.080522 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s6vq\" (UniqueName: \"kubernetes.io/projected/af5183f4-5f46-4d64-8ec4-c7b71530cad6-kube-api-access-9s6vq\") pod \"network-metrics-daemon-hgsvh\" (UID: \"af5183f4-5f46-4d64-8ec4-c7b71530cad6\") " pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.097367 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j2bxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"474d40a8-ea36-4785-8818-6beb58074208\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rcvbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rcvbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j2bxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:44Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.127507 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b7cd63-bb9a-4c77-b67a-e72adc26393a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:38:22Z\\\",\\\"message\\\":\\\"W1127 16:38:11.939802 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 16:38:11.940051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764261491 cert, and key in /tmp/serving-cert-2393175808/serving-signer.crt, /tmp/serving-cert-2393175808/serving-signer.key\\\\nI1127 16:38:12.073962 1 observer_polling.go:159] Starting file observer\\\\nW1127 16:38:12.077982 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 16:38:12.078373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:38:12.081926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2393175808/tls.crt::/tmp/serving-cert-2393175808/tls.key\\\\\\\"\\\\nF1127 16:38:22.478599 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:44Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.150939 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:44Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.161206 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.161273 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.161292 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.161319 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.161340 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:44Z","lastTransitionTime":"2025-11-27T16:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.170176 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"536fc833-8add-426d-9ed0-b63547d316e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c35f962fb1464be093f6b3cc62d79b47d06468ed4c1885c42c1f3f49b911458b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cz8gx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:44Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.181900 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-9s6vq\" (UniqueName: \"kubernetes.io/projected/af5183f4-5f46-4d64-8ec4-c7b71530cad6-kube-api-access-9s6vq\") pod \"network-metrics-daemon-hgsvh\" (UID: \"af5183f4-5f46-4d64-8ec4-c7b71530cad6\") " pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.182054 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/af5183f4-5f46-4d64-8ec4-c7b71530cad6-metrics-certs\") pod \"network-metrics-daemon-hgsvh\" (UID: \"af5183f4-5f46-4d64-8ec4-c7b71530cad6\") " pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:38:44 crc kubenswrapper[4954]: E1127 16:38:44.182336 4954 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 27 16:38:44 crc kubenswrapper[4954]: E1127 16:38:44.182447 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af5183f4-5f46-4d64-8ec4-c7b71530cad6-metrics-certs podName:af5183f4-5f46-4d64-8ec4-c7b71530cad6 nodeName:}" failed. No retries permitted until 2025-11-27 16:38:44.682414142 +0000 UTC m=+36.699854472 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/af5183f4-5f46-4d64-8ec4-c7b71530cad6-metrics-certs") pod "network-metrics-daemon-hgsvh" (UID: "af5183f4-5f46-4d64-8ec4-c7b71530cad6") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.189996 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed7ac545-28d1-4c54-9952-4b7845b4a475\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5f6e2fcbd93a30e7357a367e184a6f5c6c1af83f618e0fd0d724e51ba71ea08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dbb0d73cb9bddb6148625592ed1aac95ead1e2349f92fb8aba36ec714ed
618e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a1ddaf55a730a8e5a53ecff0eef2afd9786d3f249ac18b7b3e3e6649b65fe45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6a464ca56934b2a1b4e31b921d34c3f57d9aacbd965746db957882d36527e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:44Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.210819 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e11dee9902e47c6d0e972a3b8f86123252f000b875f7dff8af31db48e69503d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:44Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.221477 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s6vq\" (UniqueName: \"kubernetes.io/projected/af5183f4-5f46-4d64-8ec4-c7b71530cad6-kube-api-access-9s6vq\") pod \"network-metrics-daemon-hgsvh\" (UID: \"af5183f4-5f46-4d64-8ec4-c7b71530cad6\") " pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.228398 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bd6ec80896ba1c7117ea88193af1f3b9aec353ab889d6864e0b221e4efdf428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72cc2fd437541de22aaa3130acadd5bd1eacd2e45ef0e12d55ce1877ac1965bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:44Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.242370 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgsvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af5183f4-5f46-4d64-8ec4-c7b71530cad6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s6vq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s6vq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgsvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:44Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.256124 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:44Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.263810 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.264054 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.264074 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.264102 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.264122 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:44Z","lastTransitionTime":"2025-11-27T16:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.270429 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a80574-7c60-4f19-985b-3ee313cb7bcd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3bfedfcafb3316fee81a8d1a6d9e4d8c530b7bbb10193341d5021a5acbbfe4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abf93a27d369fc02df1a4508748705f9bbad044d52db659f35896e60e7a8bdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-699qq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:44Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.294391 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9c365fc-0cba-4fcf-b721-30de2b908a56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa73b49f13468ea65ed5e0a36611f95071dadbbe2e7c2c1205d6bd4ae166da9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa73b49f13468ea65ed5e0a36611f95071dadbbe2e7c2c1205d6bd4ae166da9c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"message\\\":\\\"cluster-version-operator\\\\\\\"}\\\\nI1127 16:38:41.013837 6368 services_controller.go:360] Finished syncing service oauth-openshift on namespace openshift-authentication for network=default : 2.851487ms\\\\nI1127 16:38:41.013842 6368 services_controller.go:360] Finished syncing service cluster-version-operator on namespace openshift-cluster-version for network=default : 2.53716ms\\\\nI1127 16:38:41.013889 6368 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1127 16:38:41.014033 6368 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1127 16:38:41.014077 6368 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1127 16:38:41.014084 6368 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1127 16:38:41.014136 6368 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1127 16:38:41.014195 6368 handler.go:208] Removed *v1.Node event handler 7\\\\nI1127 16:38:41.014243 6368 handler.go:208] Removed *v1.Node event handler 2\\\\nI1127 16:38:41.014298 6368 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1127 16:38:41.014308 6368 factory.go:656] Stopping watch factory\\\\nI1127 16:38:41.014397 6368 ovnkube.go:599] Stopped ovnkube\\\\nI1127 16:38:41.014471 6368 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1127 16:38:41.014653 6368 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-d5zbp_openshift-ovn-kubernetes(c9c365fc-0cba-4fcf-b721-30de2b908a56)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"r
ecursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5zbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:44Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.329231 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:44Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.350125 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lt9bl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f164460-f6b2-4383-9e5e-f4d0045d9690\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc3d4b30c41f8bbff3623b037109b7faca9e2438dfe7240a4fbf3c8fb8c27bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b56lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lt9bl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:44Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.367502 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.367553 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.367567 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.367603 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.367618 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:44Z","lastTransitionTime":"2025-11-27T16:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.385010 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bd6ec80896ba1c7117ea88193af1f3b9aec353ab889d6864e0b221e4efdf428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72cc2fd437541de22aaa3130acadd5bd1eacd2e45ef0e12d55ce1877ac1965bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\
\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:44Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.400143 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:44Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.420308 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"536fc833-8add-426d-9ed0-b63547d316e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c35f962fb1464be093f6b3cc62d79b47d06468ed4c1885c42c1f3f49b911458b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"
system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cz8gx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:44Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.436994 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed7ac545-28d1-4c54-9952-4b7845b4a475\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5f6e2fcbd93a30e7357a367e184a6f5c6c1af83f618e0fd0d724e51ba71ea08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dbb0d73cb9bddb6148625592ed1aac95ead1e2349f92fb8aba36ec714ed618e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a1ddaf55a730a8e5a53ecff0eef2af
d9786d3f249ac18b7b3e3e6649b65fe45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6a464ca56934b2a1b4e31b921d34c3f57d9aacbd965746db957882d36527e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:44Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.453415 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e11dee9902e47c6d0e972a3b8f86123252f000b875f7dff8af31db48e69503d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:44Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.471349 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.471399 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.471413 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.471435 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.471448 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:44Z","lastTransitionTime":"2025-11-27T16:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.481095 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9c365fc-0cba-4fcf-b721-30de2b908a56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa73b49f13468ea65ed5e0a36611f95071dadbbe2e7c2c1205d6bd4ae166da9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa73b49f13468ea65ed5e0a36611f95071dadbbe2e7c2c1205d6bd4ae166da9c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"message\\\":\\\"cluster-version-operator\\\\\\\"}\\\\nI1127 16:38:41.013837 6368 services_controller.go:360] Finished syncing service oauth-openshift on namespace openshift-authentication for network=default : 2.851487ms\\\\nI1127 16:38:41.013842 6368 services_controller.go:360] Finished syncing service cluster-version-operator on namespace openshift-cluster-version for network=default : 2.53716ms\\\\nI1127 16:38:41.013889 6368 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1127 16:38:41.014033 6368 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1127 16:38:41.014077 6368 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1127 16:38:41.014084 6368 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1127 16:38:41.014136 6368 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1127 16:38:41.014195 6368 handler.go:208] Removed *v1.Node event handler 7\\\\nI1127 16:38:41.014243 6368 handler.go:208] Removed *v1.Node event handler 2\\\\nI1127 16:38:41.014298 6368 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1127 16:38:41.014308 6368 factory.go:656] Stopping watch factory\\\\nI1127 16:38:41.014397 6368 ovnkube.go:599] Stopped ovnkube\\\\nI1127 16:38:41.014471 6368 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1127 16:38:41.014653 6368 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d5zbp_openshift-ovn-kubernetes(c9c365fc-0cba-4fcf-b721-30de2b908a56)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5zbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:44Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.503914 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgsvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af5183f4-5f46-4d64-8ec4-c7b71530cad6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s6vq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s6vq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgsvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:44Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.522258 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:44Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.542233 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a80574-7c60-4f19-985b-3ee313cb7bcd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3bfedfcafb3316fee81a8d1a6d9e4d8c530b7bbb10193341d5021a5acbbfe4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abf93a27d369fc02df1a4508748705f9bbad044d52db659f35896e60e7a8bdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-699qq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:44Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.560433 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lt9bl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f164460-f6b2-4383-9e5e-f4d0045d9690\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc3d4b30c41f8bbff3623b037109b7faca9e2438dfe7240a4fbf3c8fb8c27bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b56lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lt9bl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:44Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.574269 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.574318 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.574331 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.574349 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.574361 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:44Z","lastTransitionTime":"2025-11-27T16:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.579329 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:44Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.598904 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b7cd63-bb9a-4c77-b67a-e72adc26393a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:38:22Z\\\",\\\"message\\\":\\\"W1127 16:38:11.939802 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 16:38:11.940051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764261491 cert, and key in /tmp/serving-cert-2393175808/serving-signer.crt, /tmp/serving-cert-2393175808/serving-signer.key\\\\nI1127 16:38:12.073962 1 observer_polling.go:159] Starting file observer\\\\nW1127 16:38:12.077982 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 16:38:12.078373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:38:12.081926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2393175808/tls.crt::/tmp/serving-cert-2393175808/tls.key\\\\\\\"\\\\nF1127 16:38:22.478599 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:44Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.619432 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4633bf6a24c281dffedb23b6efec6dff41b512ca353a31a32c3988b523b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:44Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.635360 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-27v67" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df79f3c-9df0-48a0-980f-10ecadf5efd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80589bef6eb84e30399c60ede88844c7917afc5bc0a051e33ac307de7670ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn2f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-27v67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:44Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.651782 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9mb96" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5aabb55ded9f58e618e465b5ef892a9098df73cc03b0d2de615dbcb754cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r96jj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9mb96\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:44Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.664704 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j2bxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"474d40a8-ea36-4785-8818-6beb58074208\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711fd0edfdc1fc0465c22fd73cdce98005c371cb4a4662314c051add365cc3fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rcvbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75f0d7911572bda6bd48f347e24cddeea563f23cf84a4abd69f961b576999119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rcvbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j2bxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:44Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.676827 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.676899 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.676918 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.676947 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.676966 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:44Z","lastTransitionTime":"2025-11-27T16:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.688510 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/af5183f4-5f46-4d64-8ec4-c7b71530cad6-metrics-certs\") pod \"network-metrics-daemon-hgsvh\" (UID: \"af5183f4-5f46-4d64-8ec4-c7b71530cad6\") " pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:38:44 crc kubenswrapper[4954]: E1127 16:38:44.688713 4954 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 27 16:38:44 crc kubenswrapper[4954]: E1127 16:38:44.688796 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af5183f4-5f46-4d64-8ec4-c7b71530cad6-metrics-certs podName:af5183f4-5f46-4d64-8ec4-c7b71530cad6 nodeName:}" failed. No retries permitted until 2025-11-27 16:38:45.688771402 +0000 UTC m=+37.706211712 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/af5183f4-5f46-4d64-8ec4-c7b71530cad6-metrics-certs") pod "network-metrics-daemon-hgsvh" (UID: "af5183f4-5f46-4d64-8ec4-c7b71530cad6") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.780779 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.780839 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.780858 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.780885 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.780903 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:44Z","lastTransitionTime":"2025-11-27T16:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.883514 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.883615 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.883633 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.883661 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.883682 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:44Z","lastTransitionTime":"2025-11-27T16:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.979249 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.979311 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.982137 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.982234 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:44 crc kubenswrapper[4954]: I1127 16:38:44.982264 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:44Z","lastTransitionTime":"2025-11-27T16:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:45 crc kubenswrapper[4954]: E1127 16:38:45.007921 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"070a8e98-7cab-4ad3-b09c-67172438041d\\\",\\\"systemUUID\\\":\\\"03003ca2-7417-4e94-98d9-1cf03e475029\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:45Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.015155 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.015237 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.015270 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.015303 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.015330 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:45Z","lastTransitionTime":"2025-11-27T16:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:45 crc kubenswrapper[4954]: E1127 16:38:45.035455 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"070a8e98-7cab-4ad3-b09c-67172438041d\\\",\\\"systemUUID\\\":\\\"03003ca2-7417-4e94-98d9-1cf03e475029\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:45Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.042013 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.042055 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.042067 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.042090 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.042107 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:45Z","lastTransitionTime":"2025-11-27T16:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:45 crc kubenswrapper[4954]: E1127 16:38:45.059560 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"070a8e98-7cab-4ad3-b09c-67172438041d\\\",\\\"systemUUID\\\":\\\"03003ca2-7417-4e94-98d9-1cf03e475029\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:45Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.065701 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.065807 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.065837 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.065876 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.065902 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:45Z","lastTransitionTime":"2025-11-27T16:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:45 crc kubenswrapper[4954]: E1127 16:38:45.082575 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"070a8e98-7cab-4ad3-b09c-67172438041d\\\",\\\"systemUUID\\\":\\\"03003ca2-7417-4e94-98d9-1cf03e475029\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:45Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.087357 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.087429 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.087449 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.087480 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.087500 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:45Z","lastTransitionTime":"2025-11-27T16:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:45 crc kubenswrapper[4954]: E1127 16:38:45.104963 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"070a8e98-7cab-4ad3-b09c-67172438041d\\\",\\\"systemUUID\\\":\\\"03003ca2-7417-4e94-98d9-1cf03e475029\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:45Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:45 crc kubenswrapper[4954]: E1127 16:38:45.105087 4954 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.107429 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
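
The patch itself is well-formed; it is rejected before reaching etcd because the node.network-node-identity.openshift.io admission webhook at https://127.0.0.1:9743 presents a TLS certificate that expired on 2025-08-24T17:21:41Z, roughly three months before the node's clock (2025-11-27T16:38:45Z). The kubelet retries the status update a small fixed number of times per sync and then gives up, which is the "update node status exceeds retry count" E-line above. A minimal Go sketch, run on the node itself, to confirm the certificate's validity window (the port is assumed reachable only locally; InsecureSkipVerify is deliberate so the handshake completes even though the certificate is expired):

```go
// certcheck.go: print the validity window of the certificate served on the
// node-identity webhook port seen in the log. A diagnostic sketch, not part
// of the cluster tooling. InsecureSkipVerify is intentional: we want the
// handshake to succeed so the expired certificate can be inspected.
package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{
		InsecureSkipVerify: true, // inspect, don't trust
	})
	if err != nil {
		log.Fatalf("handshake failed: %v", err)
	}
	defer conn.Close()

	for _, cert := range conn.ConnectionState().PeerCertificates {
		fmt.Printf("subject=%s\n  notBefore=%s\n  notAfter=%s\n  expired=%t\n",
			cert.Subject,
			cert.NotBefore.Format(time.RFC3339),
			cert.NotAfter.Format(time.RFC3339),
			time.Now().After(cert.NotAfter))
	}
}
```

On a CRC cluster this pattern typically appears when the VM is started long after the bundle's embedded certificates have expired; the cluster normally re-issues its internal certificates shortly after startup, and node status updates recover once the webhook serves a fresh certificate.
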
event="NodeHasSufficientMemory" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.107467 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.107479 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.107496 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.107510 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:45Z","lastTransitionTime":"2025-11-27T16:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.210553 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.210671 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.210686 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.210711 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.210727 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:45Z","lastTransitionTime":"2025-11-27T16:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.315021 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.315096 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.315112 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.315661 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.315715 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:45Z","lastTransitionTime":"2025-11-27T16:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.419413 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.419495 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.419514 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.419542 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.419562 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:45Z","lastTransitionTime":"2025-11-27T16:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.521962 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.522004 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.522015 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.522035 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.522047 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:45Z","lastTransitionTime":"2025-11-27T16:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.625010 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.625045 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.625055 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.625073 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.625084 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:45Z","lastTransitionTime":"2025-11-27T16:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.661969 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:38:45 crc kubenswrapper[4954]: E1127 16:38:45.662127 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.662630 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.662701 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.662652 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:38:45 crc kubenswrapper[4954]: E1127 16:38:45.662819 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6" Nov 27 16:38:45 crc kubenswrapper[4954]: E1127 16:38:45.662919 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:38:45 crc kubenswrapper[4954]: E1127 16:38:45.663197 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.700346 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/af5183f4-5f46-4d64-8ec4-c7b71530cad6-metrics-certs\") pod \"network-metrics-daemon-hgsvh\" (UID: \"af5183f4-5f46-4d64-8ec4-c7b71530cad6\") " pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:38:45 crc kubenswrapper[4954]: E1127 16:38:45.700569 4954 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 27 16:38:45 crc kubenswrapper[4954]: E1127 16:38:45.700766 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af5183f4-5f46-4d64-8ec4-c7b71530cad6-metrics-certs podName:af5183f4-5f46-4d64-8ec4-c7b71530cad6 nodeName:}" failed. No retries permitted until 2025-11-27 16:38:47.700721874 +0000 UTC m=+39.718162364 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/af5183f4-5f46-4d64-8ec4-c7b71530cad6-metrics-certs") pod "network-metrics-daemon-hgsvh" (UID: "af5183f4-5f46-4d64-8ec4-c7b71530cad6") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.728180 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.728256 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.728280 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.728468 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.728494 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:45Z","lastTransitionTime":"2025-11-27T16:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.831776 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.831839 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.831853 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.831873 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.831887 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:45Z","lastTransitionTime":"2025-11-27T16:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.937089 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.937155 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.937168 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.937191 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:45 crc kubenswrapper[4954]: I1127 16:38:45.937206 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:45Z","lastTransitionTime":"2025-11-27T16:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.039434 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.039473 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.039484 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.039501 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.039518 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:46Z","lastTransitionTime":"2025-11-27T16:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.144021 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.144068 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.144080 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.144100 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.144115 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:46Z","lastTransitionTime":"2025-11-27T16:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.248109 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.248171 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.248191 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.248217 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.248237 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:46Z","lastTransitionTime":"2025-11-27T16:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.352220 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.352294 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.352312 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.352340 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.352360 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:46Z","lastTransitionTime":"2025-11-27T16:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.457452 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.457507 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.457523 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.457549 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.457591 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:46Z","lastTransitionTime":"2025-11-27T16:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.562429 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.562500 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.562519 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.562547 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.562568 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:46Z","lastTransitionTime":"2025-11-27T16:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.665955 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.666031 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.666055 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.666089 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.666114 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:46Z","lastTransitionTime":"2025-11-27T16:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.769295 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.769385 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.769411 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.769444 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.769471 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:46Z","lastTransitionTime":"2025-11-27T16:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.873411 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.873480 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.873498 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.873528 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.873549 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:46Z","lastTransitionTime":"2025-11-27T16:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.977134 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.977188 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.977205 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.977225 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:46 crc kubenswrapper[4954]: I1127 16:38:46.977239 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:46Z","lastTransitionTime":"2025-11-27T16:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.090148 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.090216 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.090238 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.090264 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.090284 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:47Z","lastTransitionTime":"2025-11-27T16:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.193478 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.193529 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.193546 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.193568 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.193621 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:47Z","lastTransitionTime":"2025-11-27T16:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.297093 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.297176 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.297197 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.297227 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.297249 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:47Z","lastTransitionTime":"2025-11-27T16:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.400810 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.400873 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.400890 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.400915 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.400932 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:47Z","lastTransitionTime":"2025-11-27T16:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.503811 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.503862 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.503872 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.503889 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.503900 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:47Z","lastTransitionTime":"2025-11-27T16:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.607756 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.607828 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.608027 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.608220 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.608254 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:47Z","lastTransitionTime":"2025-11-27T16:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.662218 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.662332 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.662380 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:38:47 crc kubenswrapper[4954]: E1127 16:38:47.662546 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.662630 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:38:47 crc kubenswrapper[4954]: E1127 16:38:47.662823 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6" Nov 27 16:38:47 crc kubenswrapper[4954]: E1127 16:38:47.663023 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:38:47 crc kubenswrapper[4954]: E1127 16:38:47.663274 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.712224 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.712296 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.712320 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.712351 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.712369 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:47Z","lastTransitionTime":"2025-11-27T16:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.723301 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/af5183f4-5f46-4d64-8ec4-c7b71530cad6-metrics-certs\") pod \"network-metrics-daemon-hgsvh\" (UID: \"af5183f4-5f46-4d64-8ec4-c7b71530cad6\") " pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:38:47 crc kubenswrapper[4954]: E1127 16:38:47.723511 4954 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 27 16:38:47 crc kubenswrapper[4954]: E1127 16:38:47.723623 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af5183f4-5f46-4d64-8ec4-c7b71530cad6-metrics-certs podName:af5183f4-5f46-4d64-8ec4-c7b71530cad6 nodeName:}" failed. No retries permitted until 2025-11-27 16:38:51.723556954 +0000 UTC m=+43.740997294 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/af5183f4-5f46-4d64-8ec4-c7b71530cad6-metrics-certs") pod "network-metrics-daemon-hgsvh" (UID: "af5183f4-5f46-4d64-8ec4-c7b71530cad6") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.816545 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.816643 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.816666 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.816691 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.816708 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:47Z","lastTransitionTime":"2025-11-27T16:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.919900 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.920199 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.920267 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.920332 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:47 crc kubenswrapper[4954]: I1127 16:38:47.920396 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:47Z","lastTransitionTime":"2025-11-27T16:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.023632 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.023702 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.023721 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.023749 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.023770 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:48Z","lastTransitionTime":"2025-11-27T16:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.128228 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.128291 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.128302 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.128322 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.128335 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:48Z","lastTransitionTime":"2025-11-27T16:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.231955 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.232010 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.232019 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.232037 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.232047 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:48Z","lastTransitionTime":"2025-11-27T16:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.335840 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.335911 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.335927 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.335952 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.335968 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:48Z","lastTransitionTime":"2025-11-27T16:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.438331 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.438379 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.438388 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.438401 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.438410 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:48Z","lastTransitionTime":"2025-11-27T16:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.540882 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.540940 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.540993 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.541022 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.541037 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:48Z","lastTransitionTime":"2025-11-27T16:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.644547 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.644719 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.644736 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.644765 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.644785 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:48Z","lastTransitionTime":"2025-11-27T16:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.679708 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j2bxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"474d40a8-ea36-4785-8818-6beb58074208\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711fd0edfdc1fc0465c22fd73cdce98005c371cb4a4662314c051add365cc3fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rcvbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75f0d7911572bda6bd48f347e24cddeea563f23cf84a4abd69f961b576999119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rcvbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j2bxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:48Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.697230 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b7cd63-bb9a-4c77-b67a-e72adc26393a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-cr
c-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:38:22Z\\\",\\\"message\\\":\\\"W1127 16:38:11.939802 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 16:38:11.940051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764261491 cert, and key in /tmp/serving-cert-2393175808/serving-signer.crt, /tmp/serving-cert-2393175808/serving-signer.key\\\\nI1127 16:38:12.073962 1 observer_polling.go:159] Starting file observer\\\\nW1127 16:38:12.077982 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 16:38:12.078373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:38:12.081926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2393175808/tls.crt::/tmp/serving-cert-2393175808/tls.key\\\\\\\"\\\\nF1127 16:38:22.478599 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:48Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.719706 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4633bf6a24c281dffedb23b6efec6dff41b512ca353a31a32c3988b523b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:48Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.737393 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-27v67" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df79f3c-9df0-48a0-980f-10ecadf5efd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80589bef6eb84e30399c60ede88844c7917afc5bc0a051e33ac307de7670ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn2f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-27v67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:48Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.748021 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.748072 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.748091 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.748116 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.748135 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:48Z","lastTransitionTime":"2025-11-27T16:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.754851 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9mb96" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5aabb55ded9f58e618e465b5ef892a9098df73cc03b0d2de615dbcb754cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r96jj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9mb96\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:48Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.777436 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed7ac545-28d1-4c54-9952-4b7845b4a475\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5f6e2fcbd93a30e7357a367e184a6f5c6c1af83f618e0fd0d724e51ba71ea08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dbb0d73cb9bddb6148625592ed1aac95ead1e2349f92fb8aba36ec714ed618e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a1ddaf55a730a8e5a53ecff0eef2afd9786d3f249ac18b7b3e3e6649b65fe45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/
crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6a464ca56934b2a1b4e31b921d34c3f57d9aacbd965746db957882d36527e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:48Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.799924 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e11dee9902e47c6d0e972a3b8f86123252f000b875f7dff8af31db48e69503d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:48Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.832517 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bd6ec80896ba1c7117ea88193af1f3b9aec353ab889d6864e0b221e4efdf428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72cc2fd437541de22aaa3130acadd5bd1eacd2e45ef0e12d55ce1877ac1965bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:48Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.852512 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.852635 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.852669 4954 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.852696 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.852711 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:48Z","lastTransitionTime":"2025-11-27T16:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.857615 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:48Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.882501 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"536fc833-8add-426d-9ed0-b63547d316e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c35f962fb1464be093f6b3cc62d79b47d06468ed4c1885c42c1f3f49b911458b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"
system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cz8gx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:48Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.903768 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:48Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.922246 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a80574-7c60-4f19-985b-3ee313cb7bcd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3bfedfcafb3316fee81a8d1a6d9e4d8c530b7bbb10193341d5021a5acbbfe4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abf93a27d369fc02df1a4508748705f9bbad044d52db659f35896e60e7a8bdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-699qq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:48Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.954212 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9c365fc-0cba-4fcf-b721-30de2b908a56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa73b49f13468ea65ed5e0a36611f95071dadbbe
2e7c2c1205d6bd4ae166da9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa73b49f13468ea65ed5e0a36611f95071dadbbe2e7c2c1205d6bd4ae166da9c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"message\\\":\\\"cluster-version-operator\\\\\\\"}\\\\nI1127 16:38:41.013837 6368 services_controller.go:360] Finished syncing service oauth-openshift on namespace openshift-authentication for network=default : 2.851487ms\\\\nI1127 16:38:41.013842 6368 services_controller.go:360] Finished syncing service cluster-version-operator on namespace openshift-cluster-version for network=default : 2.53716ms\\\\nI1127 16:38:41.013889 6368 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1127 16:38:41.014033 6368 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1127 16:38:41.014077 6368 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1127 16:38:41.014084 6368 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1127 16:38:41.014136 6368 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1127 16:38:41.014195 6368 handler.go:208] Removed *v1.Node event handler 7\\\\nI1127 16:38:41.014243 6368 handler.go:208] Removed *v1.Node event handler 2\\\\nI1127 16:38:41.014298 6368 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1127 16:38:41.014308 6368 factory.go:656] Stopping watch factory\\\\nI1127 16:38:41.014397 6368 ovnkube.go:599] Stopped ovnkube\\\\nI1127 16:38:41.014471 6368 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1127 16:38:41.014653 6368 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d5zbp_openshift-ovn-kubernetes(c9c365fc-0cba-4fcf-b721-30de2b908a56)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5zbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:48Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.958128 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.958196 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.958217 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.958247 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.958268 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:48Z","lastTransitionTime":"2025-11-27T16:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.972180 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgsvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af5183f4-5f46-4d64-8ec4-c7b71530cad6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s6vq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s6vq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgsvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:48Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:48 crc kubenswrapper[4954]: I1127 16:38:48.996325 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:48Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.013293 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lt9bl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f164460-f6b2-4383-9e5e-f4d0045d9690\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc3d4b30c41f8bbff3623b037109b7faca9e2438dfe7240a4fbf3c8fb8c27bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b56lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lt9bl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:49Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.061654 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.061741 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.061760 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.061791 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.061810 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:49Z","lastTransitionTime":"2025-11-27T16:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.165062 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.165134 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.165153 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.165180 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.165201 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:49Z","lastTransitionTime":"2025-11-27T16:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.268758 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.268934 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.268981 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.269013 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.269035 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:49Z","lastTransitionTime":"2025-11-27T16:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.372647 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.372693 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.372705 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.372723 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.372736 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:49Z","lastTransitionTime":"2025-11-27T16:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.476346 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.476417 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.476434 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.476458 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.476478 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:49Z","lastTransitionTime":"2025-11-27T16:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.579741 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.579789 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.579801 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.579821 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.579834 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:49Z","lastTransitionTime":"2025-11-27T16:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.661821 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.661885 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.661921 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:38:49 crc kubenswrapper[4954]: E1127 16:38:49.661996 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.662111 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:38:49 crc kubenswrapper[4954]: E1127 16:38:49.662254 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:38:49 crc kubenswrapper[4954]: E1127 16:38:49.662468 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6" Nov 27 16:38:49 crc kubenswrapper[4954]: E1127 16:38:49.662733 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.682526 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.682627 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.682644 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.682669 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.682685 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:49Z","lastTransitionTime":"2025-11-27T16:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.786782 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.786822 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.786833 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.786851 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.786863 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:49Z","lastTransitionTime":"2025-11-27T16:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.890317 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.890380 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.890397 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.890422 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.890440 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:49Z","lastTransitionTime":"2025-11-27T16:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.994292 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.994351 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.994365 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.994384 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:49 crc kubenswrapper[4954]: I1127 16:38:49.994400 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:49Z","lastTransitionTime":"2025-11-27T16:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:50 crc kubenswrapper[4954]: I1127 16:38:50.097626 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:50 crc kubenswrapper[4954]: I1127 16:38:50.097688 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:50 crc kubenswrapper[4954]: I1127 16:38:50.097707 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:50 crc kubenswrapper[4954]: I1127 16:38:50.097737 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:50 crc kubenswrapper[4954]: I1127 16:38:50.097756 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:50Z","lastTransitionTime":"2025-11-27T16:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:50 crc kubenswrapper[4954]: I1127 16:38:50.200734 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:50 crc kubenswrapper[4954]: I1127 16:38:50.200797 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:50 crc kubenswrapper[4954]: I1127 16:38:50.200812 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:50 crc kubenswrapper[4954]: I1127 16:38:50.200836 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:50 crc kubenswrapper[4954]: I1127 16:38:50.200853 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:50Z","lastTransitionTime":"2025-11-27T16:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:50 crc kubenswrapper[4954]: I1127 16:38:50.303959 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:50 crc kubenswrapper[4954]: I1127 16:38:50.304034 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:50 crc kubenswrapper[4954]: I1127 16:38:50.304053 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:50 crc kubenswrapper[4954]: I1127 16:38:50.304080 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:50 crc kubenswrapper[4954]: I1127 16:38:50.304098 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:50Z","lastTransitionTime":"2025-11-27T16:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:50 crc kubenswrapper[4954]: I1127 16:38:50.407638 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:50 crc kubenswrapper[4954]: I1127 16:38:50.407728 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:50 crc kubenswrapper[4954]: I1127 16:38:50.407751 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:50 crc kubenswrapper[4954]: I1127 16:38:50.407780 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:50 crc kubenswrapper[4954]: I1127 16:38:50.407804 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:50Z","lastTransitionTime":"2025-11-27T16:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:50 crc kubenswrapper[4954]: I1127 16:38:50.510915 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:50 crc kubenswrapper[4954]: I1127 16:38:50.511381 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:50 crc kubenswrapper[4954]: I1127 16:38:50.511565 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:50 crc kubenswrapper[4954]: I1127 16:38:50.512147 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:50 crc kubenswrapper[4954]: I1127 16:38:50.512309 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:50Z","lastTransitionTime":"2025-11-27T16:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:50 crc kubenswrapper[4954]: I1127 16:38:50.616643 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:50 crc kubenswrapper[4954]: I1127 16:38:50.617131 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:50 crc kubenswrapper[4954]: I1127 16:38:50.617756 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:50 crc kubenswrapper[4954]: I1127 16:38:50.618157 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:50 crc kubenswrapper[4954]: I1127 16:38:50.618732 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:50Z","lastTransitionTime":"2025-11-27T16:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:50 crc kubenswrapper[4954]: I1127 16:38:50.721872 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:50 crc kubenswrapper[4954]: I1127 16:38:50.722231 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:50 crc kubenswrapper[4954]: I1127 16:38:50.722405 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:50 crc kubenswrapper[4954]: I1127 16:38:50.722556 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:50 crc kubenswrapper[4954]: I1127 16:38:50.722732 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:50Z","lastTransitionTime":"2025-11-27T16:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:50 crc kubenswrapper[4954]: I1127 16:38:50.826768 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:50 crc kubenswrapper[4954]: I1127 16:38:50.826885 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:50 crc kubenswrapper[4954]: I1127 16:38:50.826910 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:50 crc kubenswrapper[4954]: I1127 16:38:50.826942 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:50 crc kubenswrapper[4954]: I1127 16:38:50.826964 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:50Z","lastTransitionTime":"2025-11-27T16:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:50 crc kubenswrapper[4954]: I1127 16:38:50.930559 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:50 crc kubenswrapper[4954]: I1127 16:38:50.930677 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:50 crc kubenswrapper[4954]: I1127 16:38:50.930700 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:50 crc kubenswrapper[4954]: I1127 16:38:50.930762 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:50 crc kubenswrapper[4954]: I1127 16:38:50.930790 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:50Z","lastTransitionTime":"2025-11-27T16:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.035069 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.035522 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.035757 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.036019 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.036218 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:51Z","lastTransitionTime":"2025-11-27T16:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.140855 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.141659 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.141898 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.142082 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.142223 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:51Z","lastTransitionTime":"2025-11-27T16:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.246690 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.246768 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.246788 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.246818 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.246966 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:51Z","lastTransitionTime":"2025-11-27T16:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.349062 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.349101 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.349110 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.349123 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.349136 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:51Z","lastTransitionTime":"2025-11-27T16:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.452129 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.452199 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.452224 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.452250 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.452269 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:51Z","lastTransitionTime":"2025-11-27T16:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.555235 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.555338 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.555357 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.555383 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.555400 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:51Z","lastTransitionTime":"2025-11-27T16:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.658763 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.658812 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.658830 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.658852 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.658870 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:51Z","lastTransitionTime":"2025-11-27T16:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.661829 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.661884 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:38:51 crc kubenswrapper[4954]: E1127 16:38:51.661996 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.662074 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:38:51 crc kubenswrapper[4954]: E1127 16:38:51.662128 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6" Nov 27 16:38:51 crc kubenswrapper[4954]: E1127 16:38:51.662295 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.662523 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:38:51 crc kubenswrapper[4954]: E1127 16:38:51.662652 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.762627 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.762695 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.762712 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.762736 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.762755 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:51Z","lastTransitionTime":"2025-11-27T16:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.773531 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/af5183f4-5f46-4d64-8ec4-c7b71530cad6-metrics-certs\") pod \"network-metrics-daemon-hgsvh\" (UID: \"af5183f4-5f46-4d64-8ec4-c7b71530cad6\") " pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:38:51 crc kubenswrapper[4954]: E1127 16:38:51.773855 4954 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 27 16:38:51 crc kubenswrapper[4954]: E1127 16:38:51.774164 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af5183f4-5f46-4d64-8ec4-c7b71530cad6-metrics-certs podName:af5183f4-5f46-4d64-8ec4-c7b71530cad6 nodeName:}" failed. No retries permitted until 2025-11-27 16:38:59.774123261 +0000 UTC m=+51.791563591 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/af5183f4-5f46-4d64-8ec4-c7b71530cad6-metrics-certs") pod "network-metrics-daemon-hgsvh" (UID: "af5183f4-5f46-4d64-8ec4-c7b71530cad6") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.865975 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.866062 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.866088 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.866121 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.866145 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:51Z","lastTransitionTime":"2025-11-27T16:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.970031 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.970100 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.970119 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.970148 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:51 crc kubenswrapper[4954]: I1127 16:38:51.970166 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:51Z","lastTransitionTime":"2025-11-27T16:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
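The three records above show one failed volume reconcile: MountVolume starts for metrics-certs, the secret lookup fails because the object is not yet registered with kubelet's secret manager, and the whole operation is parked with durationBeforeRetry 8s. That delay grows by doubling on consecutive failures, up to a cap; a hedged stdlib sketch of that retry shape (the 500ms start, 2m cap, and attempt limit here are illustrative, not kubelet's exact values):

```go
// Sketch of capped exponential backoff as seen in the
// "No retries permitted until ... (durationBeforeRetry 8s)" records.
package main

import (
	"errors"
	"fmt"
	"time"
)

func mountWithBackoff(attempt func() error) error {
	delay := 500 * time.Millisecond // illustrative starting delay
	maxDelay := 2 * time.Minute     // illustrative cap
	for i := 0; i < 10; i++ {
		if err := attempt(); err == nil {
			return nil
		} else {
			fmt.Printf("attempt %d failed (%v); no retries permitted for %v\n", i+1, err, delay)
		}
		time.Sleep(delay)
		if delay *= 2; delay > maxDelay {
			delay = maxDelay
		}
	}
	return errors.New("giving up after repeated mount failures")
}

func main() {
	calls := 0
	_ = mountWithBackoff(func() error {
		calls++
		if calls < 5 { // pretend the secret becomes available on the 5th try
			return errors.New(`object "openshift-multus"/"metrics-daemon-secret" not registered`)
		}
		return nil
	})
}
```

The backoff explains the 8s gap in the log: this was already the fifth or so consecutive failure for that volume, so the next retry is not due until 16:38:59.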
Has your network provider started?"} Nov 27 16:38:52 crc kubenswrapper[4954]: I1127 16:38:52.074101 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:52 crc kubenswrapper[4954]: I1127 16:38:52.074204 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:52 crc kubenswrapper[4954]: I1127 16:38:52.074227 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:52 crc kubenswrapper[4954]: I1127 16:38:52.074259 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:52 crc kubenswrapper[4954]: I1127 16:38:52.074284 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:52Z","lastTransitionTime":"2025-11-27T16:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:52 crc kubenswrapper[4954]: I1127 16:38:52.178163 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:52 crc kubenswrapper[4954]: I1127 16:38:52.178222 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:52 crc kubenswrapper[4954]: I1127 16:38:52.178230 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:52 crc kubenswrapper[4954]: I1127 16:38:52.178261 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:52 crc kubenswrapper[4954]: I1127 16:38:52.178270 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:52Z","lastTransitionTime":"2025-11-27T16:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:52 crc kubenswrapper[4954]: I1127 16:38:52.281097 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:52 crc kubenswrapper[4954]: I1127 16:38:52.281161 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:52 crc kubenswrapper[4954]: I1127 16:38:52.281180 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:52 crc kubenswrapper[4954]: I1127 16:38:52.281205 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:52 crc kubenswrapper[4954]: I1127 16:38:52.281226 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:52Z","lastTransitionTime":"2025-11-27T16:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:52 crc kubenswrapper[4954]: I1127 16:38:52.384223 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:52 crc kubenswrapper[4954]: I1127 16:38:52.384299 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:52 crc kubenswrapper[4954]: I1127 16:38:52.384317 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:52 crc kubenswrapper[4954]: I1127 16:38:52.384350 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:52 crc kubenswrapper[4954]: I1127 16:38:52.384369 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:52Z","lastTransitionTime":"2025-11-27T16:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:52 crc kubenswrapper[4954]: I1127 16:38:52.488540 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:52 crc kubenswrapper[4954]: I1127 16:38:52.488634 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:52 crc kubenswrapper[4954]: I1127 16:38:52.488653 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:52 crc kubenswrapper[4954]: I1127 16:38:52.488684 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:52 crc kubenswrapper[4954]: I1127 16:38:52.488703 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:52Z","lastTransitionTime":"2025-11-27T16:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:52 crc kubenswrapper[4954]: I1127 16:38:52.591219 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:52 crc kubenswrapper[4954]: I1127 16:38:52.591284 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:52 crc kubenswrapper[4954]: I1127 16:38:52.591301 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:52 crc kubenswrapper[4954]: I1127 16:38:52.591326 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:52 crc kubenswrapper[4954]: I1127 16:38:52.591344 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:52Z","lastTransitionTime":"2025-11-27T16:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:52 crc kubenswrapper[4954]: I1127 16:38:52.693954 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:52 crc kubenswrapper[4954]: I1127 16:38:52.694060 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:52 crc kubenswrapper[4954]: I1127 16:38:52.694086 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:52 crc kubenswrapper[4954]: I1127 16:38:52.694112 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:52 crc kubenswrapper[4954]: I1127 16:38:52.694132 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:52Z","lastTransitionTime":"2025-11-27T16:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:52 crc kubenswrapper[4954]: I1127 16:38:52.796928 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:52 crc kubenswrapper[4954]: I1127 16:38:52.797001 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:52 crc kubenswrapper[4954]: I1127 16:38:52.797025 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:52 crc kubenswrapper[4954]: I1127 16:38:52.797057 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:52 crc kubenswrapper[4954]: I1127 16:38:52.797080 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:52Z","lastTransitionTime":"2025-11-27T16:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:52 crc kubenswrapper[4954]: I1127 16:38:52.899594 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:52 crc kubenswrapper[4954]: I1127 16:38:52.899667 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:52 crc kubenswrapper[4954]: I1127 16:38:52.899677 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:52 crc kubenswrapper[4954]: I1127 16:38:52.899693 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:52 crc kubenswrapper[4954]: I1127 16:38:52.899702 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:52Z","lastTransitionTime":"2025-11-27T16:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.003270 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.003339 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.003361 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.003396 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.003418 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:53Z","lastTransitionTime":"2025-11-27T16:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.106810 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.106913 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.106932 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.106958 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.106978 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:53Z","lastTransitionTime":"2025-11-27T16:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.210906 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.210975 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.210992 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.211022 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.211045 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:53Z","lastTransitionTime":"2025-11-27T16:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.314801 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.314883 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.314907 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.314937 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.315006 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:53Z","lastTransitionTime":"2025-11-27T16:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.418043 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.418400 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.418673 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.418829 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.418963 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:53Z","lastTransitionTime":"2025-11-27T16:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.523109 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.523185 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.523383 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.523402 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.523416 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:53Z","lastTransitionTime":"2025-11-27T16:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.627424 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.627476 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.627487 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.627507 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.627521 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:53Z","lastTransitionTime":"2025-11-27T16:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.661261 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.661302 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.661415 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.661641 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:38:53 crc kubenswrapper[4954]: E1127 16:38:53.661642 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:38:53 crc kubenswrapper[4954]: E1127 16:38:53.661826 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6" Nov 27 16:38:53 crc kubenswrapper[4954]: E1127 16:38:53.661973 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:38:53 crc kubenswrapper[4954]: E1127 16:38:53.662145 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.730284 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.730355 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.730374 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.730400 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.730419 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:53Z","lastTransitionTime":"2025-11-27T16:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.833426 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.833496 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.833516 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.833545 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.833567 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:53Z","lastTransitionTime":"2025-11-27T16:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.937032 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.937101 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.937124 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.937150 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:53 crc kubenswrapper[4954]: I1127 16:38:53.937167 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:53Z","lastTransitionTime":"2025-11-27T16:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.040931 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.041017 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.041039 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.041067 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.041092 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:54Z","lastTransitionTime":"2025-11-27T16:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.144813 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.144900 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.144926 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.144961 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.144987 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:54Z","lastTransitionTime":"2025-11-27T16:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.248498 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.248587 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.248643 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.248676 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.248697 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:54Z","lastTransitionTime":"2025-11-27T16:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.353013 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.353866 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.353960 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.354049 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.354159 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:54Z","lastTransitionTime":"2025-11-27T16:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.457659 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.457743 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.457773 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.457866 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.457896 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:54Z","lastTransitionTime":"2025-11-27T16:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.561877 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.561937 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.561954 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.561980 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.561999 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:54Z","lastTransitionTime":"2025-11-27T16:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.663534 4954 scope.go:117] "RemoveContainer" containerID="aa73b49f13468ea65ed5e0a36611f95071dadbbe2e7c2c1205d6bd4ae166da9c" Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.665492 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.665549 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.665566 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.665622 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.665647 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:54Z","lastTransitionTime":"2025-11-27T16:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.770156 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.770737 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.770758 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.770787 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.770808 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:54Z","lastTransitionTime":"2025-11-27T16:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.874556 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.874648 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.874666 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.874690 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.874708 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:54Z","lastTransitionTime":"2025-11-27T16:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.978271 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.978324 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.978335 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.978354 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:54 crc kubenswrapper[4954]: I1127 16:38:54.978369 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:54Z","lastTransitionTime":"2025-11-27T16:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.082368 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.082437 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.082455 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.082487 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.082511 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:55Z","lastTransitionTime":"2025-11-27T16:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.092622 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5zbp_c9c365fc-0cba-4fcf-b721-30de2b908a56/ovnkube-controller/1.log" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.098766 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" event={"ID":"c9c365fc-0cba-4fcf-b721-30de2b908a56","Type":"ContainerStarted","Data":"a064652ba1f70f1ee05a75805f65a7847485fc0552afd53a9776ae05da2f5368"} Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.099304 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.122716 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.143088 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lt9bl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f164460-f6b2-4383-9e5e-f4d0045d9690\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc3d4b30c41f8bbff3623b037109b7faca9e2438dfe7240a4fbf3c8fb8c27bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b56lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lt9bl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.148775 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.148826 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.148837 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.148856 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.148873 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:55Z","lastTransitionTime":"2025-11-27T16:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.162173 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9mb96" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5aabb55ded9f58e618e465b5ef892a9098df73cc03b0d2de615dbcb754cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mount
Path\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r96jj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9mb96\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:55 crc kubenswrapper[4954]: E1127 16:38:55.176711 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8805
1c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"070a8e98-7cab-4ad3-b09c-67172438041d\\\",\\\"systemUUID\\\":\\\"03003ca2-7417-4e94-98d9-1cf03e475029\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.182381 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j2bxm" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"474d40a8-ea36-4785-8818-6beb58074208\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711fd0edfdc1fc0465c22fd73cdce98005c371cb4a4662314c051add365cc3fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rcvbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75f0d7911572bda6bd48f347e24cddeea563f23cf84a4abd69f961b576999119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rcvbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j2bxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:55Z is after 
2025-08-24T17:21:41Z" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.184652 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.184720 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.184736 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.184755 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.184767 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:55Z","lastTransitionTime":"2025-11-27T16:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.202243 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b7cd63-bb9a-4c77-b67a-e72adc26393a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a
4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:38:22Z\\\",\\\"message\\\":\\\"W1127 16:38:11.939802 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 16:38:11.940051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764261491 cert, and key in /tmp/serving-cert-2393175808/serving-signer.crt, /tmp/serving-cert-2393175808/serving-signer.key\\\\nI1127 16:38:12.073962 1 observer_polling.go:159] Starting file observer\\\\nW1127 16:38:12.077982 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 16:38:12.078373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:38:12.081926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2393175808/tls.crt::/tmp/serving-cert-2393175808/tls.key\\\\\\\"\\\\nF1127 16:38:22.478599 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:55 crc kubenswrapper[4954]: E1127 16:38:55.208277 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"070a8e98-7cab-4ad3-b09c-67172438041d\\\",\\\"systemUUID\\\":\\\"03003ca2-7417-4e94-98d9-1cf03e475029\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.212158 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.212206 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.212220 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.212242 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.212261 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:55Z","lastTransitionTime":"2025-11-27T16:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
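The recurring Ready=False condition here is mechanical rather than incidental: the kubelet keeps the node NotReady until the container runtime reports NetworkReady=true, which in turn requires a CNI configuration file in /etc/kubernetes/cni/net.d/. The following Go sketch illustrates that readiness gate against the directory named in the log; the extension filter and output strings are illustrative assumptions, not the kubelet's actual implementation.

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// cniConfigPresent reports whether at least one CNI config file exists in dir.
// The extension set is an assumption; the real kubelet/CRI-O logic is more involved.
func cniConfigPresent(dir string) (bool, error) {
	entries, err := os.ReadDir(dir)
	if err != nil {
		return false, err
	}
	for _, e := range entries {
		if e.IsDir() {
			continue
		}
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			return true, nil
		}
	}
	return false, nil
}

func main() {
	dir := "/etc/kubernetes/cni/net.d" // directory named in the log
	ok, err := cniConfigPresent(dir)
	if err != nil || !ok {
		// Same shape as the kubelet's message in the log above.
		fmt.Printf("network plugin not ready: no CNI configuration file in %s/\n", dir)
		return
	}
	fmt.Println("NetworkReady=true")
}

On an OVN-Kubernetes cluster like this one, that configuration file is typically written by ovnkube-node once its controller is running, which is consistent with the ContainerStarted event for ovnkube-node-d5zbp recorded earlier in this log.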
Has your network provider started?"} Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.232499 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4633bf6a24c281dffedb23b6efec6dff41b512ca353a31a32c3988b523b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:55 crc kubenswrapper[4954]: E1127 16:38:55.235376 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"070a8e98-7cab-4ad3-b09c-67172438041d\\\",\\\"systemUUID\\\":\\\"0
3003ca2-7417-4e94-98d9-1cf03e475029\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.240015 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.240071 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.240083 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.240105 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.240119 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:55Z","lastTransitionTime":"2025-11-27T16:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.248864 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-27v67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df79f3c-9df0-48a0-980f-10ecadf5efd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80589bef6eb84e30399c60ede88844c7917afc5bc0a051e33ac307de7670ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn2f2\\\",\\\"readOnly\\\":true,\\\"recursi
veReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-27v67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:55 crc kubenswrapper[4954]: E1127 16:38:55.254828 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"070a8e98-7cab-4ad3-b09c-67172438041d\\\",\\\"systemUUID\\\":\\\"03003ca2-7417-4e94-98d9-1cf03e475029\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.264062 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.264133 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.264148 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.264174 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.264195 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:55Z","lastTransitionTime":"2025-11-27T16:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.271987 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed7ac545-28d1-4c54-9952-4b7845b4a475\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5f6e2fcbd93a30e7357a367e184a6f5c6c1af83f618e0fd0d724e51ba71ea08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dbb0d73cb9bddb6148625592ed1aac95ead1e2349f92fb8aba36ec714ed618e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-c
erts\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a1ddaf55a730a8e5a53ecff0eef2afd9786d3f249ac18b7b3e3e6649b65fe45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6a464ca56934b2a1b4e31b921d34c3f57d9aacbd965746db957882d36527e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:55 crc kubenswrapper[4954]: E1127 16:38:55.290268 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"070a8e98-7cab-4ad3-b09c-67172438041d\\\",\\\"systemUUID\\\":\\\"0
3003ca2-7417-4e94-98d9-1cf03e475029\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:55 crc kubenswrapper[4954]: E1127 16:38:55.290452 4954 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.292400 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.292472 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.292491 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.292521 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.292541 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:55Z","lastTransitionTime":"2025-11-27T16:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.297783 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e11dee9902e47c6d0e972a3b8f86123252f000b875f7dff8af31db48e69503d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.313802 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bd6ec80896ba1c7117ea88193af1f3b9aec353ab889d6864e0b221e4efdf428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72cc2fd437541de22aaa3130acadd5bd1eacd2e45ef0e12d55ce1877ac1965bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.327627 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.343830 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"536fc833-8add-426d-9ed0-b63547d316e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c35f962fb1464be093f6b3cc62d79b47d06468ed4c1885c42c1f3f49b911458b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cz8gx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.356093 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.367082 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a80574-7c60-4f19-985b-3ee313cb7bcd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3bfedfcafb3316fee81a8d1a6d9e4d8c530b7bbb10193341d5021a5acbbfe4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abf93a27d369fc02df1a4508748705f9bbad044d52db659f35896e60e7a8bdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-699qq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.386354 4954 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9c365fc-0cba-4fcf-b721-30de2b908a56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a064652ba1f70f1ee05a75805f65a7847485fc0552afd53a9776ae05da2f5368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa73b49f13468ea65ed5e0a36611f95071dadbbe2e7c2c1205d6bd4ae166da9c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"message\\\":\\\"cluster-version-operator\\\\\\\"}\\\\nI1127 16:38:41.013837 6368 services_controller.go:360] Finished syncing service oauth-openshift on namespace openshift-authentication for network=default : 2.851487ms\\\\nI1127 16:38:41.013842 6368 services_controller.go:360] Finished syncing service cluster-version-operator on namespace openshift-cluster-version for network=default : 2.53716ms\\\\nI1127 16:38:41.013889 6368 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1127 16:38:41.014033 6368 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1127 16:38:41.014077 6368 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1127 16:38:41.014084 6368 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1127 16:38:41.014136 6368 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1127 16:38:41.014195 6368 handler.go:208] Removed *v1.Node event handler 7\\\\nI1127 16:38:41.014243 6368 handler.go:208] Removed *v1.Node event handler 2\\\\nI1127 16:38:41.014298 6368 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1127 16:38:41.014308 6368 factory.go:656] Stopping watch factory\\\\nI1127 16:38:41.014397 6368 ovnkube.go:599] Stopped ovnkube\\\\nI1127 16:38:41.014471 6368 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1127 16:38:41.014653 6368 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5zbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.395233 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.395294 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.395314 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.395340 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.395357 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:55Z","lastTransitionTime":"2025-11-27T16:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.403205 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgsvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af5183f4-5f46-4d64-8ec4-c7b71530cad6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s6vq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s6vq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgsvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.498455 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.498502 4954 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.498512 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.498528 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.498540 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:55Z","lastTransitionTime":"2025-11-27T16:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.601626 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.601681 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.601690 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.601706 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.601718 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:55Z","lastTransitionTime":"2025-11-27T16:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.661716 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.661784 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.661828 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:38:55 crc kubenswrapper[4954]: E1127 16:38:55.661858 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.661731 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:38:55 crc kubenswrapper[4954]: E1127 16:38:55.662055 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6" Nov 27 16:38:55 crc kubenswrapper[4954]: E1127 16:38:55.662288 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:38:55 crc kubenswrapper[4954]: E1127 16:38:55.662269 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.703707 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.703748 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.703758 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.703772 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.703784 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:55Z","lastTransitionTime":"2025-11-27T16:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.806856 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.806913 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.806930 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.806953 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.806971 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:55Z","lastTransitionTime":"2025-11-27T16:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.909899 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.909983 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.910000 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.910025 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:55 crc kubenswrapper[4954]: I1127 16:38:55.910043 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:55Z","lastTransitionTime":"2025-11-27T16:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.012862 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.012934 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.012962 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.012994 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.013019 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:56Z","lastTransitionTime":"2025-11-27T16:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.107268 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5zbp_c9c365fc-0cba-4fcf-b721-30de2b908a56/ovnkube-controller/2.log" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.108277 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5zbp_c9c365fc-0cba-4fcf-b721-30de2b908a56/ovnkube-controller/1.log" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.112123 4954 generic.go:334] "Generic (PLEG): container finished" podID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerID="a064652ba1f70f1ee05a75805f65a7847485fc0552afd53a9776ae05da2f5368" exitCode=1 Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.112185 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" event={"ID":"c9c365fc-0cba-4fcf-b721-30de2b908a56","Type":"ContainerDied","Data":"a064652ba1f70f1ee05a75805f65a7847485fc0552afd53a9776ae05da2f5368"} Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.112240 4954 scope.go:117] "RemoveContainer" containerID="aa73b49f13468ea65ed5e0a36611f95071dadbbe2e7c2c1205d6bd4ae166da9c" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.113707 4954 scope.go:117] "RemoveContainer" containerID="a064652ba1f70f1ee05a75805f65a7847485fc0552afd53a9776ae05da2f5368" Nov 27 16:38:56 crc kubenswrapper[4954]: E1127 16:38:56.113988 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-d5zbp_openshift-ovn-kubernetes(c9c365fc-0cba-4fcf-b721-30de2b908a56)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.115347 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.115402 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.115424 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.115454 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.115477 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:56Z","lastTransitionTime":"2025-11-27T16:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.134736 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:56Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.153932 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lt9bl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f164460-f6b2-4383-9e5e-f4d0045d9690\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc3d4b30c41f8bbff3623b037109b7faca9e2438dfe7240a4fbf3c8fb8c27bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b56lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lt9bl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:56Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.171384 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b7cd63-bb9a-4c77-b67a-e72adc26393a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:38:22Z\\\",\\\"message\\\":\\\"W1127 16:38:11.939802 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 16:38:11.940051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764261491 cert, and key in /tmp/serving-cert-2393175808/serving-signer.crt, /tmp/serving-cert-2393175808/serving-signer.key\\\\nI1127 16:38:12.073962 1 observer_polling.go:159] Starting file observer\\\\nW1127 16:38:12.077982 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 16:38:12.078373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:38:12.081926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2393175808/tls.crt::/tmp/serving-cert-2393175808/tls.key\\\\\\\"\\\\nF1127 16:38:22.478599 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:56Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.191113 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4633bf6a24c281dffedb23b6efec6dff41b512ca353a31a32c3988b523b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:56Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.206309 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-27v67" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df79f3c-9df0-48a0-980f-10ecadf5efd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80589bef6eb84e30399c60ede88844c7917afc5bc0a051e33ac307de7670ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn2f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-27v67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:56Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.218533 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.218628 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.218649 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.218674 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.218695 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:56Z","lastTransitionTime":"2025-11-27T16:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.227405 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9mb96" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5aabb55ded9f58e618e465b5ef892a9098df73cc03b0d2de615dbcb754cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r96jj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9mb96\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:56Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.241773 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j2bxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"474d40a8-ea36-4785-8818-6beb58074208\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711fd0edfdc1fc0465c22fd73cdce98005c371cb4a4662314c051add365cc3fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rcvbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75f0d7911572bda6bd48f347e24cddeea563f23cf84a4abd69f961b576999119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-
access-rcvbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j2bxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:56Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.256344 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e11dee9902e47c6d0e972a3b8f86123252f000b875f7dff8af31db48e69503d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:56Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.271980 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bd6ec80896ba1c7117ea88193af1f3b9aec353ab889d6864e0b221e4efdf428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72cc2fd437541de22aaa3130acadd5bd1eacd2e45ef0e12d55ce1877ac1965bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:56Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.290841 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:56Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.310141 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"536fc833-8add-426d-9ed0-b63547d316e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c35f962fb1464be093f6b3cc62d79b47d06468ed4c1885c42c1f3f49b911458b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cz8gx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:56Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.323453 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.323528 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:56 crc 
kubenswrapper[4954]: I1127 16:38:56.323550 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.323583 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.323635 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:56Z","lastTransitionTime":"2025-11-27T16:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.333983 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed7ac545-28d1-4c54-9952-4b7845b4a475\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5f6e2fcbd93a30e7357a367e184a6f5c6c1af83f618e0fd0d724e51ba71ea08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dbb0d73cb9bddb6148625592ed1aac95ead1e2349f92fb8aba36ec714ed618e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"con
tainerID\\\":\\\"cri-o://7a1ddaf55a730a8e5a53ecff0eef2afd9786d3f249ac18b7b3e3e6649b65fe45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6a464ca56934b2a1b4e31b921d34c3f57d9aacbd965746db957882d36527e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:56Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.351515 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a80574-7c60-4f19-985b-3ee313cb7bcd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3bfedfcafb3316fee81a8d1a6d9e4d8c530b7bbb10193341d5021a5acbbfe4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abf93a27d369fc02df1a4508748705f9bbad044d52db659f35896e60e7a8bdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-699qq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:56Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.382437 4954 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9c365fc-0cba-4fcf-b721-30de2b908a56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a064652ba1f70f1ee05a75805f65a7847485fc0552afd53a9776ae05da2f5368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa73b49f13468ea65ed5e0a36611f95071dadbbe2e7c2c1205d6bd4ae166da9c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"message\\\":\\\"cluster-version-operator\\\\\\\"}\\\\nI1127 16:38:41.013837 6368 services_controller.go:360] Finished syncing service oauth-openshift on namespace openshift-authentication for network=default : 2.851487ms\\\\nI1127 16:38:41.013842 6368 services_controller.go:360] Finished syncing service cluster-version-operator on namespace openshift-cluster-version for network=default : 2.53716ms\\\\nI1127 16:38:41.013889 6368 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1127 16:38:41.014033 6368 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1127 16:38:41.014077 6368 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1127 16:38:41.014084 6368 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1127 16:38:41.014136 6368 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1127 16:38:41.014195 6368 handler.go:208] Removed *v1.Node event handler 7\\\\nI1127 16:38:41.014243 6368 handler.go:208] Removed *v1.Node event handler 2\\\\nI1127 16:38:41.014298 6368 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1127 16:38:41.014308 6368 factory.go:656] Stopping watch factory\\\\nI1127 16:38:41.014397 6368 ovnkube.go:599] Stopped ovnkube\\\\nI1127 16:38:41.014471 6368 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1127 16:38:41.014653 6368 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a064652ba1f70f1ee05a75805f65a7847485fc0552afd53a9776ae05da2f5368\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert 
Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1127 16:38:55.699330 6572 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1127 16:38:55.699374 6572 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1127 16:38:55.699414 6572 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1127 16:38:55.699503 6572 factory.go:1336] Added *v1.Node event handler 7\\\\nI1127 16:38:55.699560 6572 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1127 16:38:55.699876 6572 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1127 16:38:55.699985 6572 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1127 16:38:55.700033 6572 ovnkube.go:599] Stopped ovnkube\\\\nI1127 16:38:55.700080 6572 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1127 16:38:55.700174 6572 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb814f23f93f625afae8c1e1ae4
2910e8b49b8318ca6ad89dcda5405b0aa4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5zbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:56Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.405990 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgsvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af5183f4-5f46-4d64-8ec4-c7b71530cad6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s6vq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s6vq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgsvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:56Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.428543 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.428532 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:56Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.428631 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.428667 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.428699 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.428725 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:56Z","lastTransitionTime":"2025-11-27T16:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.531891 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.531964 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.531989 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.532021 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.532046 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:56Z","lastTransitionTime":"2025-11-27T16:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.635895 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.635938 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.635948 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.635966 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.635980 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:56Z","lastTransitionTime":"2025-11-27T16:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.739403 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.739481 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.739503 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.739531 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.739550 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:56Z","lastTransitionTime":"2025-11-27T16:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.843424 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.843488 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.843507 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.843535 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.843555 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:56Z","lastTransitionTime":"2025-11-27T16:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.946895 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.946961 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.946985 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.947017 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:56 crc kubenswrapper[4954]: I1127 16:38:56.947041 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:56Z","lastTransitionTime":"2025-11-27T16:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.019031 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.032042 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.038782 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lt9bl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f164460-f6b2-4383-9e5e-f4d0045d9690\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc3d4b30c41f8bbff3623b037109b7faca9e2438dfe7240a4fbf3c8fb8c27bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b56lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lt9bl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:57Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.050756 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.050836 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.050877 
4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.050915 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.050939 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:57Z","lastTransitionTime":"2025-11-27T16:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.061509 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:57Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.087816 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b7cd63-bb9a-4c77-b67a-e72adc26393a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:38:22Z\\\",\\\"message\\\":\\\"W1127 16:38:11.939802 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 16:38:11.940051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764261491 cert, and key in /tmp/serving-cert-2393175808/serving-signer.crt, /tmp/serving-cert-2393175808/serving-signer.key\\\\nI1127 16:38:12.073962 1 observer_polling.go:159] Starting file observer\\\\nW1127 16:38:12.077982 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 16:38:12.078373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:38:12.081926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2393175808/tls.crt::/tmp/serving-cert-2393175808/tls.key\\\\\\\"\\\\nF1127 16:38:22.478599 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:57Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.109717 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4633bf6a24c281dffedb23b6efec6dff41b512ca353a31a32c3988b523b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:57Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.119115 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5zbp_c9c365fc-0cba-4fcf-b721-30de2b908a56/ovnkube-controller/2.log" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.122987 4954 scope.go:117] "RemoveContainer" containerID="a064652ba1f70f1ee05a75805f65a7847485fc0552afd53a9776ae05da2f5368" Nov 27 16:38:57 crc kubenswrapper[4954]: E1127 16:38:57.123229 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-d5zbp_openshift-ovn-kubernetes(c9c365fc-0cba-4fcf-b721-30de2b908a56)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.124900 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-27v67" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df79f3c-9df0-48a0-980f-10ecadf5efd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80589bef6eb84e30399c60ede88844c7917afc5bc0a051e33ac307de7670ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn2f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-27v67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:57Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.140388 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9mb96" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5aabb55ded9f58e618e465b5ef892a9098df73cc03b0d2de615dbcb754cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r96jj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9mb96\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:57Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.153097 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.153132 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.153143 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.153159 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.153170 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:57Z","lastTransitionTime":"2025-11-27T16:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.153764 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j2bxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"474d40a8-ea36-4785-8818-6beb58074208\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711fd0edfdc1fc0465c22fd73cdce98005c371cb4a4662314c051add365cc3fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rcvbt\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75f0d7911572bda6bd48f347e24cddeea563f23cf84a4abd69f961b576999119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rcvbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j2bxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:57Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.165376 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bd6ec80896ba1c7117ea88193af1f3b9aec353ab889d6864e0b221e4efdf428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72cc2fd437541de22aaa3130acadd5bd1eacd2e45ef0e12d55ce1877ac1965bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:57Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.177691 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:57Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.190545 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"536fc833-8add-426d-9ed0-b63547d316e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c35f962fb1464be093f6b3cc62d79b47d06468ed4c1885c42c1f3f49b911458b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cz8gx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:57Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.208851 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed7ac545-28d1-4c54-9952-4b7845b4a475\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5f6e2fcbd93a30e7357a367e184a6f5c6c1af83f618e0fd0d724e51ba71ea08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dbb0d73cb9bddb6148625592ed1aac95ead1e2349f92fb8aba36ec714ed618e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a1ddaf55a730a8e5a53ecff0eef2afd9786d3f249ac18b7b3e3e6649b65fe45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6a464ca56934b2a1b4e31b921d34c3f57d9aacbd965746db957882d36527e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:57Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.223042 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e11dee9902e47c6d0e972a3b8f86123252f000b875f7dff8af31db48e69503d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-11-27T16:38:57Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.243458 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9c365fc-0cba-4fcf-b721-30de2b908a56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257
453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a064652ba1f70f1ee05a75805f65a7847485fc0552afd53a9776ae05da2f5368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa73b49f13468ea65ed5e0a36611f95071dadbbe2e7c2c1205d6bd4ae166da9c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"message\\\":\\\"cluster-version-operator\\\\\\\"}\\\\nI1127 16:38:41.013837 6368 services_controller.go:360] Finished syncing service oauth-openshift on namespace openshift-authentication for network=default : 2.851487ms\\\\nI1127 16:38:41.013842 6368 services_controller.go:360] Finished syncing service cluster-version-operator on namespace openshift-cluster-version for network=default : 2.53716ms\\\\nI1127 16:38:41.013889 6368 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1127 16:38:41.014033 6368 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1127 16:38:41.014077 6368 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1127 16:38:41.014084 6368 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1127 16:38:41.014136 6368 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1127 16:38:41.014195 6368 handler.go:208] Removed *v1.Node event handler 7\\\\nI1127 16:38:41.014243 6368 handler.go:208] Removed *v1.Node event handler 2\\\\nI1127 16:38:41.014298 6368 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1127 16:38:41.014308 6368 factory.go:656] Stopping watch factory\\\\nI1127 16:38:41.014397 6368 ovnkube.go:599] Stopped ovnkube\\\\nI1127 16:38:41.014471 6368 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1127 16:38:41.014653 6368 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a064652ba1f70f1ee05a75805f65a7847485fc0552afd53a9776ae05da2f5368\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1127 16:38:55.699330 6572 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1127 16:38:55.699374 6572 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1127 16:38:55.699414 6572 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1127 16:38:55.699503 6572 factory.go:1336] Added *v1.Node event handler 7\\\\nI1127 16:38:55.699560 6572 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1127 16:38:55.699876 6572 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1127 16:38:55.699985 6572 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1127 16:38:55.700033 6572 ovnkube.go:599] Stopped ovnkube\\\\nI1127 16:38:55.700080 6572 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1127 16:38:55.700174 6572 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd4
7ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5zbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:57Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.255665 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgsvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af5183f4-5f46-4d64-8ec4-c7b71530cad6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s6vq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s6vq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgsvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:57Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.255912 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.255950 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.255962 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.255981 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.255996 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:57Z","lastTransitionTime":"2025-11-27T16:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
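[analysis note] The NodeNotReady condition above appears to be downstream of the ovnkube-controller crash loop recorded earlier in this window: with the controller exiting (exitCode 1), no CNI configuration ever lands on disk, so the kubelet reports NetworkPluginNotReady. A minimal Go sketch (an illustrative one-off for the node, not part of the kubelet; only the directory path is taken from the log) that checks the directory named in the message:

```go
// cni_check.go - sketch: does the CNI conf directory the kubelet complains
// about actually contain any network configuration?
package main

import (
	"fmt"
	"os"
	"path/filepath"
	"strings"
)

func main() {
	// Directory named in the NetworkPluginNotReady message above.
	dir := "/etc/kubernetes/cni/net.d"
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Printf("cannot read %s: %v\n", dir, err)
		os.Exit(1)
	}
	found := 0
	for _, e := range entries {
		// libcni (used by CRI-O and the kubelet) loads *.conf, *.conflist
		// and *.json files from this directory.
		switch strings.ToLower(filepath.Ext(e.Name())) {
		case ".conf", ".conflist", ".json":
			fmt.Println("found CNI config:", filepath.Join(dir, e.Name()))
			found++
		}
	}
	if found == 0 {
		fmt.Println("no CNI configuration files present; matches NetworkReady=false")
	}
}
```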
Has your network provider started?"} Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.269243 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:57Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.278320 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a80574-7c60-4f19-985b-3ee313cb7bcd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3bfedfcafb3316fee81a8d1a6d9e4d8c530b7bbb10193341d5021a5acbbfe4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abf93a27d369fc02df1a4508748705f9bbad044d52db659f35896e60e7a8bdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-699qq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:57Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.295032 4954 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed7ac545-28d1-4c54-9952-4b7845b4a475\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5f6e2fcbd93a30e7357a367e184a6f5c6c1af83f618e0fd0d724e51ba71ea08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dbb0d73cb9bddb6148625592ed1aac95ead1e2349f92fb8aba36ec714ed618e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a1ddaf55a730a8e5a53ecff0eef2afd9786d3f249ac18b7b3e3e6649b65fe45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6a464ca56934b2a1b4e31b921
d34c3f57d9aacbd965746db957882d36527e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:57Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.305598 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e11dee9902e47c6d0e972a3b8f86123252f000b875f7dff8af31db48e69503d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:57Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.323252 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bd6ec80896ba1c7117ea88193af1f3b9aec353ab889d6864e0b221e4efdf428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72cc2fd437541de22aaa3130acadd5bd1eacd2e45ef0e12d55ce1877ac1965bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:57Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:57 crc kubenswrapper[4954]: 
I1127 16:38:57.339510 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:57Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.359311 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.359372 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.359391 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.359420 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.359439 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:57Z","lastTransitionTime":"2025-11-27T16:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
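[analysis note] Every status patch in this window fails against the same endpoint: Post https://127.0.0.1:9743/pod is rejected because the network-node-identity webhook is serving a certificate that expired on 2025-08-24T17:21:41Z, months before the current time in the log. A short Go sketch (a diagnostic one-off, assuming it runs on the node itself; the address is copied verbatim from the error) that fetches the served certificate and prints its validity window:

```go
// webhook_cert_check.go - sketch: confirm the x509 expiry the kubelet reports.
package main

import (
	"crypto/tls"
	"fmt"
	"os"
	"time"
)

func main() {
	// InsecureSkipVerify is deliberate: we want to inspect the certificate
	// even though verification fails, exactly as the kubelet's error shows.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		fmt.Printf("dial failed: %v\n", err)
		os.Exit(1)
	}
	defer conn.Close()

	now := time.Now().UTC()
	for _, cert := range conn.ConnectionState().PeerCertificates {
		fmt.Printf("subject=%s\n  notBefore=%s\n  notAfter=%s\n  expired=%v\n",
			cert.Subject,
			cert.NotBefore.UTC().Format(time.RFC3339),
			cert.NotAfter.UTC().Format(time.RFC3339),
			now.After(cert.NotAfter))
	}
}
```

On CRC this pattern typically shows up when the cluster is started long after its bundled certificates were minted; the cluster normally rotates them itself once the kubelet and operators settle, at which point these webhook failures stop.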
Has your network provider started?"} Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.366174 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"536fc833-8add-426d-9ed0-b63547d316e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c35f962fb1464be093f6b3cc62d79b47d06468ed4c1885c42c1f3f49b911458b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cz8gx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:57Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.383013 4954 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:57Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.396707 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a80574-7c60-4f19-985b-3ee313cb7bcd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3bfedfcafb3316fee81a8d1a6d9e4d8c530b7bbb10193341d5021a5acbbfe4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abf93a27d369fc02df1a4508748705f9bbad044d52db659f35896e60e7a8bdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-699qq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:57Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.414273 4954 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9c365fc-0cba-4fcf-b721-30de2b908a56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a064652ba1f70f1ee05a75805f65a7847485fc0552afd53a9776ae05da2f5368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a064652ba1f70f1ee05a75805f65a7847485fc0552afd53a9776ae05da2f5368\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1127 16:38:55.699330 6572 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1127 16:38:55.699374 6572 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1127 16:38:55.699414 6572 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1127 16:38:55.699503 6572 factory.go:1336] Added *v1.Node event handler 7\\\\nI1127 16:38:55.699560 6572 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1127 16:38:55.699876 6572 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1127 16:38:55.699985 6572 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1127 16:38:55.700033 6572 ovnkube.go:599] Stopped ovnkube\\\\nI1127 16:38:55.700080 6572 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1127 16:38:55.700174 6572 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d5zbp_openshift-ovn-kubernetes(c9c365fc-0cba-4fcf-b721-30de2b908a56)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5zbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:57Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.426187 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgsvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af5183f4-5f46-4d64-8ec4-c7b71530cad6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s6vq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s6vq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgsvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:57Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.439832 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:57Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.451395 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lt9bl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f164460-f6b2-4383-9e5e-f4d0045d9690\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc3d4b30c41f8bbff3623b037109b7faca9e2438dfe7240a4fbf3c8fb8c27bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b56lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lt9bl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:57Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.461952 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.461988 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.461999 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.462016 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.462029 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:57Z","lastTransitionTime":"2025-11-27T16:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.466569 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4942b2dc-bb0b-485a-84d6-eeaaaa834d91\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94982bc29f0ee44235509ce47bb0790994962a450b2e27e418f351a3643d885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28bc02faf2534dbf38fbc116fb6b51a528297719f7de0f40d1c9374199391eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd7
89a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7209ac0080d25aaf9cfaba43b4cb35e5c36f015b52469a211b65f4a53a2dcd23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11dd7fc77c9df494e9dac3fd605b1dc7a342fe3fe853a18260a68d29f82738e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11dd7fc77c9df494e9dac3fd605b1dc7a342fe3fe853a18260a68d29f82738e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:57Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.483460 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b7cd63-bb9a-4c77-b67a-e72adc26393a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:38:22Z\\\",\\\"message\\\":\\\"W1127 16:38:11.939802 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 16:38:11.940051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764261491 cert, and key in /tmp/serving-cert-2393175808/serving-signer.crt, /tmp/serving-cert-2393175808/serving-signer.key\\\\nI1127 16:38:12.073962 1 observer_polling.go:159] Starting file observer\\\\nW1127 16:38:12.077982 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 16:38:12.078373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:38:12.081926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2393175808/tls.crt::/tmp/serving-cert-2393175808/tls.key\\\\\\\"\\\\nF1127 16:38:22.478599 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:57Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.502677 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4633bf6a24c281dffedb23b6efec6dff41b512ca353a31a32c3988b523b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:57Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.516870 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-27v67" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df79f3c-9df0-48a0-980f-10ecadf5efd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80589bef6eb84e30399c60ede88844c7917afc5bc0a051e33ac307de7670ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn2f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-27v67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:57Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.532265 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9mb96" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5aabb55ded9f58e618e465b5ef892a9098df73cc03b0d2de615dbcb754cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r96jj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9mb96\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:57Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.544534 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j2bxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"474d40a8-ea36-4785-8818-6beb58074208\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711fd0edfdc1fc0465c22fd73cdce98005c371cb4a4662314c051add365cc3fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rcvbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75f0d7911572bda6bd48f347e24cddeea563f23cf84a4abd69f961b576999119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rcvbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j2bxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:57Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.564511 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.564554 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.564564 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.564607 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.564621 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:57Z","lastTransitionTime":"2025-11-27T16:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.661442 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.661442 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.661501 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:38:57 crc kubenswrapper[4954]: E1127 16:38:57.662055 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:38:57 crc kubenswrapper[4954]: E1127 16:38:57.661881 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.661550 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 27 16:38:57 crc kubenswrapper[4954]: E1127 16:38:57.662136 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6"
Nov 27 16:38:57 crc kubenswrapper[4954]: E1127 16:38:57.662410 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.667089 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.667260 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.667355 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.667482 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.667628 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:57Z","lastTransitionTime":"2025-11-27T16:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.770811 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.771138 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.771270 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.771409 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.771726 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:57Z","lastTransitionTime":"2025-11-27T16:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.874913 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.874996 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.875018 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.875053 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.875084 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:57Z","lastTransitionTime":"2025-11-27T16:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.979834 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.979900 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.979921 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.979949 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:38:57 crc kubenswrapper[4954]: I1127 16:38:57.979971 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:57Z","lastTransitionTime":"2025-11-27T16:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.082534 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.082655 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.082676 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.082707 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.082731 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:58Z","lastTransitionTime":"2025-11-27T16:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.185670 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.185715 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.185731 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.185751 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.185765 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:58Z","lastTransitionTime":"2025-11-27T16:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.288516 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.288914 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.289052 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.289221 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.289344 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:58Z","lastTransitionTime":"2025-11-27T16:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.392801 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.393342 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.393443 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.393539 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.393633 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:58Z","lastTransitionTime":"2025-11-27T16:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.496869 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.497161 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.497310 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.497421 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.497517 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:58Z","lastTransitionTime":"2025-11-27T16:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.601952 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.602035 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.602059 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.602091 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.602113 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:58Z","lastTransitionTime":"2025-11-27T16:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.685624 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed7ac545-28d1-4c54-9952-4b7845b4a475\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5f6e2fcbd93a30e7357a367e184a6f5c6c1af83f618e0fd0d724e51ba71ea08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dbb0d73cb9bddb6148625592ed1aac95ead1e2349f92fb8aba36ec714ed618e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a1ddaf55a730a8e5a53ecff0eef2afd9786d3f249ac18b7b3e3e6649b65fe45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6a464ca56934b2a1b4e31b921d34c3f57d9aacbd965746db957882d36527e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:58Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.705751 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e11dee9902e47c6d0e972a3b8f86123252f000b875f7dff8af31db48e69503d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:58Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.707035 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.709242 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.709475 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.709876 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.710115 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:58Z","lastTransitionTime":"2025-11-27T16:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.726698 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bd6ec80896ba1c7117ea88193af1f3b9aec353ab889d6864e0b221e4efdf428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72cc2fd437541de22aaa3130acadd5bd1eacd2e45ef0e12d55ce1877ac1965bc\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:58Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.749713 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:58Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.772800 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"536fc833-8add-426d-9ed0-b63547d316e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c35f962fb1464be093f6b3cc62d79b47d06468ed4c1885c42c1f3f49b911458b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"
system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cz8gx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:58Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.795547 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:58Z is after 2025-08-24T17:21:41Z"
Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.814303 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.814355 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.814367 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.814387 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.814401 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:58Z","lastTransitionTime":"2025-11-27T16:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.817653 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a80574-7c60-4f19-985b-3ee313cb7bcd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3bfedfcafb3316fee81a8d1a6d9e4d8c530b7bbb10193341d5021a5acbbfe4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abf93a27d369fc02df1a4508748705f9bbad044d52db659f35896e60e7a8bdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-699qq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:58Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.845644 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9c365fc-0cba-4fcf-b721-30de2b908a56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a064652ba1f70f1ee05a75805f65a7847485fc0552afd53a9776ae05da2f5368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a064652ba1f70f1ee05a75805f65a7847485fc0552afd53a9776ae05da2f5368\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1127 16:38:55.699330 6572 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1127 16:38:55.699374 6572 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1127 16:38:55.699414 6572 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1127 16:38:55.699503 6572 factory.go:1336] Added *v1.Node event handler 7\\\\nI1127 16:38:55.699560 6572 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1127 16:38:55.699876 6572 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1127 16:38:55.699985 6572 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1127 16:38:55.700033 6572 ovnkube.go:599] Stopped ovnkube\\\\nI1127 16:38:55.700080 6572 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1127 16:38:55.700174 6572 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-d5zbp_openshift-ovn-kubernetes(c9c365fc-0cba-4fcf-b721-30de2b908a56)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"r
ecursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5zbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:58Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.863174 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgsvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af5183f4-5f46-4d64-8ec4-c7b71530cad6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s6vq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s6vq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgsvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:58Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.881635 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:58Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.899728 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lt9bl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f164460-f6b2-4383-9e5e-f4d0045d9690\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc3d4b30c41f8bbff3623b037109b7faca9e2438dfe7240a4fbf3c8fb8c27bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b56lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lt9bl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:58Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.917637 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.917705 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.917724 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.917753 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.917773 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:58Z","lastTransitionTime":"2025-11-27T16:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.921495 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9mb96" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5aabb55ded9f58e618e465b5ef892a9098df73cc03b0d2de615dbcb754cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mount
Path\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r96jj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9mb96\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:58Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.941927 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j2bxm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"474d40a8-ea36-4785-8818-6beb58074208\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711fd0edfdc1fc0465c22fd73cdce98005c371cb4a4662314c051add365cc3fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rcvbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75f0d7911572bda6bd48f347e24cddeea563f23cf84a4abd69f961b576999119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rcvbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j2bxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:58Z is after 2025-08-24T17:21:41Z" Nov 27 
16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.963890 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4942b2dc-bb0b-485a-84d6-eeaaaa834d91\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94982bc29f0ee44235509ce47bb0790994962a450b2e27e418f351a3643d885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28bc02faf2534dbf38fbc116fb6b51a528297719f7de0f40d1c9374199391eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7209ac0080d25aaf9cfaba43b4cb35e5c36f015b52469a211b65f4a53a2dcd23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11dd7fc77c9df494e9dac3fd605b1dc7a342fe3fe853a18260a68d29f82738e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11dd7fc77c9df494e9dac3fd605b1dc7a342fe3fe853a18260a68d29f82738e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:58Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:58 crc kubenswrapper[4954]: I1127 16:38:58.988401 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b7cd63-bb9a-4c77-b67a-e72adc26393a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\\\",\\\"image\\\":\\\"quay.io/cr
cont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:38:22Z\\\",\\\"message\\\":\\\"W1127 16:38:11.939802 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 16:38:11.940051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764261491 cert, and key in /tmp/serving-cert-2393175808/serving-signer.crt, /tmp/serving-cert-2393175808/serving-signer.key\\\\nI1127 16:38:12.073962 1 observer_polling.go:159] Starting file observer\\\\nW1127 16:38:12.077982 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 16:38:12.078373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:38:12.081926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2393175808/tls.crt::/tmp/serving-cert-2393175808/tls.key\\\\\\\"\\\\nF1127 16:38:22.478599 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:58Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.016638 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4633bf6a24c281dffedb23b6efec6dff41b512ca353a31a32c3988b523b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:59Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.022019 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.022249 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.022421 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.022560 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.022731 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:59Z","lastTransitionTime":"2025-11-27T16:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.035934 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-27v67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df79f3c-9df0-48a0-980f-10ecadf5efd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80589bef6eb84e30399c60ede88844c7917afc5bc0a051e33ac307de7670ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn2f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-27v67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:38:59Z is after 2025-08-24T17:21:41Z" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.126899 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.126961 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.126982 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.127009 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.127031 4954 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:59Z","lastTransitionTime":"2025-11-27T16:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.232419 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.232937 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.233189 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.233449 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.233688 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:59Z","lastTransitionTime":"2025-11-27T16:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.338714 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.339122 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.339266 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.339413 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.339537 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:59Z","lastTransitionTime":"2025-11-27T16:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.442273 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.443046 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.443297 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.443543 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.443845 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:59Z","lastTransitionTime":"2025-11-27T16:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.548718 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.548786 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.548809 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.548851 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.548872 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:59Z","lastTransitionTime":"2025-11-27T16:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.652516 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.652873 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.652970 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.653064 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.653143 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:59Z","lastTransitionTime":"2025-11-27T16:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.661828 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.662021 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:38:59 crc kubenswrapper[4954]: E1127 16:38:59.662054 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.661863 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:38:59 crc kubenswrapper[4954]: E1127 16:38:59.662293 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6" Nov 27 16:38:59 crc kubenswrapper[4954]: E1127 16:38:59.662408 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.662553 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:38:59 crc kubenswrapper[4954]: E1127 16:38:59.662770 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.756533 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.756621 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.756638 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.756662 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.756681 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:59Z","lastTransitionTime":"2025-11-27T16:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.859892 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.859946 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.859965 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.859988 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.860006 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:59Z","lastTransitionTime":"2025-11-27T16:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.874939 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/af5183f4-5f46-4d64-8ec4-c7b71530cad6-metrics-certs\") pod \"network-metrics-daemon-hgsvh\" (UID: \"af5183f4-5f46-4d64-8ec4-c7b71530cad6\") " pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:38:59 crc kubenswrapper[4954]: E1127 16:38:59.875115 4954 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 27 16:38:59 crc kubenswrapper[4954]: E1127 16:38:59.875223 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af5183f4-5f46-4d64-8ec4-c7b71530cad6-metrics-certs podName:af5183f4-5f46-4d64-8ec4-c7b71530cad6 nodeName:}" failed. No retries permitted until 2025-11-27 16:39:15.875189401 +0000 UTC m=+67.892629731 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/af5183f4-5f46-4d64-8ec4-c7b71530cad6-metrics-certs") pod "network-metrics-daemon-hgsvh" (UID: "af5183f4-5f46-4d64-8ec4-c7b71530cad6") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.962944 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.962998 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.963017 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.963041 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.963059 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:38:59Z","lastTransitionTime":"2025-11-27T16:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.975671 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:38:59 crc kubenswrapper[4954]: I1127 16:38:59.975868 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:38:59 crc kubenswrapper[4954]: E1127 16:38:59.976031 4954 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 27 16:38:59 crc kubenswrapper[4954]: E1127 16:38:59.976041 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:39:31.97601 +0000 UTC m=+83.993450340 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:38:59 crc kubenswrapper[4954]: E1127 16:38:59.976108 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 16:39:31.976089262 +0000 UTC m=+83.993529602 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.067103 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.067164 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.067181 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.067211 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.067230 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:00Z","lastTransitionTime":"2025-11-27T16:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.076306 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.076393 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.076450 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:39:00 crc kubenswrapper[4954]: E1127 16:39:00.076647 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 27 16:39:00 crc kubenswrapper[4954]: E1127 16:39:00.076673 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 27 16:39:00 crc kubenswrapper[4954]: E1127 16:39:00.076695 4954 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:39:00 crc kubenswrapper[4954]: E1127 16:39:00.076777 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-27 16:39:32.076745968 +0000 UTC m=+84.094186308 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 27 16:39:00 crc kubenswrapper[4954]: E1127 16:39:00.076906 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Nov 27 16:39:00 crc kubenswrapper[4954]: E1127 16:39:00.076948 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Nov 27 16:39:00 crc kubenswrapper[4954]: E1127 16:39:00.077035 4954 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 27 16:39:00 crc kubenswrapper[4954]: E1127 16:39:00.077132 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-27 16:39:32.077100926 +0000 UTC m=+84.094541266 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 27 16:39:00 crc kubenswrapper[4954]: E1127 16:39:00.077913 4954 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Nov 27 16:39:00 crc kubenswrapper[4954]: E1127 16:39:00.078006 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 16:39:32.077985607 +0000 UTC m=+84.095425937 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.170478 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.170829 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.170970 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.171141 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.171288 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:00Z","lastTransitionTime":"2025-11-27T16:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.274973 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.275066 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.275087 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.275127 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.275149 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:00Z","lastTransitionTime":"2025-11-27T16:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.379209 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.379293 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.379314 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.379344 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.379367 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:00Z","lastTransitionTime":"2025-11-27T16:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.482051 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.482112 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.482135 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.482167 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.482188 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:00Z","lastTransitionTime":"2025-11-27T16:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.584896 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.584975 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.584995 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.585023 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.585046 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:00Z","lastTransitionTime":"2025-11-27T16:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.689108 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.689187 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.689205 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.689231 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.689249 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:00Z","lastTransitionTime":"2025-11-27T16:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.791871 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.791930 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.791939 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.791971 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.791983 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:00Z","lastTransitionTime":"2025-11-27T16:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.895904 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.895972 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.895989 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.896013 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.896034 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:00Z","lastTransitionTime":"2025-11-27T16:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.999368 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.999798 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.999820 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:00 crc kubenswrapper[4954]: I1127 16:39:00.999848 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:00.999874 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:00Z","lastTransitionTime":"2025-11-27T16:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.103212 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.103295 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.103319 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.103348 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.103377 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:01Z","lastTransitionTime":"2025-11-27T16:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.208201 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.208309 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.208327 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.208358 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.208382 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:01Z","lastTransitionTime":"2025-11-27T16:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.313492 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.313650 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.313671 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.313697 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.313716 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:01Z","lastTransitionTime":"2025-11-27T16:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.416944 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.417023 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.417048 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.417081 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.417105 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:01Z","lastTransitionTime":"2025-11-27T16:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.519848 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.519926 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.519949 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.519985 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.520045 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:01Z","lastTransitionTime":"2025-11-27T16:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.622776 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.622833 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.622850 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.622873 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.622891 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:01Z","lastTransitionTime":"2025-11-27T16:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.661325 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.661368 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.661352 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.661429 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh"
Nov 27 16:39:01 crc kubenswrapper[4954]: E1127 16:39:01.661617 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 27 16:39:01 crc kubenswrapper[4954]: E1127 16:39:01.661852 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6"
Nov 27 16:39:01 crc kubenswrapper[4954]: E1127 16:39:01.661958 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 27 16:39:01 crc kubenswrapper[4954]: E1127 16:39:01.662116 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.726787 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.726852 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.726877 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.726907 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.726930 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:01Z","lastTransitionTime":"2025-11-27T16:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.830250 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.830349 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.830376 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.830408 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.830430 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:01Z","lastTransitionTime":"2025-11-27T16:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.935174 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.935266 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.935290 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.935324 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:01 crc kubenswrapper[4954]: I1127 16:39:01.935350 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:01Z","lastTransitionTime":"2025-11-27T16:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.039348 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.039450 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.039475 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.039507 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.039530 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:02Z","lastTransitionTime":"2025-11-27T16:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.143343 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.143416 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.143433 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.143459 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.143481 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:02Z","lastTransitionTime":"2025-11-27T16:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.247144 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.247244 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.247268 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.247301 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.247324 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:02Z","lastTransitionTime":"2025-11-27T16:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.350026 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.350086 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.350103 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.350128 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.350145 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:02Z","lastTransitionTime":"2025-11-27T16:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.453677 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.453733 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.453745 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.453765 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.453779 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:02Z","lastTransitionTime":"2025-11-27T16:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.557172 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.557288 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.557310 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.557352 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.557377 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:02Z","lastTransitionTime":"2025-11-27T16:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.659973 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.660032 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.660042 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.660062 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.660073 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:02Z","lastTransitionTime":"2025-11-27T16:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.763115 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.763193 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.763214 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.763244 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.763270 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:02Z","lastTransitionTime":"2025-11-27T16:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.867208 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.867269 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.867288 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.867314 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.867334 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:02Z","lastTransitionTime":"2025-11-27T16:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.970566 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.970639 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.970649 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.970664 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:02 crc kubenswrapper[4954]: I1127 16:39:02.970868 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:02Z","lastTransitionTime":"2025-11-27T16:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.074521 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.074573 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.074601 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.074621 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.074635 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:03Z","lastTransitionTime":"2025-11-27T16:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.177905 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.178019 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.178043 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.178076 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.178100 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:03Z","lastTransitionTime":"2025-11-27T16:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.280760 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.280841 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.280862 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.280889 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.280916 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:03Z","lastTransitionTime":"2025-11-27T16:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.384378 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.384440 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.384456 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.384483 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.384505 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:03Z","lastTransitionTime":"2025-11-27T16:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.487928 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.488026 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.488045 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.488077 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.488098 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:03Z","lastTransitionTime":"2025-11-27T16:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.590833 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.590908 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.590928 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.590969 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.590988 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:03Z","lastTransitionTime":"2025-11-27T16:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.661385 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.661450 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.661490 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.661632 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh"
Nov 27 16:39:03 crc kubenswrapper[4954]: E1127 16:39:03.661637 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 27 16:39:03 crc kubenswrapper[4954]: E1127 16:39:03.661829 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 27 16:39:03 crc kubenswrapper[4954]: E1127 16:39:03.661945 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6"
Nov 27 16:39:03 crc kubenswrapper[4954]: E1127 16:39:03.662087 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.694356 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.694407 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.694425 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.694453 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.694472 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:03Z","lastTransitionTime":"2025-11-27T16:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.797319 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.797383 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.797400 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.797429 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.797450 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:03Z","lastTransitionTime":"2025-11-27T16:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.900901 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.900964 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.900984 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.901009 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:03 crc kubenswrapper[4954]: I1127 16:39:03.901026 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:03Z","lastTransitionTime":"2025-11-27T16:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.004233 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.004331 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.004357 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.004395 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.004419 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:04Z","lastTransitionTime":"2025-11-27T16:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.113701 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.113779 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.113799 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.113824 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.113844 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:04Z","lastTransitionTime":"2025-11-27T16:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.217597 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.217646 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.217656 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.217673 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.217684 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:04Z","lastTransitionTime":"2025-11-27T16:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.321411 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.321456 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.321468 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.321486 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.321499 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:04Z","lastTransitionTime":"2025-11-27T16:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.425229 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.425269 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.425282 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.425298 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.425308 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:04Z","lastTransitionTime":"2025-11-27T16:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.528619 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.528666 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.528678 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.528697 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.528710 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:04Z","lastTransitionTime":"2025-11-27T16:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.632976 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.633049 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.633067 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.633094 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.633114 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:04Z","lastTransitionTime":"2025-11-27T16:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Has your network provider started?"} Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.737022 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.737092 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.737115 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.737147 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.737166 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:04Z","lastTransitionTime":"2025-11-27T16:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.840507 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.840706 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.840731 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.840765 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.840788 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:04Z","lastTransitionTime":"2025-11-27T16:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.943894 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.943964 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.943981 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.944013 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:04 crc kubenswrapper[4954]: I1127 16:39:04.944031 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:04Z","lastTransitionTime":"2025-11-27T16:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.048089 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.048156 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.048174 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.048203 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.048224 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:05Z","lastTransitionTime":"2025-11-27T16:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.151290 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.151346 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.151363 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.151389 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.151409 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:05Z","lastTransitionTime":"2025-11-27T16:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.255032 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.255103 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.255126 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.255157 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.255182 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:05Z","lastTransitionTime":"2025-11-27T16:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.344486 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.344561 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.344617 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.344646 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.344665 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:05Z","lastTransitionTime":"2025-11-27T16:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:05 crc kubenswrapper[4954]: E1127 16:39:05.366704 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"070a8e98-7cab-4ad3-b09c-67172438041d\\\",\\\"systemUUID\\\":\\\"03003ca2-7417-4e94-98d9-1cf03e475029\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:05Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.372903 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.372955 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.372971 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.372994 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.373013 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:05Z","lastTransitionTime":"2025-11-27T16:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:05 crc kubenswrapper[4954]: E1127 16:39:05.394416 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"070a8e98-7cab-4ad3-b09c-67172438041d\\\",\\\"systemUUID\\\":\\\"03003ca2-7417-4e94-98d9-1cf03e475029\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:05Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.399751 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.399835 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
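Every retry of the node-status patch in this log fails the same way: the apiserver cannot call the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 because its serving certificate expired on 2025-08-24T17:21:41Z while the node clock reads 2025-11-27T16:39:05Z. A minimal Go sketch such as the following, run on the node itself, would confirm the certificate's validity window directly; the file name certcheck.go and the deliberately insecure handshake (inspection only, no trust decision) are illustrative assumptions, not part of any cluster tooling.

// certcheck.go - a minimal diagnostic sketch, assuming it runs on the node.
// It dials the webhook endpoint named in the error above and prints the
// serving certificate's validity window. InsecureSkipVerify is deliberate:
// the point is to inspect the certificate, not to trust it.
package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	// Endpoint taken from the log line: Post "https://127.0.0.1:9743/node?timeout=10s"
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		log.Fatalf("TLS handshake failed: %v", err)
	}
	defer conn.Close()

	now := time.Now()
	for _, cert := range conn.ConnectionState().PeerCertificates {
		// expired=true reproduces the "certificate has expired" verdict above.
		fmt.Printf("subject=%q notBefore=%s notAfter=%s expired=%v\n",
			cert.Subject.String(),
			cert.NotBefore.Format(time.RFC3339),
			cert.NotAfter.Format(time.RFC3339),
			now.After(cert.NotAfter))
	}
}

Against the certificate described in the error, this would print notAfter=2025-08-24T17:21:41Z and expired=true, matching the x509 failure the kubelet reports.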
event="NodeHasNoDiskPressure" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.399859 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.399890 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.399909 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:05Z","lastTransitionTime":"2025-11-27T16:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:05 crc kubenswrapper[4954]: E1127 16:39:05.422496 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"070a8e98-7cab-4ad3-b09c-67172438041d\\\",\\\"systemUUID\\\":\\\"03003ca2-7417-4e94-98d9-1cf03e475029\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:05Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.428800 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.428860 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.428885 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.428916 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.428942 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:05Z","lastTransitionTime":"2025-11-27T16:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:05 crc kubenswrapper[4954]: E1127 16:39:05.451248 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"070a8e98-7cab-4ad3-b09c-67172438041d\\\",\\\"systemUUID\\\":\\\"03003ca2-7417-4e94-98d9-1cf03e475029\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:05Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.458615 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.458671 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
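Separately from the webhook failure, every "Node became not ready" condition above names the same root cause: no CNI configuration file in /etc/kubernetes/cni/net.d/. The sketch below mirrors, in simplified form, the readiness rule implied by that message (at least one .conf/.conflist/.json file in the CNI conf directory); it is an assumption-laden stand-in, not the kubelet's actual plugin-manager code.

// cnicheck.go - a simplified sketch of the readiness test implied by the
// NetworkPluginNotReady message. The directory and the extension list are
// taken from the log message and common libcni convention respectively;
// the kubelet's real check is more involved.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	confDir := "/etc/kubernetes/cni/net.d" // directory named in the log message
	entries, err := os.ReadDir(confDir)
	if err != nil {
		fmt.Printf("cannot read %s: %v\n", confDir, err)
		return
	}
	found := false
	for _, e := range entries {
		if e.IsDir() {
			continue
		}
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json": // extensions a libcni-style loader accepts
			fmt.Println("CNI config present:", filepath.Join(confDir, e.Name()))
			found = true
		}
	}
	if !found {
		fmt.Println("no CNI configuration file found; the node would stay NotReady")
	}
}

Once the network provider starts and writes its configuration into that directory, a check like this passes and the Ready condition can flip, which is why the kubelet's message asks whether the network provider has started.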
event="NodeHasNoDiskPressure" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.458688 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.458714 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.458732 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:05Z","lastTransitionTime":"2025-11-27T16:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:05 crc kubenswrapper[4954]: E1127 16:39:05.479208 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"070a8e98-7cab-4ad3-b09c-67172438041d\\\",\\\"systemUUID\\\":\\\"03003ca2-7417-4e94-98d9-1cf03e475029\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:05Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:05 crc kubenswrapper[4954]: E1127 16:39:05.479553 4954 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.481989 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.482048 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.482066 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.482089 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.482109 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:05Z","lastTransitionTime":"2025-11-27T16:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.585178 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.585238 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.585261 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.585305 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.585323 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:05Z","lastTransitionTime":"2025-11-27T16:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.661532 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.661540 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.661660 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.661678 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:39:05 crc kubenswrapper[4954]: E1127 16:39:05.661866 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:39:05 crc kubenswrapper[4954]: E1127 16:39:05.662074 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6" Nov 27 16:39:05 crc kubenswrapper[4954]: E1127 16:39:05.662252 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:39:05 crc kubenswrapper[4954]: E1127 16:39:05.662370 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.689359 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.689424 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.689445 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.689475 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.689497 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:05Z","lastTransitionTime":"2025-11-27T16:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.792974 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.793038 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.793054 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.793089 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.793144 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:05Z","lastTransitionTime":"2025-11-27T16:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.896907 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.896985 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.897003 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.897032 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:05 crc kubenswrapper[4954]: I1127 16:39:05.897050 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:05Z","lastTransitionTime":"2025-11-27T16:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.000784 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.000853 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.000877 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.000907 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.000928 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:06Z","lastTransitionTime":"2025-11-27T16:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.104153 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.104228 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.104246 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.104277 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.104299 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:06Z","lastTransitionTime":"2025-11-27T16:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.207853 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.207927 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.207951 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.207988 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.208015 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:06Z","lastTransitionTime":"2025-11-27T16:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.310467 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.310541 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.310559 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.310615 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.310635 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:06Z","lastTransitionTime":"2025-11-27T16:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.413968 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.414039 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.414057 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.414084 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.414102 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:06Z","lastTransitionTime":"2025-11-27T16:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.517631 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.517698 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.517714 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.517742 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.517761 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:06Z","lastTransitionTime":"2025-11-27T16:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.620746 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.620807 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.620833 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.620864 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.620886 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:06Z","lastTransitionTime":"2025-11-27T16:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.723918 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.723958 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.723969 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.723984 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.723996 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:06Z","lastTransitionTime":"2025-11-27T16:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.827253 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.827311 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.827330 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.827358 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.827380 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:06Z","lastTransitionTime":"2025-11-27T16:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.930672 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.931081 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.931264 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.931430 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:06 crc kubenswrapper[4954]: I1127 16:39:06.931619 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:06Z","lastTransitionTime":"2025-11-27T16:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.035182 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.035242 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.035259 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.035287 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.035308 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:07Z","lastTransitionTime":"2025-11-27T16:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.139915 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.140700 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.140962 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.141203 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.141373 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:07Z","lastTransitionTime":"2025-11-27T16:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.245069 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.245144 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.245165 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.245198 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.245221 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:07Z","lastTransitionTime":"2025-11-27T16:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.348111 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.348167 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.348184 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.348208 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.348228 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:07Z","lastTransitionTime":"2025-11-27T16:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.457757 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.457832 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.457857 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.457892 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.457914 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:07Z","lastTransitionTime":"2025-11-27T16:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.561168 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.561249 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.561270 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.561304 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.561377 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:07Z","lastTransitionTime":"2025-11-27T16:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.662028 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.662130 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.661793 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.663042 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:39:07 crc kubenswrapper[4954]: E1127 16:39:07.663202 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:39:07 crc kubenswrapper[4954]: E1127 16:39:07.663367 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:39:07 crc kubenswrapper[4954]: E1127 16:39:07.663548 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:39:07 crc kubenswrapper[4954]: E1127 16:39:07.663713 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6" Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.664353 4954 scope.go:117] "RemoveContainer" containerID="a064652ba1f70f1ee05a75805f65a7847485fc0552afd53a9776ae05da2f5368" Nov 27 16:39:07 crc kubenswrapper[4954]: E1127 16:39:07.664939 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-d5zbp_openshift-ovn-kubernetes(c9c365fc-0cba-4fcf-b721-30de2b908a56)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.665414 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.665475 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.665497 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.665540 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.665568 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:07Z","lastTransitionTime":"2025-11-27T16:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.769463 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.769535 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.769553 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.769576 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.769627 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:07Z","lastTransitionTime":"2025-11-27T16:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.874222 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.874304 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.874324 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.874355 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.874376 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:07Z","lastTransitionTime":"2025-11-27T16:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.978242 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.978293 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.978310 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.978334 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:07 crc kubenswrapper[4954]: I1127 16:39:07.978352 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:07Z","lastTransitionTime":"2025-11-27T16:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.080951 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.081007 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.081023 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.081047 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.081065 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:08Z","lastTransitionTime":"2025-11-27T16:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.183898 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.183974 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.183991 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.184016 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.184033 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:08Z","lastTransitionTime":"2025-11-27T16:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.287080 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.287143 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.287164 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.287195 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.287218 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:08Z","lastTransitionTime":"2025-11-27T16:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.391402 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.391480 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.391502 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.391532 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.391553 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:08Z","lastTransitionTime":"2025-11-27T16:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.495506 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.495574 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.495628 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.495654 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.495671 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:08Z","lastTransitionTime":"2025-11-27T16:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.599331 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.599396 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.599414 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.599439 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.599459 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:08Z","lastTransitionTime":"2025-11-27T16:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.681185 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-27v67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df79f3c-9df0-48a0-980f-10ecadf5efd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80589bef6eb84e30399c60ede88844c7917afc5bc0a051e33ac307de7670ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn2f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-27v67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:08Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.702995 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.703112 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.703132 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.703195 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.703216 4954 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:08Z","lastTransitionTime":"2025-11-27T16:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.703374 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9mb96" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5aabb55ded9f58e618e465b5ef892a9098df73cc03b0d2de615dbcb754cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube
rnetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r96jj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9mb96\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:08Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.727643 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j2bxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"474d40a8-ea36-4785-8818-6beb58074208\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711fd0edfdc1fc0465c22fd73cdce98005c371cb4a4662314c051add365cc3fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rcvbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75f0d7911572bda6bd48f347e24cddeea563f23cf84a4abd69f961b576999119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27
T16:38:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rcvbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j2bxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:08Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.747710 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4942b2dc-bb0b-485a-84d6-eeaaaa834d91\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94982bc29f0ee44235509ce47bb0790994962a450b2e27e418f351a3643d885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28bc02faf2534dbf38fbc116fb6b51a528297719f7de0f40d1c9374199391eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7209ac0080d25aaf9cfaba43b4cb35e5c36f015b52469a211b65f4a53a2dcd23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11dd7fc77c9df494e9dac3fd605b1dc7a342fe3fe853a18260a68d29f82738e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11dd7fc77c9df494e9dac3fd605b1dc7a342fe3fe853a18260a68d29f82738e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:08Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.769677 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b7cd63-bb9a-4c77-b67a-e72adc26393a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:38:22Z\\\",\\\"message\\\":\\\"W1127 16:38:11.939802 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 16:38:11.940051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764261491 cert, and key in /tmp/serving-cert-2393175808/serving-signer.crt, /tmp/serving-cert-2393175808/serving-signer.key\\\\nI1127 16:38:12.073962 1 observer_polling.go:159] Starting file observer\\\\nW1127 16:38:12.077982 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 16:38:12.078373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:38:12.081926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2393175808/tls.crt::/tmp/serving-cert-2393175808/tls.key\\\\\\\"\\\\nF1127 16:38:22.478599 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:08Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.793031 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4633bf6a24c281dffedb23b6efec6dff41b512ca353a31a32c3988b523b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:08Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.806467 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.806563 4954 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.806755 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.806853 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.807084 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:08Z","lastTransitionTime":"2025-11-27T16:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.818689 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"536fc833-8add-426d-9ed0-b63547d316e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c35f962fb1464be093f6b3cc62d79b47d06468ed4c1885c42c1f3f49b911458b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc0
6ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secre
ts/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cz8gx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:08Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.841413 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed7ac545-28d1-4c54-9952-4b7845b4a475\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5f6e2fcbd93a30e7357a367e184a6f5c6c1af83f618e0fd0d724e51ba71ea08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dbb0d73cb9bddb6148625592ed1aac95ead1e2349f92fb8aba36ec714ed618e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a1ddaf55a730a8e5a53ecff0eef2afd9786d3f249ac18b7b3e3e6649b65fe45\\\",\\\"image\\\":\\\"q
uay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6a464ca56934b2a1b4e31b921d34c3f57d9aacbd965746db957882d36527e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:08Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.862822 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e11dee9902e47c6d0e972a3b8f86123252f000b875f7dff8af31db48e69503d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:08Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.884752 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bd6ec80896ba1c7117ea88193af1f3b9aec353ab889d6864e0b221e4efdf428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72cc2fd437541de22aaa3130acadd5bd1eacd2e45ef0e12d55ce1877ac1965bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:08Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.906918 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:08Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.910771 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.910838 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.910854 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.910884 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.910904 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:08Z","lastTransitionTime":"2025-11-27T16:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.929166 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:08Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.948695 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a80574-7c60-4f19-985b-3ee313cb7bcd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3bfedfcafb3316fee81a8d1a6d9e4d8c530b7bbb10193341d5021a5acbbfe4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abf93a27d369fc02df1a4508748705f9bbad044d52db659f35896e60e7a8bdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-699qq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:08Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:08 crc kubenswrapper[4954]: I1127 16:39:08.980976 4954 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9c365fc-0cba-4fcf-b721-30de2b908a56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a064652ba1f70f1ee05a75805f65a7847485fc0552afd53a9776ae05da2f5368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a064652ba1f70f1ee05a75805f65a7847485fc0552afd53a9776ae05da2f5368\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1127 16:38:55.699330 6572 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1127 16:38:55.699374 6572 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1127 16:38:55.699414 6572 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1127 16:38:55.699503 6572 factory.go:1336] Added *v1.Node event handler 7\\\\nI1127 16:38:55.699560 6572 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1127 16:38:55.699876 6572 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1127 16:38:55.699985 6572 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1127 16:38:55.700033 6572 ovnkube.go:599] Stopped ovnkube\\\\nI1127 16:38:55.700080 6572 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1127 16:38:55.700174 6572 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d5zbp_openshift-ovn-kubernetes(c9c365fc-0cba-4fcf-b721-30de2b908a56)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5zbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:08Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:08.999896 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgsvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af5183f4-5f46-4d64-8ec4-c7b71530cad6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s6vq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s6vq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgsvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:08Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.013788 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.013855 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.013871 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.013899 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.013918 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:09Z","lastTransitionTime":"2025-11-27T16:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.020180 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:09Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.036786 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lt9bl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f164460-f6b2-4383-9e5e-f4d0045d9690\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc3d4b30c41f8bbff3623b037109b7faca9e2438dfe7240a4fbf3c8fb8c27bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b56lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lt9bl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:09Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.116738 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.117166 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.117358 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.117536 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.117756 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:09Z","lastTransitionTime":"2025-11-27T16:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.221168 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.221232 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.221249 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.221274 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.221292 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:09Z","lastTransitionTime":"2025-11-27T16:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.325040 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.325110 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.325134 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.325161 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.325179 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:09Z","lastTransitionTime":"2025-11-27T16:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.428513 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.428562 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.428605 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.428636 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.428663 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:09Z","lastTransitionTime":"2025-11-27T16:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.531701 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.531799 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.531817 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.531843 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.531864 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:09Z","lastTransitionTime":"2025-11-27T16:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.635185 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.635256 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.635274 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.635299 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.635315 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:09Z","lastTransitionTime":"2025-11-27T16:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.662131 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.662157 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.662189 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.662145 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:39:09 crc kubenswrapper[4954]: E1127 16:39:09.662369 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6" Nov 27 16:39:09 crc kubenswrapper[4954]: E1127 16:39:09.662465 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:39:09 crc kubenswrapper[4954]: E1127 16:39:09.662632 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:39:09 crc kubenswrapper[4954]: E1127 16:39:09.662782 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.738301 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.738352 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.738370 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.738395 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.738413 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:09Z","lastTransitionTime":"2025-11-27T16:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.842277 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.842431 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.842453 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.842518 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.842540 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:09Z","lastTransitionTime":"2025-11-27T16:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.945620 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.945694 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.945712 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.945740 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:09 crc kubenswrapper[4954]: I1127 16:39:09.945760 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:09Z","lastTransitionTime":"2025-11-27T16:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.048070 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.048131 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.048148 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.048173 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.048195 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:10Z","lastTransitionTime":"2025-11-27T16:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.151430 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.151565 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.151621 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.151650 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.151669 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:10Z","lastTransitionTime":"2025-11-27T16:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.255012 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.255089 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.255114 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.255146 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.255172 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:10Z","lastTransitionTime":"2025-11-27T16:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.359195 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.359267 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.359284 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.359311 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.359331 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:10Z","lastTransitionTime":"2025-11-27T16:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.462525 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.462648 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.462678 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.462707 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.462729 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:10Z","lastTransitionTime":"2025-11-27T16:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.567138 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.568261 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.568481 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.568674 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.568826 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:10Z","lastTransitionTime":"2025-11-27T16:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.671630 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.671712 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.671730 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.671753 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.671771 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:10Z","lastTransitionTime":"2025-11-27T16:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.775805 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.775877 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.775898 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.775921 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.775939 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:10Z","lastTransitionTime":"2025-11-27T16:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.879875 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.879930 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.879945 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.879974 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.879989 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:10Z","lastTransitionTime":"2025-11-27T16:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.983241 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.983702 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.983902 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.984124 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:10 crc kubenswrapper[4954]: I1127 16:39:10.984373 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:10Z","lastTransitionTime":"2025-11-27T16:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.087900 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.087953 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.088001 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.088027 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.088043 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:11Z","lastTransitionTime":"2025-11-27T16:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.190864 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.190906 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.190917 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.190935 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.190947 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:11Z","lastTransitionTime":"2025-11-27T16:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.294439 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.294515 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.294534 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.294559 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.294606 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:11Z","lastTransitionTime":"2025-11-27T16:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.397398 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.397454 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.397470 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.397556 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.397610 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:11Z","lastTransitionTime":"2025-11-27T16:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.501036 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.501122 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.501140 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.501197 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.501216 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:11Z","lastTransitionTime":"2025-11-27T16:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.604897 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.605333 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.605464 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.605621 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.605767 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:11Z","lastTransitionTime":"2025-11-27T16:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.661454 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.661564 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:39:11 crc kubenswrapper[4954]: E1127 16:39:11.661694 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.661465 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:39:11 crc kubenswrapper[4954]: E1127 16:39:11.661879 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.661491 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:39:11 crc kubenswrapper[4954]: E1127 16:39:11.662078 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6" Nov 27 16:39:11 crc kubenswrapper[4954]: E1127 16:39:11.662164 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.708207 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.708499 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.708666 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.708782 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.708873 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:11Z","lastTransitionTime":"2025-11-27T16:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.812997 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.813362 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.813525 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.813717 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.814006 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:11Z","lastTransitionTime":"2025-11-27T16:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.917814 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.919621 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.919770 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.919913 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:11 crc kubenswrapper[4954]: I1127 16:39:11.920053 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:11Z","lastTransitionTime":"2025-11-27T16:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.023139 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.023529 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.023835 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.024050 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.024221 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:12Z","lastTransitionTime":"2025-11-27T16:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.128203 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.128550 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.128750 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.128888 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.128975 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:12Z","lastTransitionTime":"2025-11-27T16:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.231866 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.231922 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.231935 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.231959 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.231973 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:12Z","lastTransitionTime":"2025-11-27T16:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.334870 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.334940 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.334959 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.334989 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.335008 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:12Z","lastTransitionTime":"2025-11-27T16:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.437838 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.437906 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.437922 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.437945 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.437963 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:12Z","lastTransitionTime":"2025-11-27T16:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.542461 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.542544 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.542653 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.542716 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.542745 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:12Z","lastTransitionTime":"2025-11-27T16:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.646439 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.646531 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.646558 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.646630 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.646657 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:12Z","lastTransitionTime":"2025-11-27T16:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.750000 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.750069 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.750088 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.750114 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.750134 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:12Z","lastTransitionTime":"2025-11-27T16:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.853596 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.854038 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.854107 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.854187 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.854260 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:12Z","lastTransitionTime":"2025-11-27T16:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.957127 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.957172 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.957184 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.957203 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:12 crc kubenswrapper[4954]: I1127 16:39:12.957216 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:12Z","lastTransitionTime":"2025-11-27T16:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.060143 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.060203 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.060219 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.060239 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.060252 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:13Z","lastTransitionTime":"2025-11-27T16:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.163615 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.163659 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.163673 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.163692 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.163705 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:13Z","lastTransitionTime":"2025-11-27T16:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.267136 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.267229 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.267249 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.267280 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.267302 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:13Z","lastTransitionTime":"2025-11-27T16:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.370947 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.371038 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.371062 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.371094 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.371116 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:13Z","lastTransitionTime":"2025-11-27T16:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.473613 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.473658 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.473672 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.473700 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.473715 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:13Z","lastTransitionTime":"2025-11-27T16:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.576600 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.576641 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.576652 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.576673 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.576687 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:13Z","lastTransitionTime":"2025-11-27T16:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.661060 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.661090 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.661240 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.661401 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:39:13 crc kubenswrapper[4954]: E1127 16:39:13.661392 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:39:13 crc kubenswrapper[4954]: E1127 16:39:13.661536 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6" Nov 27 16:39:13 crc kubenswrapper[4954]: E1127 16:39:13.661751 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:39:13 crc kubenswrapper[4954]: E1127 16:39:13.661913 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.679131 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.679188 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.679202 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.679219 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.679230 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:13Z","lastTransitionTime":"2025-11-27T16:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.781917 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.781969 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.781980 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.782001 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.782012 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:13Z","lastTransitionTime":"2025-11-27T16:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.885364 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.885408 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.885419 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.885436 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.885447 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:13Z","lastTransitionTime":"2025-11-27T16:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.988213 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.988273 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.988286 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.988305 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:13 crc kubenswrapper[4954]: I1127 16:39:13.988317 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:13Z","lastTransitionTime":"2025-11-27T16:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:14 crc kubenswrapper[4954]: I1127 16:39:14.091144 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:14 crc kubenswrapper[4954]: I1127 16:39:14.091178 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:14 crc kubenswrapper[4954]: I1127 16:39:14.091189 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:14 crc kubenswrapper[4954]: I1127 16:39:14.091206 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:14 crc kubenswrapper[4954]: I1127 16:39:14.091216 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:14Z","lastTransitionTime":"2025-11-27T16:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:14 crc kubenswrapper[4954]: I1127 16:39:14.193559 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:14 crc kubenswrapper[4954]: I1127 16:39:14.193630 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:14 crc kubenswrapper[4954]: I1127 16:39:14.193642 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:14 crc kubenswrapper[4954]: I1127 16:39:14.193679 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:14 crc kubenswrapper[4954]: I1127 16:39:14.193696 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:14Z","lastTransitionTime":"2025-11-27T16:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:14 crc kubenswrapper[4954]: I1127 16:39:14.297337 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:14 crc kubenswrapper[4954]: I1127 16:39:14.297401 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:14 crc kubenswrapper[4954]: I1127 16:39:14.297416 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:14 crc kubenswrapper[4954]: I1127 16:39:14.297433 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:14 crc kubenswrapper[4954]: I1127 16:39:14.297446 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:14Z","lastTransitionTime":"2025-11-27T16:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:14 crc kubenswrapper[4954]: I1127 16:39:14.400327 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:14 crc kubenswrapper[4954]: I1127 16:39:14.400376 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:14 crc kubenswrapper[4954]: I1127 16:39:14.400389 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:14 crc kubenswrapper[4954]: I1127 16:39:14.400407 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:14 crc kubenswrapper[4954]: I1127 16:39:14.400419 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:14Z","lastTransitionTime":"2025-11-27T16:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:14 crc kubenswrapper[4954]: I1127 16:39:14.503319 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:14 crc kubenswrapper[4954]: I1127 16:39:14.503367 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:14 crc kubenswrapper[4954]: I1127 16:39:14.503379 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:14 crc kubenswrapper[4954]: I1127 16:39:14.503398 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:14 crc kubenswrapper[4954]: I1127 16:39:14.503409 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:14Z","lastTransitionTime":"2025-11-27T16:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:14 crc kubenswrapper[4954]: I1127 16:39:14.605875 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:14 crc kubenswrapper[4954]: I1127 16:39:14.605921 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:14 crc kubenswrapper[4954]: I1127 16:39:14.605931 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:14 crc kubenswrapper[4954]: I1127 16:39:14.605948 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:14 crc kubenswrapper[4954]: I1127 16:39:14.605958 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:14Z","lastTransitionTime":"2025-11-27T16:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:14 crc kubenswrapper[4954]: I1127 16:39:14.708772 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:14 crc kubenswrapper[4954]: I1127 16:39:14.708845 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:14 crc kubenswrapper[4954]: I1127 16:39:14.708864 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:14 crc kubenswrapper[4954]: I1127 16:39:14.708893 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:14 crc kubenswrapper[4954]: I1127 16:39:14.708921 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:14Z","lastTransitionTime":"2025-11-27T16:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:14 crc kubenswrapper[4954]: I1127 16:39:14.811889 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:14 crc kubenswrapper[4954]: I1127 16:39:14.811944 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:14 crc kubenswrapper[4954]: I1127 16:39:14.811953 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:14 crc kubenswrapper[4954]: I1127 16:39:14.811979 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:14 crc kubenswrapper[4954]: I1127 16:39:14.811993 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:14Z","lastTransitionTime":"2025-11-27T16:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:14 crc kubenswrapper[4954]: I1127 16:39:14.914636 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:14 crc kubenswrapper[4954]: I1127 16:39:14.914704 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:14 crc kubenswrapper[4954]: I1127 16:39:14.914722 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:14 crc kubenswrapper[4954]: I1127 16:39:14.914748 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:14 crc kubenswrapper[4954]: I1127 16:39:14.914766 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:14Z","lastTransitionTime":"2025-11-27T16:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.018418 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.018459 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.018471 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.018489 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.018501 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:15Z","lastTransitionTime":"2025-11-27T16:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.120838 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.120888 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.120897 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.120912 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.120922 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:15Z","lastTransitionTime":"2025-11-27T16:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.223470 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.223635 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.223659 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.223689 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.223708 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:15Z","lastTransitionTime":"2025-11-27T16:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.327985 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.328545 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.328735 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.328899 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.329036 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:15Z","lastTransitionTime":"2025-11-27T16:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.432782 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.432833 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.432845 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.432867 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.432881 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:15Z","lastTransitionTime":"2025-11-27T16:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.526307 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.526357 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.526365 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.526383 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.526396 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:15Z","lastTransitionTime":"2025-11-27T16:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:15 crc kubenswrapper[4954]: E1127 16:39:15.549337 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"070a8e98-7cab-4ad3-b09c-67172438041d\\\",\\\"systemUUID\\\":\\\"03003ca2-7417-4e94-98d9-1cf03e475029\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.555224 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.555268 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.555305 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.555325 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.555342 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:15Z","lastTransitionTime":"2025-11-27T16:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:15 crc kubenswrapper[4954]: E1127 16:39:15.573782 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"070a8e98-7cab-4ad3-b09c-67172438041d\\\",\\\"systemUUID\\\":\\\"03003ca2-7417-4e94-98d9-1cf03e475029\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.578840 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.578897 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.578906 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.578921 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.578946 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:15Z","lastTransitionTime":"2025-11-27T16:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:15 crc kubenswrapper[4954]: E1127 16:39:15.594282 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"070a8e98-7cab-4ad3-b09c-67172438041d\\\",\\\"systemUUID\\\":\\\"03003ca2-7417-4e94-98d9-1cf03e475029\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.599538 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.599603 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.599612 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.599628 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.599637 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:15Z","lastTransitionTime":"2025-11-27T16:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:15 crc kubenswrapper[4954]: E1127 16:39:15.620225 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"070a8e98-7cab-4ad3-b09c-67172438041d\\\",\\\"systemUUID\\\":\\\"03003ca2-7417-4e94-98d9-1cf03e475029\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.626108 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.626203 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.626223 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.626256 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.626281 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:15Z","lastTransitionTime":"2025-11-27T16:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:15 crc kubenswrapper[4954]: E1127 16:39:15.645360 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"070a8e98-7cab-4ad3-b09c-67172438041d\\\",\\\"systemUUID\\\":\\\"03003ca2-7417-4e94-98d9-1cf03e475029\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:15 crc kubenswrapper[4954]: E1127 16:39:15.645615 4954 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.648280 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
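Every status-update attempt above, up to the final "update node status exceeds retry count", fails at the same point: the serving certificate of the node.network-node-identity.openshift.io webhook expired on 2025-08-24T17:21:41Z, while the node's clock reads 2025-11-27T16:39:15Z, so the TLS handshake to https://127.0.0.1:9743 is rejected before any patch is applied. The "certificate has expired or is not yet valid" message is the validity-window check in Go's crypto/x509. A minimal sketch of the same comparison against a PEM file follows; the certificate path is a placeholder assumption, not taken from this log.

    // check_cert.go - illustrative sketch; the file path is an assumption.
    package main

    import (
        "crypto/x509"
        "encoding/pem"
        "fmt"
        "os"
        "time"
    )

    func main() {
        // Hypothetical location of the webhook's serving certificate.
        data, err := os.ReadFile("/path/to/webhook-serving-cert.pem")
        if err != nil {
            fmt.Fprintln(os.Stderr, err)
            os.Exit(1)
        }
        block, _ := pem.Decode(data)
        if block == nil {
            fmt.Fprintln(os.Stderr, "no PEM block found")
            os.Exit(1)
        }
        cert, err := x509.ParseCertificate(block.Bytes)
        if err != nil {
            fmt.Fprintln(os.Stderr, err)
            os.Exit(1)
        }
        now := time.Now().UTC()
        // Same comparison the TLS handshake makes: the current time must
        // fall inside [NotBefore, NotAfter].
        if now.Before(cert.NotBefore) || now.After(cert.NotAfter) {
            fmt.Printf("certificate invalid: current time %s is outside [%s, %s]\n",
                now.Format(time.RFC3339),
                cert.NotBefore.Format(time.RFC3339),
                cert.NotAfter.Format(time.RFC3339))
            return
        }
        fmt.Println("certificate is within its validity window")
    }

Run against the webhook's certificate, this would report exactly the relationship the log states: a current time of 2025-11-27T16:39:15Z falling after a NotAfter of 2025-08-24T17:21:41Z.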
event="NodeHasSufficientMemory" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.648324 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.648336 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.648356 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.648370 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:15Z","lastTransitionTime":"2025-11-27T16:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.661770 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.661816 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.661849 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.661785 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:39:15 crc kubenswrapper[4954]: E1127 16:39:15.661972 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:39:15 crc kubenswrapper[4954]: E1127 16:39:15.662133 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:39:15 crc kubenswrapper[4954]: E1127 16:39:15.662278 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6" Nov 27 16:39:15 crc kubenswrapper[4954]: E1127 16:39:15.662362 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.756396 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.756461 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.756474 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.756526 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.756541 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:15Z","lastTransitionTime":"2025-11-27T16:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.860396 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.860498 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.860528 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.860559 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.860610 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:15Z","lastTransitionTime":"2025-11-27T16:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.894074 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/af5183f4-5f46-4d64-8ec4-c7b71530cad6-metrics-certs\") pod \"network-metrics-daemon-hgsvh\" (UID: \"af5183f4-5f46-4d64-8ec4-c7b71530cad6\") " pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:39:15 crc kubenswrapper[4954]: E1127 16:39:15.894314 4954 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 27 16:39:15 crc kubenswrapper[4954]: E1127 16:39:15.894434 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af5183f4-5f46-4d64-8ec4-c7b71530cad6-metrics-certs podName:af5183f4-5f46-4d64-8ec4-c7b71530cad6 nodeName:}" failed. No retries permitted until 2025-11-27 16:39:47.894404696 +0000 UTC m=+99.911845026 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/af5183f4-5f46-4d64-8ec4-c7b71530cad6-metrics-certs") pod "network-metrics-daemon-hgsvh" (UID: "af5183f4-5f46-4d64-8ec4-c7b71530cad6") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.963481 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.963542 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.963563 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.963619 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:15 crc kubenswrapper[4954]: I1127 16:39:15.963640 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:15Z","lastTransitionTime":"2025-11-27T16:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:16 crc kubenswrapper[4954]: I1127 16:39:16.066816 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:16 crc kubenswrapper[4954]: I1127 16:39:16.067512 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:16 crc kubenswrapper[4954]: I1127 16:39:16.067533 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:16 crc kubenswrapper[4954]: I1127 16:39:16.067553 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:16 crc kubenswrapper[4954]: I1127 16:39:16.067565 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:16Z","lastTransitionTime":"2025-11-27T16:39:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:16 crc kubenswrapper[4954]: I1127 16:39:16.170323 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:16 crc kubenswrapper[4954]: I1127 16:39:16.170384 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:16 crc kubenswrapper[4954]: I1127 16:39:16.170397 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:16 crc kubenswrapper[4954]: I1127 16:39:16.170417 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:16 crc kubenswrapper[4954]: I1127 16:39:16.170430 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:16Z","lastTransitionTime":"2025-11-27T16:39:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:16 crc kubenswrapper[4954]: I1127 16:39:16.275315 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:16 crc kubenswrapper[4954]: I1127 16:39:16.275369 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:16 crc kubenswrapper[4954]: I1127 16:39:16.275379 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:16 crc kubenswrapper[4954]: I1127 16:39:16.275401 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:16 crc kubenswrapper[4954]: I1127 16:39:16.275415 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:16Z","lastTransitionTime":"2025-11-27T16:39:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:16 crc kubenswrapper[4954]: I1127 16:39:16.378854 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:16 crc kubenswrapper[4954]: I1127 16:39:16.378904 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:16 crc kubenswrapper[4954]: I1127 16:39:16.378913 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:16 crc kubenswrapper[4954]: I1127 16:39:16.378931 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:16 crc kubenswrapper[4954]: I1127 16:39:16.378946 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:16Z","lastTransitionTime":"2025-11-27T16:39:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:16 crc kubenswrapper[4954]: I1127 16:39:16.482152 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:16 crc kubenswrapper[4954]: I1127 16:39:16.482218 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:16 crc kubenswrapper[4954]: I1127 16:39:16.482237 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:16 crc kubenswrapper[4954]: I1127 16:39:16.482264 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:16 crc kubenswrapper[4954]: I1127 16:39:16.482284 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:16Z","lastTransitionTime":"2025-11-27T16:39:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:16 crc kubenswrapper[4954]: I1127 16:39:16.585935 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:16 crc kubenswrapper[4954]: I1127 16:39:16.586034 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:16 crc kubenswrapper[4954]: I1127 16:39:16.586060 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:16 crc kubenswrapper[4954]: I1127 16:39:16.586100 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:16 crc kubenswrapper[4954]: I1127 16:39:16.586134 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:16Z","lastTransitionTime":"2025-11-27T16:39:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:16 crc kubenswrapper[4954]: I1127 16:39:16.690076 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:16 crc kubenswrapper[4954]: I1127 16:39:16.690139 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:16 crc kubenswrapper[4954]: I1127 16:39:16.690154 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:16 crc kubenswrapper[4954]: I1127 16:39:16.690175 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:16 crc kubenswrapper[4954]: I1127 16:39:16.690187 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:16Z","lastTransitionTime":"2025-11-27T16:39:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:16 crc kubenswrapper[4954]: I1127 16:39:16.793818 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:16 crc kubenswrapper[4954]: I1127 16:39:16.793881 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:16 crc kubenswrapper[4954]: I1127 16:39:16.793901 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:16 crc kubenswrapper[4954]: I1127 16:39:16.793922 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:16 crc kubenswrapper[4954]: I1127 16:39:16.793934 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:16Z","lastTransitionTime":"2025-11-27T16:39:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:16 crc kubenswrapper[4954]: I1127 16:39:16.896860 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:16 crc kubenswrapper[4954]: I1127 16:39:16.896918 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:16 crc kubenswrapper[4954]: I1127 16:39:16.896928 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:16 crc kubenswrapper[4954]: I1127 16:39:16.896947 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:16 crc kubenswrapper[4954]: I1127 16:39:16.896959 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:16Z","lastTransitionTime":"2025-11-27T16:39:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.000007 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.000051 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.000060 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.000078 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.000090 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:17Z","lastTransitionTime":"2025-11-27T16:39:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.102522 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.102630 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.102652 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.102684 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.102704 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:17Z","lastTransitionTime":"2025-11-27T16:39:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.202574 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9mb96_c5bda3ef-ba2c-424a-ba4a-432053d1c40d/kube-multus/0.log" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.202706 4954 generic.go:334] "Generic (PLEG): container finished" podID="c5bda3ef-ba2c-424a-ba4a-432053d1c40d" containerID="3d5aabb55ded9f58e618e465b5ef892a9098df73cc03b0d2de615dbcb754cd4d" exitCode=1 Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.202771 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9mb96" event={"ID":"c5bda3ef-ba2c-424a-ba4a-432053d1c40d","Type":"ContainerDied","Data":"3d5aabb55ded9f58e618e465b5ef892a9098df73cc03b0d2de615dbcb754cd4d"} Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.203453 4954 scope.go:117] "RemoveContainer" containerID="3d5aabb55ded9f58e618e465b5ef892a9098df73cc03b0d2de615dbcb754cd4d" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.206923 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.206965 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.206978 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.207000 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.207013 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:17Z","lastTransitionTime":"2025-11-27T16:39:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.223209 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bd6ec80896ba1c7117ea88193af1f3b9aec353ab889d6864e0b221e4efdf428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72cc2fd437541de22aaa3130acadd5bd1eacd2e45ef0e12d55ce1877ac1965bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:17Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.244185 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:17Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.277768 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"536fc833-8add-426d-9ed0-b63547d316e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c35f962fb1464be093f6b3cc62d79b47d06468ed4c1885c42c1f3f49b911458b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cz8gx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:17Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.294852 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed7ac545-28d1-4c54-9952-4b7845b4a475\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5f6e2fcbd93a30e7357a367e184a6f5c6c1af83f618e0fd0d724e51ba71ea08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dbb0d73cb9bddb6148625592ed1aac95ead1e2349f92fb8aba36ec714ed618e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a1ddaf55a730a8e5a53ecff0eef2afd9786d3f249ac18b7b3e3e6649b65fe45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6a464ca56934b2a1b4e31b921d34c3f57d9aacbd965746db957882d36527e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:17Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.310570 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.310657 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.310696 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.310719 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.310733 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:17Z","lastTransitionTime":"2025-11-27T16:39:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.311311 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e11dee9902e47c6d0e972a3b8f86123252f000b875f7dff8af31db48e69503d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:17Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.338764 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9c365fc-0cba-4fcf-b721-30de2b908a56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a064652ba1f70f1ee05a75805f65a7847485fc0552afd53a9776ae05da2f5368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a064652ba1f70f1ee05a75805f65a7847485fc0552afd53a9776ae05da2f5368\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1127 16:38:55.699330 6572 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1127 16:38:55.699374 6572 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1127 16:38:55.699414 6572 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1127 16:38:55.699503 6572 factory.go:1336] Added *v1.Node event handler 7\\\\nI1127 16:38:55.699560 6572 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1127 16:38:55.699876 6572 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1127 16:38:55.699985 6572 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1127 16:38:55.700033 6572 ovnkube.go:599] Stopped ovnkube\\\\nI1127 16:38:55.700080 6572 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1127 16:38:55.700174 6572 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d5zbp_openshift-ovn-kubernetes(c9c365fc-0cba-4fcf-b721-30de2b908a56)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5zbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:17Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.354628 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgsvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af5183f4-5f46-4d64-8ec4-c7b71530cad6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s6vq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s6vq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgsvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:17Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.369130 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:17Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.383278 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a80574-7c60-4f19-985b-3ee313cb7bcd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3bfedfcafb3316fee81a8d1a6d9e4d8c530b7bbb10193341d5021a5acbbfe4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abf93a27d369fc02df1a4508748705f9bbad044d52db659f35896e60e7a8bdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-699qq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:17Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.398094 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lt9bl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f164460-f6b2-4383-9e5e-f4d0045d9690\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc3d4b30c41f8bbff3623b037109b7faca9e2438dfe7240a4fbf3c8fb8c27bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b56lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lt9bl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:17Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.413110 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.413173 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.413189 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.413213 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.413227 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:17Z","lastTransitionTime":"2025-11-27T16:39:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.418005 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:17Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.435653 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b7cd63-bb9a-4c77-b67a-e72adc26393a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:38:22Z\\\",\\\"message\\\":\\\"W1127 16:38:11.939802 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 16:38:11.940051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764261491 cert, and key in /tmp/serving-cert-2393175808/serving-signer.crt, /tmp/serving-cert-2393175808/serving-signer.key\\\\nI1127 16:38:12.073962 1 observer_polling.go:159] Starting file observer\\\\nW1127 16:38:12.077982 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 16:38:12.078373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:38:12.081926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2393175808/tls.crt::/tmp/serving-cert-2393175808/tls.key\\\\\\\"\\\\nF1127 16:38:22.478599 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:17Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.452941 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4633bf6a24c281dffedb23b6efec6dff41b512ca353a31a32c3988b523b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:17Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.467489 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-27v67" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df79f3c-9df0-48a0-980f-10ecadf5efd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80589bef6eb84e30399c60ede88844c7917afc5bc0a051e33ac307de7670ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn2f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-27v67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:17Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.489986 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9mb96" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5aabb55ded9f58e618e465b5ef892a9098df73cc03b0d2de615dbcb754cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5aabb55ded9f58e618e465b5ef892a9098df73cc03b0d2de615dbcb754cd4d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:39:17Z\\\",\\\"message\\\":\\\"2025-11-27T16:38:31+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_43baba0c-e068-4cf6-a5a0-98de61c3f550\\\\n2025-11-27T16:38:31+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_43baba0c-e068-4cf6-a5a0-98de61c3f550 to /host/opt/cni/bin/\\\\n2025-11-27T16:38:32Z [verbose] multus-daemon started\\\\n2025-11-27T16:38:32Z [verbose] Readiness Indicator file check\\\\n2025-11-27T16:39:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the 
condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r96jj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9mb96\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:17Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.506848 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j2bxm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"474d40a8-ea36-4785-8818-6beb58074208\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711fd0edfdc1fc0465c22fd73cdce98005c371cb4a4662314c051add365cc3fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rcvbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75f0d7911572bda6bd48f347e24cddeea563f23cf84a4abd69f961b576999119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rcvbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j2bxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:17Z is after 2025-08-24T17:21:41Z" Nov 27 
16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.516445 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.516505 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.516524 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.516551 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.516573 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:17Z","lastTransitionTime":"2025-11-27T16:39:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.524974 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4942b2dc-bb0b-485a-84d6-eeaaaa834d91\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94982bc29f0ee44235509ce47bb0790994962a450b2e27e418f351a3643d885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28bc02faf2534dbf38fbc116fb6b51a528297719f7de0f40d1c9374199391eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7209ac0080d25aaf9cfaba43b4cb35e5c36f015b52469a211b65f4a53a2dcd23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11dd7fc77c9df494e9dac3fd605b1dc7a342fe3fe853a18260a68d29f82738e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11dd7fc77c9df494e9dac3fd605b1dc7a342fe3fe853a18260a68d29f82738e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:17Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.619788 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.619834 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.619845 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.619864 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.619877 4954 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:17Z","lastTransitionTime":"2025-11-27T16:39:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.662215 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.662285 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:39:17 crc kubenswrapper[4954]: E1127 16:39:17.662486 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.662537 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.662501 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:39:17 crc kubenswrapper[4954]: E1127 16:39:17.662751 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:39:17 crc kubenswrapper[4954]: E1127 16:39:17.662931 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6" Nov 27 16:39:17 crc kubenswrapper[4954]: E1127 16:39:17.663044 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.723238 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.723287 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.723299 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.723320 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.723332 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:17Z","lastTransitionTime":"2025-11-27T16:39:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.825412 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.825472 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.825487 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.825513 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.825534 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:17Z","lastTransitionTime":"2025-11-27T16:39:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.928390 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.928438 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.928451 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.928469 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:17 crc kubenswrapper[4954]: I1127 16:39:17.928481 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:17Z","lastTransitionTime":"2025-11-27T16:39:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.031481 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.031535 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.031548 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.031567 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.031596 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:18Z","lastTransitionTime":"2025-11-27T16:39:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.134917 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.134964 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.134977 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.135000 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.135013 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:18Z","lastTransitionTime":"2025-11-27T16:39:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.208974 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9mb96_c5bda3ef-ba2c-424a-ba4a-432053d1c40d/kube-multus/0.log" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.209082 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9mb96" event={"ID":"c5bda3ef-ba2c-424a-ba4a-432053d1c40d","Type":"ContainerStarted","Data":"bcc3a6be3f2d6a2d8da09fab1320b33b7c36e0c403916e155274997bcb03c884"} Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.225562 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgsvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af5183f4-5f46-4d64-8ec4-c7b71530cad6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s6vq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s6vq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgsvh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:18Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.238050 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.238110 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.238124 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.238148 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.238166 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:18Z","lastTransitionTime":"2025-11-27T16:39:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.240236 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:18Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.255246 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a80574-7c60-4f19-985b-3ee313cb7bcd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3bfedfcafb3316fee81a8d1a6d9e4d8c530b7bbb10193341d5021a5acbbfe4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abf93a27d369fc02df1a4508748705f9bbad044d52db659f35896e60e7a8bdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-699qq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:18Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.286544 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9c365fc-0cba-4fcf-b721-30de2b908a56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a064652ba1f70f1ee05a75805f65a7847485fc05
52afd53a9776ae05da2f5368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a064652ba1f70f1ee05a75805f65a7847485fc0552afd53a9776ae05da2f5368\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1127 16:38:55.699330 6572 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1127 16:38:55.699374 6572 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1127 16:38:55.699414 6572 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1127 16:38:55.699503 6572 factory.go:1336] Added *v1.Node event handler 7\\\\nI1127 16:38:55.699560 6572 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1127 16:38:55.699876 6572 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1127 16:38:55.699985 6572 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1127 16:38:55.700033 6572 ovnkube.go:599] Stopped ovnkube\\\\nI1127 16:38:55.700080 6572 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1127 16:38:55.700174 6572 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d5zbp_openshift-ovn-kubernetes(c9c365fc-0cba-4fcf-b721-30de2b908a56)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5zbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:18Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.306982 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:18Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.327615 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lt9bl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f164460-f6b2-4383-9e5e-f4d0045d9690\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc3d4b30c41f8bbff3623b037109b7faca9e2438dfe7240a4fbf3c8fb8c27bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b56lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lt9bl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:18Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.341268 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.341529 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.341719 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.341866 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.342038 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:18Z","lastTransitionTime":"2025-11-27T16:39:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.342664 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4633bf6a24c281dffedb23b6efec6dff41b512ca353a31a32c3988b523b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:18Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.355877 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-27v67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df79f3c-9df0-48a0-980f-10ecadf5efd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80589bef6eb84e30399c60ede88844c7917afc5bc0a051e33ac307de7670ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn2f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-27v67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:18Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.372628 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9mb96" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcc3a6be3f2d6a2d8da09fab1320b33b7c36e0c403916e155274997bcb03c884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5aabb55ded9f58e618e465b5ef892a9098df73cc03b0d2de615dbcb754cd4d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:39:17Z\\\",\\\"message\\\":\\\"2025-11-27T16:38:31+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_43baba0c-e068-4cf6-a5a0-98de61c3f550\\\\n2025-11-27T16:38:31+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_43baba0c-e068-4cf6-a5a0-98de61c3f550 to /host/opt/cni/bin/\\\\n2025-11-27T16:38:32Z [verbose] multus-daemon started\\\\n2025-11-27T16:38:32Z [verbose] Readiness Indicator file check\\\\n2025-11-27T16:39:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r96jj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9mb96\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:18Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.387120 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j2bxm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"474d40a8-ea36-4785-8818-6beb58074208\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711fd0edfdc1fc0465c22fd73cdce98005c371cb4a4662314c051add365cc3fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rcvbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75f0d7911572bda6bd48f347e24cddeea563f23cf84a4abd69f961b576999119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rcvbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j2bxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:18Z is after 2025-08-24T17:21:41Z" Nov 27 
16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.401537 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4942b2dc-bb0b-485a-84d6-eeaaaa834d91\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94982bc29f0ee44235509ce47bb0790994962a450b2e27e418f351a3643d885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28bc02faf2534dbf38fbc116fb6b51a528297719f7de0f40d1c9374199391eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7209ac0080d25aaf9cfaba43b4cb35e5c36f015b52469a211b65f4a53a2dcd23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11dd7fc77c9df494e9dac3fd605b1dc7a342fe3fe853a18260a68d29f82738e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11dd7fc77c9df494e9dac3fd605b1dc7a342fe3fe853a18260a68d29f82738e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:18Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.415486 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b7cd63-bb9a-4c77-b67a-e72adc26393a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\\\",\\\"image\\\":\\\"quay.io/cr
cont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:38:22Z\\\",\\\"message\\\":\\\"W1127 16:38:11.939802 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 16:38:11.940051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764261491 cert, and key in /tmp/serving-cert-2393175808/serving-signer.crt, /tmp/serving-cert-2393175808/serving-signer.key\\\\nI1127 16:38:12.073962 1 observer_polling.go:159] Starting file observer\\\\nW1127 16:38:12.077982 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 16:38:12.078373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:38:12.081926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2393175808/tls.crt::/tmp/serving-cert-2393175808/tls.key\\\\\\\"\\\\nF1127 16:38:22.478599 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:18Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.431833 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:18Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.444459 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.444509 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.444520 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.444540 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.444552 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:18Z","lastTransitionTime":"2025-11-27T16:39:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.449653 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"536fc833-8add-426d-9ed0-b63547d316e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c35f962fb1464be093f6b3cc62d79b47d06468ed4c1885c42c1f3f49b911458b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cz8gx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:18Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.465472 4954 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed7ac545-28d1-4c54-9952-4b7845b4a475\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5f6e2fcbd93a30e7357a367e184a6f5c6c1af83f618e0fd0d724e51ba71ea08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dbb0d73cb9bddb6148625592ed1aac95ead1e2349f92fb8aba36ec714ed618e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a1ddaf55a730a8e5a53ecff0eef2afd9786d3f249ac18b7b3e3e6649b65fe45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6a464ca56934b2a1b4e31b921d34c3f57d9aacbd965746db
957882d36527e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:18Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.479120 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e11dee9902e47c6d0e972a3b8f86123252f000b875f7dff8af31db48e69503d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:18Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.498023 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bd6ec80896ba1c7117ea88193af1f3b9aec353ab889d6864e0b221e4efdf428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72cc2fd437541de22aaa3130acadd5bd1eacd2e45ef0e12d55ce1877ac1965bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:18Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.547732 
4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.547838 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.547857 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.547927 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.547948 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:18Z","lastTransitionTime":"2025-11-27T16:39:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.650472 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.650518 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.650528 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.650545 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.650555 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:18Z","lastTransitionTime":"2025-11-27T16:39:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.662763 4954 scope.go:117] "RemoveContainer" containerID="a064652ba1f70f1ee05a75805f65a7847485fc0552afd53a9776ae05da2f5368" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.687375 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"536fc833-8add-426d-9ed0-b63547d316e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c35f962fb1464be093f6b3cc62d79b47d06468ed4c1885c42c1f3f49b911458b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acce
ss-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"rea
son\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cz8gx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-11-27T16:39:18Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.707556 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed7ac545-28d1-4c54-9952-4b7845b4a475\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5f6e2fcbd93a30e7357a367e184a6f5c6c1af83f618e0fd0d724e51ba71ea08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dbb0d73cb9bddb6148625592ed1aac95ead1e2349f92fb8aba36ec714ed618e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a1ddaf55a730a8e5a53ecff0eef2afd9786d3f249ac18b7b3e3e6649b65fe45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6a464ca56934b2a1b4e31b921d34c3f57d9aacbd965746db957882d36527e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:18Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.728576 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e11dee9902e47c6d0e972a3b8f86123252f000b875f7dff8af31db48e69503d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:18Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.744192 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bd6ec80896ba1c7117ea88193af1f3b9aec353ab889d6864e0b221e4efdf428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72cc2fd437541de22aaa3130acadd5bd1eacd2e45ef0e12d55ce1877ac1965bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:18Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.752931 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.753066 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.753154 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.753226 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.753284 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:18Z","lastTransitionTime":"2025-11-27T16:39:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.768921 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:18Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.788812 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:18Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.806875 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a80574-7c60-4f19-985b-3ee313cb7bcd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3bfedfcafb3316fee81a8d1a6d9e4d8c530b7bbb10193341d5021a5acbbfe4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abf93a27d369fc02df1a4508748705f9bbad044d52db659f35896e60e7a8bdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-699qq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:18Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.842863 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9c365fc-0cba-4fcf-b721-30de2b908a56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a064652ba1f70f1ee05a75805f65a7847485fc05
52afd53a9776ae05da2f5368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a064652ba1f70f1ee05a75805f65a7847485fc0552afd53a9776ae05da2f5368\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1127 16:38:55.699330 6572 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1127 16:38:55.699374 6572 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1127 16:38:55.699414 6572 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1127 16:38:55.699503 6572 factory.go:1336] Added *v1.Node event handler 7\\\\nI1127 16:38:55.699560 6572 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1127 16:38:55.699876 6572 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1127 16:38:55.699985 6572 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1127 16:38:55.700033 6572 ovnkube.go:599] Stopped ovnkube\\\\nI1127 16:38:55.700080 6572 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1127 16:38:55.700174 6572 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d5zbp_openshift-ovn-kubernetes(c9c365fc-0cba-4fcf-b721-30de2b908a56)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5zbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:18Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.856066 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.856146 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.856159 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.856205 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.856240 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:18Z","lastTransitionTime":"2025-11-27T16:39:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.857038 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgsvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af5183f4-5f46-4d64-8ec4-c7b71530cad6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s6vq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s6vq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgsvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:18Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.879636 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:18Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.896865 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lt9bl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f164460-f6b2-4383-9e5e-f4d0045d9690\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc3d4b30c41f8bbff3623b037109b7faca9e2438dfe7240a4fbf3c8fb8c27bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b56lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lt9bl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:18Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.907833 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-27v67" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df79f3c-9df0-48a0-980f-10ecadf5efd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80589bef6eb84e30399c60ede88844c7917afc5bc0a051e33ac307de7670ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn2f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-27v67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:18Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.925375 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9mb96" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcc3a6be3f2d6a2d8da09fab1320b33b7c36e0c403916e155274997bcb03c884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5aabb55ded9f58e618e465b5ef892a9098df73cc03b0d2de615dbcb754cd4d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:39:17Z\\\",\\\"message\\\":\\\"2025-11-27T16:38:31+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_43baba0c-e068-4cf6-a5a0-98de61c3f550\\\\n2025-11-27T16:38:31+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_43baba0c-e068-4cf6-a5a0-98de61c3f550 to /host/opt/cni/bin/\\\\n2025-11-27T16:38:32Z [verbose] multus-daemon started\\\\n2025-11-27T16:38:32Z [verbose] Readiness Indicator file check\\\\n2025-11-27T16:39:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r96jj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9mb96\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:18Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.939130 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j2bxm" err="failed to patch status 
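
The multus restart recorded above ends with "still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf ... pollimmediate error: timed out waiting for the condition": the daemon polls for a readiness-indicator file written by the default network plugin and gives up after a timeout. Below is a stdlib-only Go sketch of that poll-with-deadline pattern; the interval and timeout values are illustrative assumptions, and multus's real implementation (which uses the Kubernetes wait helpers) may differ.

    // readiness_wait.go — a minimal sketch of waiting for a readiness
    // indicator file until it appears or a deadline elapses, mirroring the
    // "timed out waiting for the condition" failure in the log above.
    package main

    import (
        "errors"
        "fmt"
        "os"
        "time"
    )

    // waitForFile polls for path every interval until it exists or timeout
    // elapses. Interval/timeout here are assumptions, not multus's settings.
    func waitForFile(path string, interval, timeout time.Duration) error {
        deadline := time.Now().Add(timeout)
        for {
            if _, err := os.Stat(path); err == nil {
                return nil // file exists: the default network is ready
            }
            if time.Now().After(deadline) {
                return errors.New("timed out waiting for the condition")
            }
            time.Sleep(interval)
        }
    }

    func main() {
        // Path taken from the log entry above; values are illustrative.
        err := waitForFile("/host/run/multus/cni/net.d/10-ovn-kubernetes.conf",
            time.Second, 45*time.Second)
        fmt.Println("result:", err)
    }
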
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"474d40a8-ea36-4785-8818-6beb58074208\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711fd0edfdc1fc0465c22fd73cdce98005c371cb4a4662314c051add365cc3fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rcvbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75f0d7911572bda6bd48f347e24cddeea563f23cf84a4abd69f961b576999119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rcvbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j2bxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:18Z is after 2025-08-24T17:21:41Z" Nov 27 
16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.954248 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4942b2dc-bb0b-485a-84d6-eeaaaa834d91\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94982bc29f0ee44235509ce47bb0790994962a450b2e27e418f351a3643d885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28bc02faf2534dbf38fbc116fb6b51a528297719f7de0f40d1c9374199391eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7209ac0080d25aaf9cfaba43b4cb35e5c36f015b52469a211b65f4a53a2dcd23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11dd7fc77c9df494e9dac3fd605b1dc7a342fe3fe853a18260a68d29f82738e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11dd7fc77c9df494e9dac3fd605b1dc7a342fe3fe853a18260a68d29f82738e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:18Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.958137 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.958178 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.958188 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.958208 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.958221 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:18Z","lastTransitionTime":"2025-11-27T16:39:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.972149 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b7cd63-bb9a-4c77-b67a-e72adc26393a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:38:22Z\\\",\\\"message\\\":\\\"W1127 16:38:11.939802 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 16:38:11.940051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764261491 cert, and key in /tmp/serving-cert-2393175808/serving-signer.crt, /tmp/serving-cert-2393175808/serving-signer.key\\\\nI1127 16:38:12.073962 1 observer_polling.go:159] Starting file observer\\\\nW1127 16:38:12.077982 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 16:38:12.078373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:38:12.081926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2393175808/tls.crt::/tmp/serving-cert-2393175808/tls.key\\\\\\\"\\\\nF1127 16:38:22.478599 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:18Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:18 crc kubenswrapper[4954]: I1127 16:39:18.987864 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4633bf6a24c281dffedb23b6efec6dff41b512ca353a31a32c3988b523b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:18Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.061534 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.061612 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.061625 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.061645 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.061658 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:19Z","lastTransitionTime":"2025-11-27T16:39:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.164416 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.164466 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.164478 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.164497 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.164509 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:19Z","lastTransitionTime":"2025-11-27T16:39:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.214825 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5zbp_c9c365fc-0cba-4fcf-b721-30de2b908a56/ovnkube-controller/2.log" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.217897 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" event={"ID":"c9c365fc-0cba-4fcf-b721-30de2b908a56","Type":"ContainerStarted","Data":"81bb34f2dce67efd76368e55b902d1cded4cf016e3f638b9c5acaf3f00ca2b60"} Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.219096 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.236654 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:19Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.248947 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a80574-7c60-4f19-985b-3ee313cb7bcd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3bfedfcafb3316fee81a8d1a6d9e4d8c530b7bbb10193341d5021a5acbbfe4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abf93a27d369fc02df1a4508748705f9bbad044d52db659f35896e60e7a8bdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-699qq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:19Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.266729 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.266765 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.266776 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.266791 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.266804 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:19Z","lastTransitionTime":"2025-11-27T16:39:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.273399 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9c365fc-0cba-4fcf-b721-30de2b908a56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb34f2dce67efd76368e55b902d1cded4cf016e3f638b9c5acaf3f00ca2b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a064652ba1f70f1ee05a75805f65a7847485fc0552afd53a9776ae05da2f5368\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1127 16:38:55.699330 6572 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1127 16:38:55.699374 6572 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1127 16:38:55.699414 6572 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1127 16:38:55.699503 6572 factory.go:1336] Added *v1.Node event handler 7\\\\nI1127 16:38:55.699560 6572 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1127 16:38:55.699876 6572 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1127 16:38:55.699985 6572 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1127 16:38:55.700033 6572 ovnkube.go:599] Stopped ovnkube\\\\nI1127 16:38:55.700080 6572 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1127 16:38:55.700174 6572 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5zbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:19Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.287187 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgsvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af5183f4-5f46-4d64-8ec4-c7b71530cad6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s6vq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s6vq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgsvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:19Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.302256 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:19Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.314404 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lt9bl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f164460-f6b2-4383-9e5e-f4d0045d9690\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc3d4b30c41f8bbff3623b037109b7faca9e2438dfe7240a4fbf3c8fb8c27bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b56lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lt9bl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:19Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.329716 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4942b2dc-bb0b-485a-84d6-eeaaaa834d91\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94982bc29f0ee44235509ce47bb0790994962a450b2e27e418f351a3643d885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28bc02faf2534dbf38fbc116fb6b51a528297719f7de0f40d1c9374199391eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7209ac0080d25aaf9cfaba43b4cb35e5c36f015b52469a211b65f4a53a2dcd23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11dd7fc77c9df494e9dac3fd605b1dc7a342fe3fe853a18260a68d29f82738e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11dd7fc77c9df494e9dac3fd605b1dc7a342fe3fe853a18260a68d29f82738e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:19Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.347754 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b7cd63-bb9a-4c77-b67a-e72adc26393a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:38:22Z\\\",\\\"message\\\":\\\"W1127 16:38:11.939802 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 16:38:11.940051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764261491 cert, and key in /tmp/serving-cert-2393175808/serving-signer.crt, /tmp/serving-cert-2393175808/serving-signer.key\\\\nI1127 16:38:12.073962 1 observer_polling.go:159] Starting file observer\\\\nW1127 16:38:12.077982 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 16:38:12.078373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:38:12.081926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2393175808/tls.crt::/tmp/serving-cert-2393175808/tls.key\\\\\\\"\\\\nF1127 16:38:22.478599 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:19Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.367685 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4633bf6a24c281dffedb23b6efec6dff41b512ca353a31a32c3988b523b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:19Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.368554 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.368606 4954 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.368622 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.368642 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.368657 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:19Z","lastTransitionTime":"2025-11-27T16:39:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.389121 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-27v67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df79f3c-9df0-48a0-980f-10ecadf5efd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80589bef6eb84e30399c60ede88844c7917afc5bc0a051e33ac307de7670ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn2f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-27v67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:19Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:19 crc 
kubenswrapper[4954]: I1127 16:39:19.402283 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9mb96" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcc3a6be3f2d6a2d8da09fab1320b33b7c36e0c403916e155274997bcb03c884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5aabb55ded9f58e618e465b5ef892a9098df73cc03b0d2de615dbcb754cd4d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:39:17Z\\\",\\\"message\\\":\\\"2025-11-27T16:38:31+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_43baba0c-e068-4cf6-a5a0-98de61c3f550\\\\n2025-11-27T16:38:31+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_43baba0c-e068-4cf6-a5a0-98de61c3f550 to /host/opt/cni/bin/\\\\n2025-11-27T16:38:32Z [verbose] multus-daemon started\\\\n2025-11-27T16:38:32Z [verbose] Readiness Indicator file check\\\\n2025-11-27T16:39:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r96jj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9mb96\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:19Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.413032 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j2bxm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"474d40a8-ea36-4785-8818-6beb58074208\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711fd0edfdc1fc0465c22fd73cdce98005c371cb4a4662314c051add365cc3fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rcvbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75f0d7911572bda6bd48f347e24cddeea563f23cf84a4abd69f961b576999119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rcvbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j2bxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:19Z is after 2025-08-24T17:21:41Z" Nov 27 
16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.424409 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed7ac545-28d1-4c54-9952-4b7845b4a475\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5f6e2fcbd93a30e7357a367e184a6f5c6c1af83f618e0fd0d724e51ba71ea08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dbb0d73cb9bddb6148625592ed1aac95ead1e2349f92fb8aba36ec714ed618e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a1ddaf55a730a8e5a53ecff0eef2afd9786d3f249ac18b7b3e3e6649b65fe45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6a464ca56934b2a1b4e31b921d34c3f57d9aacbd965746db957882d36527e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:19Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.434680 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e11dee9902e47c6d0e972a3b8f86123252f000b875f7dff8af31db48e69503d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:19Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.448623 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bd6ec80896ba1c7117ea88193af1f3b9aec353ab889d6864e0b221e4efdf428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72cc2fd437541de22aaa3130acadd5bd1eacd2e45ef0e12d55ce1877ac1965bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-11-27T16:39:19Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.460661 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:19Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.470831 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.470874 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.470891 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.470914 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.470930 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:19Z","lastTransitionTime":"2025-11-27T16:39:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration 
file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.474088 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"536fc833-8add-426d-9ed0-b63547d316e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c35f962fb1464be093f6b3cc62d79b47d06468ed4c1885c42c1f3f49b911458b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d
9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"nam
e\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cz8gx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:19Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.573848 4954 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.573907 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.573916 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.573933 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.573944 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:19Z","lastTransitionTime":"2025-11-27T16:39:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.661210 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.661255 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.661265 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:39:19 crc kubenswrapper[4954]: E1127 16:39:19.661351 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:39:19 crc kubenswrapper[4954]: E1127 16:39:19.661476 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.661560 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:39:19 crc kubenswrapper[4954]: E1127 16:39:19.661636 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:39:19 crc kubenswrapper[4954]: E1127 16:39:19.661787 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.676266 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.676332 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.676345 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.676368 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.676379 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:19Z","lastTransitionTime":"2025-11-27T16:39:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.779104 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.779197 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.779215 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.779235 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.779249 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:19Z","lastTransitionTime":"2025-11-27T16:39:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.882001 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.882059 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.882069 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.882088 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.882099 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:19Z","lastTransitionTime":"2025-11-27T16:39:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.985054 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.985124 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.985142 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.985168 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:19 crc kubenswrapper[4954]: I1127 16:39:19.985187 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:19Z","lastTransitionTime":"2025-11-27T16:39:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.088902 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.088979 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.088999 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.089026 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.089045 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:20Z","lastTransitionTime":"2025-11-27T16:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.192247 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.192307 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.192328 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.192354 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.192372 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:20Z","lastTransitionTime":"2025-11-27T16:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.224879 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5zbp_c9c365fc-0cba-4fcf-b721-30de2b908a56/ovnkube-controller/3.log" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.226224 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5zbp_c9c365fc-0cba-4fcf-b721-30de2b908a56/ovnkube-controller/2.log" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.230756 4954 generic.go:334] "Generic (PLEG): container finished" podID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerID="81bb34f2dce67efd76368e55b902d1cded4cf016e3f638b9c5acaf3f00ca2b60" exitCode=1 Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.230826 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" event={"ID":"c9c365fc-0cba-4fcf-b721-30de2b908a56","Type":"ContainerDied","Data":"81bb34f2dce67efd76368e55b902d1cded4cf016e3f638b9c5acaf3f00ca2b60"} Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.230878 4954 scope.go:117] "RemoveContainer" containerID="a064652ba1f70f1ee05a75805f65a7847485fc0552afd53a9776ae05da2f5368" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.233850 4954 scope.go:117] "RemoveContainer" containerID="81bb34f2dce67efd76368e55b902d1cded4cf016e3f638b9c5acaf3f00ca2b60" Nov 27 16:39:20 crc kubenswrapper[4954]: E1127 16:39:20.234250 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-d5zbp_openshift-ovn-kubernetes(c9c365fc-0cba-4fcf-b721-30de2b908a56)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.254343 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:20Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.271044 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a80574-7c60-4f19-985b-3ee313cb7bcd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3bfedfcafb3316fee81a8d1a6d9e4d8c530b7bbb10193341d5021a5acbbfe4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abf93a27d369fc02df1a4508748705f9bbad044d52db659f35896e60e7a8bdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-699qq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:20Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.294413 4954 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9c365fc-0cba-4fcf-b721-30de2b908a56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb34f2dce67efd76368e55b902d1cded4cf016e3f638b9c5acaf3f00ca2b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a064652ba1f70f1ee05a75805f65a7847485fc0552afd53a9776ae05da2f5368\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:38:55Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1127 16:38:55.699330 6572 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1127 16:38:55.699374 6572 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1127 16:38:55.699414 6572 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1127 16:38:55.699503 6572 factory.go:1336] Added *v1.Node event handler 7\\\\nI1127 16:38:55.699560 6572 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1127 16:38:55.699876 6572 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1127 16:38:55.699985 6572 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1127 16:38:55.700033 6572 ovnkube.go:599] Stopped ovnkube\\\\nI1127 16:38:55.700080 6572 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1127 16:38:55.700174 6572 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81bb34f2dce67efd76368e55b902d1cded4cf016e3f638b9c5acaf3f00ca2b60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:39:19Z\\\",\\\"message\\\":\\\"861940b962e7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1127 16:39:19.634568 6893 services_controller.go:452] Built service openshift-machine-api/machine-api-operator-webhook per-node LB for network=default: []services.LB{}\\\\nI1127 16:39:19.634517 6893 default_network_controller.go:776] Recording success event on pod openshift-kube-scheduler/openshift-kube-scheduler-crc\\\\nI1127 16:39:19.634598 6893 services_controller.go:453] Built service openshift-machine-api/machine-api-operator-webhook template LB for network=default: []services.LB{}\\\\nI1127 16:39:19.634175 6893 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1127 16:39:19.634607 6893 services_controller.go:454] Service openshift-machine-api/machine-api-operator-webhook for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1127 16:39:19.634157 6893 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nF1127 16:39:19.634674 6893 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller 
initializa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47
ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5zbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:20Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.303541 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.303682 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.303720 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.303759 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.303786 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:20Z","lastTransitionTime":"2025-11-27T16:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.318215 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgsvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af5183f4-5f46-4d64-8ec4-c7b71530cad6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s6vq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s6vq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgsvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:20Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.337290 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:20Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.351933 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lt9bl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f164460-f6b2-4383-9e5e-f4d0045d9690\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc3d4b30c41f8bbff3623b037109b7faca9e2438dfe7240a4fbf3c8fb8c27bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b56lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lt9bl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:20Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.368785 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j2bxm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"474d40a8-ea36-4785-8818-6beb58074208\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711fd0edfdc1fc0465c22fd73cdce98005c371cb4a4662314c051add365cc3fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rcvbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75f0d7911572bda6bd48f347e24cddeea563f23cf84a4abd69f961b576999119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rcvbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j2bxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:20Z is after 2025-08-24T17:21:41Z" Nov 27 
16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.381357 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4942b2dc-bb0b-485a-84d6-eeaaaa834d91\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94982bc29f0ee44235509ce47bb0790994962a450b2e27e418f351a3643d885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28bc02faf2534dbf38fbc116fb6b51a528297719f7de0f40d1c9374199391eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7209ac0080d25aaf9cfaba43b4cb35e5c36f015b52469a211b65f4a53a2dcd23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11dd7fc77c9df494e9dac3fd605b1dc7a342fe3fe853a18260a68d29f82738e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11dd7fc77c9df494e9dac3fd605b1dc7a342fe3fe853a18260a68d29f82738e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:20Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.399366 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b7cd63-bb9a-4c77-b67a-e72adc26393a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\\\",\\\"image\\\":\\\"quay.io/cr
cont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:38:22Z\\\",\\\"message\\\":\\\"W1127 16:38:11.939802 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 16:38:11.940051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764261491 cert, and key in /tmp/serving-cert-2393175808/serving-signer.crt, /tmp/serving-cert-2393175808/serving-signer.key\\\\nI1127 16:38:12.073962 1 observer_polling.go:159] Starting file observer\\\\nW1127 16:38:12.077982 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 16:38:12.078373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:38:12.081926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2393175808/tls.crt::/tmp/serving-cert-2393175808/tls.key\\\\\\\"\\\\nF1127 16:38:22.478599 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:20Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.406400 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.406564 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.406661 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.406765 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.406834 4954 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:20Z","lastTransitionTime":"2025-11-27T16:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.416931 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4633bf6a24c281dffedb23b6efec6dff41b512ca353a31a32c3988b523b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:20Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.428731 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-27v67" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df79f3c-9df0-48a0-980f-10ecadf5efd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80589bef6eb84e30399c60ede88844c7917afc5bc0a051e33ac307de7670ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn2f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-27v67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:20Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.445524 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9mb96" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcc3a6be3f2d6a2d8da09fab1320b33b7c36e0c403916e155274997bcb03c884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5aabb55ded9f58e618e465b5ef892a9098df73cc03b0d2de615dbcb754cd4d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:39:17Z\\\",\\\"message\\\":\\\"2025-11-27T16:38:31+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_43baba0c-e068-4cf6-a5a0-98de61c3f550\\\\n2025-11-27T16:38:31+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_43baba0c-e068-4cf6-a5a0-98de61c3f550 to /host/opt/cni/bin/\\\\n2025-11-27T16:38:32Z [verbose] multus-daemon started\\\\n2025-11-27T16:38:32Z [verbose] Readiness Indicator file check\\\\n2025-11-27T16:39:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r96jj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9mb96\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:20Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.459007 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed7ac545-28d1-4c54-9952-4b7845b4a475\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5f6e2fcbd93a30e7357a367e184a6f5c6c1af83f618e0fd0d724e51ba71ea08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dbb0d73cb9bddb6148625592ed1aac95ead1e2349f92fb8aba36ec714ed618e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a1ddaf55a730a8e5a53ecff0eef2afd9786d3f249ac18b7b3e3e6649b65fe45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6a464ca56934b2a1b4e31b921d34c3f57d9aacbd965746db957882d36527e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:20Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.472143 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e11dee9902e47c6d0e972a3b8f86123252f000b875f7dff8af31db48e69503d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-11-27T16:39:20Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.486811 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bd6ec80896ba1c7117ea88193af1f3b9aec353ab889d6864e0b221e4efdf428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72cc2fd437541de22aaa3130acadd5bd1eacd2e45ef0e12d55ce1877ac1965bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:20Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.503449 4954 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:20Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.509472 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.509503 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.509514 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.509782 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.509809 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:20Z","lastTransitionTime":"2025-11-27T16:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.523569 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"536fc833-8add-426d-9ed0-b63547d316e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c35f962fb1464be093f6b3cc62d79b47d06468ed4c1885c42c1f3f49b911458b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cz8gx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:20Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.613301 4954 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.613374 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.613400 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.613431 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.613454 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:20Z","lastTransitionTime":"2025-11-27T16:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.716307 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.716384 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.716412 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.716446 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.716473 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:20Z","lastTransitionTime":"2025-11-27T16:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.819226 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.819274 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.819284 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.819305 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.819316 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:20Z","lastTransitionTime":"2025-11-27T16:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.921550 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.921669 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.921686 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.921718 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:20 crc kubenswrapper[4954]: I1127 16:39:20.921739 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:20Z","lastTransitionTime":"2025-11-27T16:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.025476 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.025546 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.025561 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.025616 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.025638 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:21Z","lastTransitionTime":"2025-11-27T16:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.129543 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.129644 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.129663 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.129688 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.129704 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:21Z","lastTransitionTime":"2025-11-27T16:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
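The repeated NodeNotReady condition above reduces to a single readiness test: kubelet keeps reporting NetworkReady=false until the container runtime's network plugin finds at least one CNI configuration file in /etc/kubernetes/cni/net.d/. The following Go sketch is illustrative only, not kubelet's actual implementation; the directory path comes from the log message, and the accepted extensions (.conf, .conflist, .json) follow common CNI convention:

    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    // hasCNIConfig reports whether dir contains at least one file with a
    // recognized CNI configuration extension.
    func hasCNIConfig(dir string) (bool, error) {
        entries, err := os.ReadDir(dir)
        if err != nil {
            return false, err
        }
        for _, e := range entries {
            if e.IsDir() {
                continue
            }
            switch filepath.Ext(e.Name()) {
            case ".conf", ".conflist", ".json":
                return true, nil
            }
        }
        return false, nil
    }

    func main() {
        // Directory taken from the log message above.
        dir := "/etc/kubernetes/cni/net.d/"
        ok, err := hasCNIConfig(dir)
        if err != nil {
            fmt.Fprintf(os.Stderr, "reading %s: %v\n", dir, err)
            os.Exit(1)
        }
        if !ok {
            // Mirrors the condition kubelet keeps reporting in the entries above.
            fmt.Printf("NetworkReady=false: no CNI configuration file in %s\n", dir)
            return
        }
        fmt.Println("NetworkReady=true: CNI configuration present")
    }

Run against this node, such a check would keep failing until OVN-Kubernetes writes its config, which is consistent with the multus entry later in the log that times out waiting for 10-ovn-kubernetes.conf.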
Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.232296 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.232392 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.232418 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.232456 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.232486 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:21Z","lastTransitionTime":"2025-11-27T16:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.237826 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5zbp_c9c365fc-0cba-4fcf-b721-30de2b908a56/ovnkube-controller/3.log"
Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.245552 4954 scope.go:117] "RemoveContainer" containerID="81bb34f2dce67efd76368e55b902d1cded4cf016e3f638b9c5acaf3f00ca2b60"
Nov 27 16:39:21 crc kubenswrapper[4954]: E1127 16:39:21.245875 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-d5zbp_openshift-ovn-kubernetes(c9c365fc-0cba-4fcf-b721-30de2b908a56)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56"
Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.265001 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.277143 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lt9bl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f164460-f6b2-4383-9e5e-f4d0045d9690\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc3d4b30c41f8bbff3623b037109b7faca9e2438dfe7240a4fbf3c8fb8c27bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b56lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lt9bl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.290823 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-27v67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df79f3c-9df0-48a0-980f-10ecadf5efd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80589bef6eb84e30399c60ede88844c7917afc5bc0a051e33ac307de7670ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn2f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-27v67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.310858 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9mb96" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcc3a6be3f2d6a2d8da09fab1320b33b7c36e0c403916e155274997bcb03c884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5aabb55ded9f58e618e465b5ef892a9098df73cc03b0d2de615dbcb754cd4d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:39:17Z\\\",\\\"message\\\":\\\"2025-11-27T16:38:31+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_43baba0c-e068-4cf6-a5a0-98de61c3f550\\\\n2025-11-27T16:38:31+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_43baba0c-e068-4cf6-a5a0-98de61c3f550 to /host/opt/cni/bin/\\\\n2025-11-27T16:38:32Z [verbose] multus-daemon started\\\\n2025-11-27T16:38:32Z [verbose] Readiness Indicator file check\\\\n2025-11-27T16:39:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r96jj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9mb96\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.324395 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j2bxm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"474d40a8-ea36-4785-8818-6beb58074208\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711fd0edfdc1fc0465c22fd73cdce98005c371cb4a4662314c051add365cc3fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rcvbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75f0d7911572bda6bd48f347e24cddeea563f23cf84a4abd69f961b576999119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rcvbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j2bxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:21Z is after 2025-08-24T17:21:41Z" Nov 27 
16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.336791 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.336862 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.336879 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.336911 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.336933 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:21Z","lastTransitionTime":"2025-11-27T16:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.341992 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4942b2dc-bb0b-485a-84d6-eeaaaa834d91\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94982bc29f0ee44235509ce47bb0790994962a450b2e27e418f351a3643d885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28bc02faf2534dbf38fbc116fb6b51a528297719f7de0f40d1c9374199391eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7209ac0080d25aaf9cfaba43b4cb35e5c36f015b52469a211b65f4a53a2dcd23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11dd7fc77c9df494e9dac3fd605b1dc7a342fe3fe853a18260a68d29f82738e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11dd7fc77c9df494e9dac3fd605b1dc7a342fe3fe853a18260a68d29f82738e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.361929 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b7cd63-bb9a-4c77-b67a-e72adc26393a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:38:22Z\\\",\\\"message\\\":\\\"W1127 16:38:11.939802 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 16:38:11.940051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764261491 cert, and key in /tmp/serving-cert-2393175808/serving-signer.crt, /tmp/serving-cert-2393175808/serving-signer.key\\\\nI1127 16:38:12.073962 1 observer_polling.go:159] Starting file observer\\\\nW1127 16:38:12.077982 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 16:38:12.078373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:38:12.081926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2393175808/tls.crt::/tmp/serving-cert-2393175808/tls.key\\\\\\\"\\\\nF1127 16:38:22.478599 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.379693 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4633bf6a24c281dffedb23b6efec6dff41b512ca353a31a32c3988b523b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.399781 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"536fc833-8add-426d-9ed0-b63547d316e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c35f962fb1464be093f6b3cc62d79b47d06468ed4c1885c42c1f3f49b911458b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cz8gx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.418266 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed7ac545-28d1-4c54-9952-4b7845b4a475\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5f6e2fcbd93a30e7357a367e184a6f5c6c1af83f618e0fd0d724e51ba71ea08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dbb0d73cb9bddb6148625592ed1aac95ead1e2349f92fb8aba36ec714ed618e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a1ddaf55a730a8e5a53ecff0eef2afd9786d3f249ac18b7b3e3e6649b65fe45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6a464ca56934b2a1b4e31b921d34c3f57d9aacbd965746db957882d36527e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:21Z is after 2025-08-24T17:21:41Z"
Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.433076 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e11dee9902e47c6d0e972a3b8f86123252f000b875f7dff8af31db48e69503d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:21Z is after 2025-08-24T17:21:41Z"
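Every failed status patch in these entries shares one root cause: the serving certificate of the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 expired on 2025-08-24T17:21:41Z, so each TLS handshake fails x509 validation. A minimal Go sketch of the same validity test (illustrative; the address is taken from the log, and InsecureSkipVerify is set only so the already-expired chain can still be retrieved for inspection rather than trusted):

    package main

    import (
        "crypto/tls"
        "fmt"
        "os"
        "time"
    )

    func main() {
        // Webhook endpoint from the log entries above.
        addr := "127.0.0.1:9743"

        // Skip verification so the handshake succeeds even with an
        // expired certificate; we check validity ourselves below.
        conn, err := tls.Dial("tcp", addr, &tls.Config{InsecureSkipVerify: true})
        if err != nil {
            fmt.Fprintf(os.Stderr, "dial %s: %v\n", addr, err)
            os.Exit(1)
        }
        defer conn.Close()

        cert := conn.ConnectionState().PeerCertificates[0]
        now := time.Now()
        switch {
        case now.After(cert.NotAfter):
            // The case reported above: current time is after NotAfter.
            fmt.Printf("expired: current time %s is after %s\n",
                now.UTC().Format(time.RFC3339), cert.NotAfter.UTC().Format(time.RFC3339))
        case now.Before(cert.NotBefore):
            fmt.Printf("not yet valid: certificate becomes valid at %s\n",
                cert.NotBefore.UTC().Format(time.RFC3339))
        default:
            fmt.Printf("valid until %s\n", cert.NotAfter.UTC().Format(time.RFC3339))
        }
    }

Pointed at the endpoint above, this would report the same validity window that the webhook client keeps logging.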
has expired or is not yet valid: current time 2025-11-27T16:39:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.439932 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.440008 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.440033 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.440064 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.440087 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:21Z","lastTransitionTime":"2025-11-27T16:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.452179 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bd6ec80896ba1c7117ea88193af1f3b9aec353ab889d6864e0b221e4efdf428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72cc2fd437541de22aaa3130acadd5bd1eacd2e45ef0e12d55ce1877ac1965bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name
\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.470606 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.490257 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.507501 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a80574-7c60-4f19-985b-3ee313cb7bcd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3bfedfcafb3316fee81a8d1a6d9e4d8c530b7bbb10193341d5021a5acbbfe4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abf93a27d369fc02df1a4508748705f9bbad044d52db659f35896e60e7a8bdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-699qq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.532695 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9c365fc-0cba-4fcf-b721-30de2b908a56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb34f2dce67efd76368e55b902d1cded4cf016
e3f638b9c5acaf3f00ca2b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81bb34f2dce67efd76368e55b902d1cded4cf016e3f638b9c5acaf3f00ca2b60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:39:19Z\\\",\\\"message\\\":\\\"861940b962e7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1127 16:39:19.634568 6893 services_controller.go:452] Built service openshift-machine-api/machine-api-operator-webhook per-node LB for network=default: []services.LB{}\\\\nI1127 16:39:19.634517 6893 default_network_controller.go:776] Recording success event on pod openshift-kube-scheduler/openshift-kube-scheduler-crc\\\\nI1127 16:39:19.634598 6893 services_controller.go:453] Built service openshift-machine-api/machine-api-operator-webhook template LB for network=default: []services.LB{}\\\\nI1127 16:39:19.634175 6893 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1127 16:39:19.634607 6893 services_controller.go:454] Service openshift-machine-api/machine-api-operator-webhook for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1127 16:39:19.634157 6893 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nF1127 16:39:19.634674 6893 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initializa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:39:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d5zbp_openshift-ovn-kubernetes(c9c365fc-0cba-4fcf-b721-30de2b908a56)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5zbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.543300 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.543350 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.543370 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.543397 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.543416 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:21Z","lastTransitionTime":"2025-11-27T16:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.548213 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgsvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af5183f4-5f46-4d64-8ec4-c7b71530cad6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s6vq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s6vq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgsvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.645591 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.645646 4954 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.645657 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.645677 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.645691 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:21Z","lastTransitionTime":"2025-11-27T16:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.661862 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:39:21 crc kubenswrapper[4954]: E1127 16:39:21.662039 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.662297 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:39:21 crc kubenswrapper[4954]: E1127 16:39:21.662398 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.662639 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:39:21 crc kubenswrapper[4954]: E1127 16:39:21.662738 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.662755 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:39:21 crc kubenswrapper[4954]: E1127 16:39:21.662909 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
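Note on the repeated "failed calling webhook" / "Failed to update status for pod" records above: every status patch is rejected because the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 is serving a certificate whose NotAfter (2025-08-24T17:21:41Z) is roughly three months before the node's current clock (2025-11-27), so Go's TLS stack fails the handshake with its standard validity-window error. A minimal sketch of the same comparison, assuming the webhook's serving certificate is a tls.crt under the /etc/webhook-cert/ mount shown in the network-node-identity pod status (the file name is an assumption, not taken from the log):

    // certcheck.go: report whether a PEM certificate is inside its validity
    // window, i.e. the comparison behind "x509: certificate has expired or
    // is not yet valid: current time ... is after ...".
    package main

    import (
        "crypto/x509"
        "encoding/pem"
        "fmt"
        "log"
        "os"
        "time"
    )

    func main() {
        // Assumed path; the webhook container mounts "webhook-cert" at
        // /etc/webhook-cert/ per the pod status logged above.
        pemBytes, err := os.ReadFile("/etc/webhook-cert/tls.crt")
        if err != nil {
            log.Fatal(err)
        }
        block, _ := pem.Decode(pemBytes)
        if block == nil || block.Type != "CERTIFICATE" {
            log.Fatal("no CERTIFICATE block in file")
        }
        cert, err := x509.ParseCertificate(block.Bytes)
        if err != nil {
            log.Fatal(err)
        }
        now := time.Now().UTC()
        fmt.Printf("NotBefore=%s NotAfter=%s now=%s\n", cert.NotBefore, cert.NotAfter, now)
        if now.Before(cert.NotBefore) || now.After(cert.NotAfter) {
            fmt.Println("certificate is outside its validity window")
            os.Exit(1)
        }
        fmt.Println("certificate is valid for the current time")
    }

Until the cluster's internal certificates are re-issued (on CRC this normally happens on its own once the cluster has been running for a while after a long suspend), every webhook-gated API write will keep failing with this same error.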
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.748557 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.748622 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.748632 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.748648 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.748658 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:21Z","lastTransitionTime":"2025-11-27T16:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.851734 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.851846 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.851900 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.851930 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.851948 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:21Z","lastTransitionTime":"2025-11-27T16:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.954959 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.955043 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.955064 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.955093 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:21 crc kubenswrapper[4954]: I1127 16:39:21.955115 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:21Z","lastTransitionTime":"2025-11-27T16:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.058324 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.058399 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.058426 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.058459 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.058483 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:22Z","lastTransitionTime":"2025-11-27T16:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.162177 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.162238 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.162260 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.162288 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.162308 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:22Z","lastTransitionTime":"2025-11-27T16:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.266424 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.266481 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.266498 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.266523 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.266539 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:22Z","lastTransitionTime":"2025-11-27T16:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.369885 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.369959 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.369976 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.370005 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.370024 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:22Z","lastTransitionTime":"2025-11-27T16:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.472667 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.472707 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.472716 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.472732 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.472742 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:22Z","lastTransitionTime":"2025-11-27T16:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.575226 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.575292 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.575315 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.575344 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.575369 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:22Z","lastTransitionTime":"2025-11-27T16:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.678194 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.678242 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.678258 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.678279 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.678300 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:22Z","lastTransitionTime":"2025-11-27T16:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.782254 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.782320 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.782336 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.782362 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.782381 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:22Z","lastTransitionTime":"2025-11-27T16:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.885559 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.885666 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.885689 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.885788 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.885819 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:22Z","lastTransitionTime":"2025-11-27T16:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.988993 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.989072 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.989096 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.989127 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:22 crc kubenswrapper[4954]: I1127 16:39:22.989149 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:22Z","lastTransitionTime":"2025-11-27T16:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:23 crc kubenswrapper[4954]: I1127 16:39:23.092281 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:23 crc kubenswrapper[4954]: I1127 16:39:23.092345 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:23 crc kubenswrapper[4954]: I1127 16:39:23.092363 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:23 crc kubenswrapper[4954]: I1127 16:39:23.092385 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:23 crc kubenswrapper[4954]: I1127 16:39:23.092399 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:23Z","lastTransitionTime":"2025-11-27T16:39:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:23 crc kubenswrapper[4954]: I1127 16:39:23.196485 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:23 crc kubenswrapper[4954]: I1127 16:39:23.196574 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:23 crc kubenswrapper[4954]: I1127 16:39:23.196620 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:23 crc kubenswrapper[4954]: I1127 16:39:23.196652 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:23 crc kubenswrapper[4954]: I1127 16:39:23.196673 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:23Z","lastTransitionTime":"2025-11-27T16:39:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:23 crc kubenswrapper[4954]: I1127 16:39:23.300562 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:23 crc kubenswrapper[4954]: I1127 16:39:23.300698 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:23 crc kubenswrapper[4954]: I1127 16:39:23.300720 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:23 crc kubenswrapper[4954]: I1127 16:39:23.300749 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:23 crc kubenswrapper[4954]: I1127 16:39:23.300774 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:23Z","lastTransitionTime":"2025-11-27T16:39:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:23 crc kubenswrapper[4954]: I1127 16:39:23.404844 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:23 crc kubenswrapper[4954]: I1127 16:39:23.404915 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:23 crc kubenswrapper[4954]: I1127 16:39:23.404936 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:23 crc kubenswrapper[4954]: I1127 16:39:23.404966 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:23 crc kubenswrapper[4954]: I1127 16:39:23.404984 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:23Z","lastTransitionTime":"2025-11-27T16:39:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Nov 27 16:39:23 crc kubenswrapper[4954]: I1127 16:39:23.661913 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:39:23 crc kubenswrapper[4954]: I1127 16:39:23.661999 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:39:23 crc kubenswrapper[4954]: I1127 16:39:23.662011 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:39:23 crc kubenswrapper[4954]: E1127 16:39:23.662157 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6" Nov 27 16:39:23 crc kubenswrapper[4954]: I1127 16:39:23.662241 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:39:23 crc kubenswrapper[4954]: E1127 16:39:23.662331 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:39:23 crc kubenswrapper[4954]: E1127 16:39:23.662467 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:39:23 crc kubenswrapper[4954]: E1127 16:39:23.662550 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:39:23 crc kubenswrapper[4954]: I1127 16:39:23.715569 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:23 crc kubenswrapper[4954]: I1127 16:39:23.715684 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:23 crc kubenswrapper[4954]: I1127 16:39:23.715703 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:23 crc kubenswrapper[4954]: I1127 16:39:23.715733 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:23 crc kubenswrapper[4954]: I1127 16:39:23.715754 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:23Z","lastTransitionTime":"2025-11-27T16:39:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Nov 27 16:39:25 crc kubenswrapper[4954]: I1127 16:39:25.661567 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:39:25 crc kubenswrapper[4954]: I1127 16:39:25.661700 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:39:25 crc kubenswrapper[4954]: I1127 16:39:25.661953 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:39:25 crc kubenswrapper[4954]: I1127 16:39:25.662014 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:39:25 crc kubenswrapper[4954]: E1127 16:39:25.662174 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:39:25 crc kubenswrapper[4954]: E1127 16:39:25.662293 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:39:25 crc kubenswrapper[4954]: E1127 16:39:25.662392 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:39:25 crc kubenswrapper[4954]: E1127 16:39:25.662508 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6" Nov 27 16:39:25 crc kubenswrapper[4954]: I1127 16:39:25.677433 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Nov 27 16:39:25 crc kubenswrapper[4954]: I1127 16:39:25.682888 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:25 crc kubenswrapper[4954]: I1127 16:39:25.683142 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:25 crc kubenswrapper[4954]: I1127 16:39:25.683287 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:25 crc kubenswrapper[4954]: I1127 16:39:25.683466 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:25 crc kubenswrapper[4954]: I1127 16:39:25.683663 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:25Z","lastTransitionTime":"2025-11-27T16:39:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:25 crc kubenswrapper[4954]: I1127 16:39:25.788084 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:25 crc kubenswrapper[4954]: I1127 16:39:25.788163 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:25 crc kubenswrapper[4954]: I1127 16:39:25.788177 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:25 crc kubenswrapper[4954]: I1127 16:39:25.788200 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:25 crc kubenswrapper[4954]: I1127 16:39:25.788215 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:25Z","lastTransitionTime":"2025-11-27T16:39:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Nov 27 16:39:25 crc kubenswrapper[4954]: E1127 16:39:25.986261 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"070a8e98-7cab-4ad3-b09c-67172438041d\\\",\\\"systemUUID\\\":\\\"03003ca2-7417-4e94-98d9-1cf03e475029\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:25Z is after 2025-08-24T17:21:41Z"
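The status patch is rejected before it reaches the API: the node.network-node-identity.openshift.io webhook on 127.0.0.1:9743 presents a serving certificate that expired on 2025-08-24, months before the current clock of 2025-11-27. This retry error recurs with the same payload about once per second; only the first occurrence is kept here. A short illustrative Go sketch that fetches a listener's certificate and compares its validity window against the current time (verification is skipped deliberately so the expired chain can still be read; the address is the one from the error above):

    package main

    // Illustrative check: dial the webhook endpoint named in the error above
    // and print the serving certificate's validity window against the current
    // time. InsecureSkipVerify is set on purpose so the handshake succeeds
    // even though the certificate is expired, letting us inspect it.

    import (
        "crypto/tls"
        "fmt"
        "time"
    )

    func main() {
        addr := "127.0.0.1:9743" // endpoint from the webhook error above
        conn, err := tls.Dial("tcp", addr, &tls.Config{InsecureSkipVerify: true})
        if err != nil {
            fmt.Printf("dial %s: %v\n", addr, err)
            return
        }
        defer conn.Close()

        cert := conn.ConnectionState().PeerCertificates[0]
        now := time.Now().UTC()
        fmt.Printf("subject:   %s\n", cert.Subject)
        fmt.Printf("notBefore: %s\n", cert.NotBefore.UTC().Format(time.RFC3339))
        fmt.Printf("notAfter:  %s\n", cert.NotAfter.UTC().Format(time.RFC3339))
        if now.After(cert.NotAfter) {
            fmt.Printf("certificate expired: current time %s is after %s\n",
                now.Format(time.RFC3339), cert.NotAfter.UTC().Format(time.RFC3339))
        }
    }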
event="NodeHasNoDiskPressure" Nov 27 16:39:25 crc kubenswrapper[4954]: I1127 16:39:25.992543 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:25 crc kubenswrapper[4954]: I1127 16:39:25.992608 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:25 crc kubenswrapper[4954]: I1127 16:39:25.992629 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:25Z","lastTransitionTime":"2025-11-27T16:39:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:26 crc kubenswrapper[4954]: E1127 16:39:26.013101 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"070a8e98-7cab-4ad3-b09c-67172438041d\\\",\\\"systemUUID\\\":\\\"03003ca2-7417-4e94-98d9-1cf03e475029\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:26Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.019105 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.019201 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.019225 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.019259 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.019281 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:26Z","lastTransitionTime":"2025-11-27T16:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:26 crc kubenswrapper[4954]: E1127 16:39:26.040388 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"070a8e98-7cab-4ad3-b09c-67172438041d\\\",\\\"systemUUID\\\":\\\"03003ca2-7417-4e94-98d9-1cf03e475029\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:26Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.046395 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.046455 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
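Every retry above fails the same way: the status patch is rejected because the kubelet cannot call the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743, whose serving certificate expired on 2025-08-24T17:21:41Z while the node clock reads 2025-11-27T16:39:26Z. As a diagnostic sketch (not part of the log; the address is taken from the Post URL above, and InsecureSkipVerify is used only to inspect the certificate, never to trust it), a small Go program run on the node could print the certificate's validity window:

package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	// Skip verification so the handshake succeeds even with an expired
	// certificate; we only want to read its validity window.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		log.Fatalf("dial webhook endpoint: %v", err)
	}
	defer conn.Close()

	// The leaf certificate is first in the presented chain.
	cert := conn.ConnectionState().PeerCertificates[0]
	now := time.Now().UTC()
	fmt.Printf("subject:    %s\n", cert.Subject)
	fmt.Printf("not before: %s\n", cert.NotBefore.Format(time.RFC3339))
	fmt.Printf("not after:  %s\n", cert.NotAfter.Format(time.RFC3339))
	// Mirrors the kubelet's x509 error whenever now is past NotAfter.
	fmt.Printf("expired:    %v (now %s)\n", now.After(cert.NotAfter), now.Format(time.RFC3339))
}

Against the state logged above, this would be expected to report not after 2025-08-24T17:21:41Z and expired: true.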
event="NodeHasNoDiskPressure" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.046468 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.046492 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.046513 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:26Z","lastTransitionTime":"2025-11-27T16:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:26 crc kubenswrapper[4954]: E1127 16:39:26.067044 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"070a8e98-7cab-4ad3-b09c-67172438041d\\\",\\\"systemUUID\\\":\\\"03003ca2-7417-4e94-98d9-1cf03e475029\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:26Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.073771 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.073844 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.073859 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.073884 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.073897 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:26Z","lastTransitionTime":"2025-11-27T16:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:26 crc kubenswrapper[4954]: E1127 16:39:26.094756 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"070a8e98-7cab-4ad3-b09c-67172438041d\\\",\\\"systemUUID\\\":\\\"03003ca2-7417-4e94-98d9-1cf03e475029\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:26Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:26 crc kubenswrapper[4954]: E1127 16:39:26.094977 4954 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.104434 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.104518 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.104540 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.104573 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.104636 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:26Z","lastTransitionTime":"2025-11-27T16:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.207746 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.207813 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.207832 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.207925 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.207948 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:26Z","lastTransitionTime":"2025-11-27T16:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.311526 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.311653 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.311681 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.311718 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.311743 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:26Z","lastTransitionTime":"2025-11-27T16:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.416443 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.416517 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.416537 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.416567 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.416622 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:26Z","lastTransitionTime":"2025-11-27T16:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.519664 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.519752 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.519776 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.519806 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.519827 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:26Z","lastTransitionTime":"2025-11-27T16:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.623633 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.623732 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.623756 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.623785 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.623804 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:26Z","lastTransitionTime":"2025-11-27T16:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.727711 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.727783 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.727820 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.727855 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.727882 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:26Z","lastTransitionTime":"2025-11-27T16:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.831423 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.831503 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.831522 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.831553 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.831575 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:26Z","lastTransitionTime":"2025-11-27T16:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.935622 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.935684 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.935700 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.935724 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:26 crc kubenswrapper[4954]: I1127 16:39:26.935742 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:26Z","lastTransitionTime":"2025-11-27T16:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.038012 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.038049 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.038057 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.038073 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.038082 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:27Z","lastTransitionTime":"2025-11-27T16:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.141318 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.141388 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.141409 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.141436 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.141457 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:27Z","lastTransitionTime":"2025-11-27T16:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.245461 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.245499 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.245509 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.245530 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.245542 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:27Z","lastTransitionTime":"2025-11-27T16:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.349185 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.349293 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.349315 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.349382 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.349413 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:27Z","lastTransitionTime":"2025-11-27T16:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.452449 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.452534 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.452561 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.452633 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.452655 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:27Z","lastTransitionTime":"2025-11-27T16:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.556291 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.556386 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.556411 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.556444 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.556468 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:27Z","lastTransitionTime":"2025-11-27T16:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.659919 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.659996 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.660017 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.660048 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.660068 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:27Z","lastTransitionTime":"2025-11-27T16:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.661537 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.661571 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.661576 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.661537 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:39:27 crc kubenswrapper[4954]: E1127 16:39:27.661774 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:39:27 crc kubenswrapper[4954]: E1127 16:39:27.661883 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:39:27 crc kubenswrapper[4954]: E1127 16:39:27.662063 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:39:27 crc kubenswrapper[4954]: E1127 16:39:27.662237 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6" Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.763387 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.763462 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.763481 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.763513 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.763536 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:27Z","lastTransitionTime":"2025-11-27T16:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.866822 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.866896 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.866920 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.866953 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.866975 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:27Z","lastTransitionTime":"2025-11-27T16:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.970513 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.970635 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.970664 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.970698 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:27 crc kubenswrapper[4954]: I1127 16:39:27.970729 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:27Z","lastTransitionTime":"2025-11-27T16:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.074671 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.074748 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.074767 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.074796 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.074816 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:28Z","lastTransitionTime":"2025-11-27T16:39:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.178139 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.178218 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.178235 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.178263 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.178285 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:28Z","lastTransitionTime":"2025-11-27T16:39:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.280863 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.281014 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.281039 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.281069 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.281090 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:28Z","lastTransitionTime":"2025-11-27T16:39:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.384468 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.384551 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.384564 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.384618 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.384637 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:28Z","lastTransitionTime":"2025-11-27T16:39:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.490421 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.490515 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.490536 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.490647 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.490674 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:28Z","lastTransitionTime":"2025-11-27T16:39:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.593843 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.593945 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.593972 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.594002 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.594026 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:28Z","lastTransitionTime":"2025-11-27T16:39:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.689878 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b7cd63-bb9a-4c77-b67a-e72adc26393a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753f
c478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:38:22Z\\\",\\\"message\\\":\\\"W1127 16:38:11.939802 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 16:38:11.940051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764261491 cert, and key in /tmp/serving-cert-2393175808/serving-signer.crt, /tmp/serving-cert-2393175808/serving-signer.key\\\\nI1127 16:38:12.073962 1 observer_polling.go:159] Starting file observer\\\\nW1127 16:38:12.077982 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 16:38:12.078373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:38:12.081926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2393175808/tls.crt::/tmp/serving-cert-2393175808/tls.key\\\\\\\"\\\\nF1127 16:38:22.478599 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:28Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.697915 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.697981 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.698000 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.698028 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.698052 4954 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:28Z","lastTransitionTime":"2025-11-27T16:39:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.717617 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4633bf6a24c281dffedb23b6efec6dff41b512ca353a31a32c3988b523b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:28Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.738386 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-27v67" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df79f3c-9df0-48a0-980f-10ecadf5efd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80589bef6eb84e30399c60ede88844c7917afc5bc0a051e33ac307de7670ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn2f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-27v67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:28Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.761946 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9mb96" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcc3a6be3f2d6a2d8da09fab1320b33b7c36e0c403916e155274997bcb03c884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5aabb55ded9f58e618e465b5ef892a9098df73cc03b0d2de615dbcb754cd4d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:39:17Z\\\",\\\"message\\\":\\\"2025-11-27T16:38:31+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_43baba0c-e068-4cf6-a5a0-98de61c3f550\\\\n2025-11-27T16:38:31+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_43baba0c-e068-4cf6-a5a0-98de61c3f550 to /host/opt/cni/bin/\\\\n2025-11-27T16:38:32Z [verbose] multus-daemon started\\\\n2025-11-27T16:38:32Z [verbose] Readiness Indicator file check\\\\n2025-11-27T16:39:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r96jj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9mb96\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:28Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.783181 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j2bxm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"474d40a8-ea36-4785-8818-6beb58074208\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711fd0edfdc1fc0465c22fd73cdce98005c371cb4a4662314c051add365cc3fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rcvbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75f0d7911572bda6bd48f347e24cddeea563f23cf84a4abd69f961b576999119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rcvbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j2bxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:28Z is after 2025-08-24T17:21:41Z" Nov 27 
16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.800629 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.800665 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.800673 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.800689 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.800701 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:28Z","lastTransitionTime":"2025-11-27T16:39:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.803493 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4942b2dc-bb0b-485a-84d6-eeaaaa834d91\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94982bc29f0ee44235509ce47bb0790994962a450b2e27e418f351a3643d885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28bc02faf2534dbf38fbc116fb6b51a528297719f7de0f40d1c9374199391eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7209ac0080d25aaf9cfaba43b4cb35e5c36f015b52469a211b65f4a53a2dcd23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11dd7fc77c9df494e9dac3fd605b1dc7a342fe3fe853a18260a68d29f82738e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11dd7fc77c9df494e9dac3fd605b1dc7a342fe3fe853a18260a68d29f82738e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:28Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.826769 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e11dee9902e47c6d0e972a3b8f86123252f000b875f7dff8af31db48e69503d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:28Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.846467 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bd6ec80896ba1c7117ea88193af1f3b9aec353ab889d6864e0b221e4efdf428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72cc2fd437541de22aaa3130acadd5bd1eacd2e45ef0e12d55ce1877ac1965bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:28Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.869764 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:28Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.891897 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"536fc833-8add-426d-9ed0-b63547d316e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c35f962fb1464be093f6b3cc62d79b47d06468ed4c1885c42c1f3f49b911458b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cz8gx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:28Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.903942 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.904019 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:28 crc 
kubenswrapper[4954]: I1127 16:39:28.904038 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.904070 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.904090 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:28Z","lastTransitionTime":"2025-11-27T16:39:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.913387 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed7ac545-28d1-4c54-9952-4b7845b4a475\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5f6e2fcbd93a30e7357a367e184a6f5c6c1af83f618e0fd0d724e51ba71ea08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dbb0d73cb9bddb6148625592ed1aac95ead1e2349f92fb8aba36ec714ed618e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"con
tainerID\\\":\\\"cri-o://7a1ddaf55a730a8e5a53ecff0eef2afd9786d3f249ac18b7b3e3e6649b65fe45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6a464ca56934b2a1b4e31b921d34c3f57d9aacbd965746db957882d36527e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:28Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.931015 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a80574-7c60-4f19-985b-3ee313cb7bcd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3bfedfcafb3316fee81a8d1a6d9e4d8c530b7bbb10193341d5021a5acbbfe4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abf93a27d369fc02df1a4508748705f9bbad044d52db659f35896e60e7a8bdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-699qq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:28Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.959925 4954 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9c365fc-0cba-4fcf-b721-30de2b908a56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb34f2dce67efd76368e55b902d1cded4cf016e3f638b9c5acaf3f00ca2b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81bb34f2dce67efd76368e55b902d1cded4cf016e3f638b9c5acaf3f00ca2b60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:39:19Z\\\",\\\"message\\\":\\\"861940b962e7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1127 16:39:19.634568 6893 services_controller.go:452] Built service openshift-machine-api/machine-api-operator-webhook per-node LB for network=default: []services.LB{}\\\\nI1127 16:39:19.634517 6893 default_network_controller.go:776] Recording success event on pod openshift-kube-scheduler/openshift-kube-scheduler-crc\\\\nI1127 16:39:19.634598 6893 services_controller.go:453] Built service openshift-machine-api/machine-api-operator-webhook template LB for network=default: []services.LB{}\\\\nI1127 16:39:19.634175 6893 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1127 16:39:19.634607 6893 services_controller.go:454] Service openshift-machine-api/machine-api-operator-webhook for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1127 16:39:19.634157 6893 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nF1127 16:39:19.634674 6893 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initializa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:39:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d5zbp_openshift-ovn-kubernetes(c9c365fc-0cba-4fcf-b721-30de2b908a56)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5zbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:28Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.975104 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgsvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af5183f4-5f46-4d64-8ec4-c7b71530cad6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s6vq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s6vq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgsvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:28Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:28 crc kubenswrapper[4954]: I1127 16:39:28.999132 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:28Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.006702 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.006765 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.006785 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.006813 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.006830 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:29Z","lastTransitionTime":"2025-11-27T16:39:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.018856 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:29Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.038917 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lt9bl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f164460-f6b2-4383-9e5e-f4d0045d9690\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc3d4b30c41f8bbff3623b037109b7faca9e2438dfe7240a4fbf3c8fb8c27bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b56lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lt9bl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:29Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.057447 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a06b3afb-c8f3-4fc2-aa82-f5b20f275a0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://589ee698e003ae1938fae963deb0288be15549fc6efd55fb72e0d40ee3ca325d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2186713e39ca754bb90eb1f84bc523cef94288510d11244c45267085d2f9918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2186713e39ca754bb90eb1f84bc523cef94288510d11244c45267085d2f9918\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:29Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.110368 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.110555 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.110752 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.110786 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.110860 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:29Z","lastTransitionTime":"2025-11-27T16:39:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.214709 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.214784 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.214803 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.214834 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.214864 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:29Z","lastTransitionTime":"2025-11-27T16:39:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.318544 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.318660 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.318680 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.318715 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.318736 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:29Z","lastTransitionTime":"2025-11-27T16:39:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.422598 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.422654 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.422667 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.422686 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.422698 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:29Z","lastTransitionTime":"2025-11-27T16:39:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.526174 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.526255 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.526273 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.526309 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.526401 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:29Z","lastTransitionTime":"2025-11-27T16:39:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.629820 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.629908 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.629927 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.629954 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.629973 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:29Z","lastTransitionTime":"2025-11-27T16:39:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.661126 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:39:29 crc kubenswrapper[4954]: E1127 16:39:29.661361 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.661793 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:39:29 crc kubenswrapper[4954]: E1127 16:39:29.661952 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.662231 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:39:29 crc kubenswrapper[4954]: E1127 16:39:29.662371 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.662680 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:39:29 crc kubenswrapper[4954]: E1127 16:39:29.662825 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.733534 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.733677 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.733708 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.733737 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.733758 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:29Z","lastTransitionTime":"2025-11-27T16:39:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.837008 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.837063 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.837120 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.837145 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.837163 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:29Z","lastTransitionTime":"2025-11-27T16:39:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.941259 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.941373 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.941403 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.941453 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:29 crc kubenswrapper[4954]: I1127 16:39:29.941485 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:29Z","lastTransitionTime":"2025-11-27T16:39:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.049786 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.049842 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.049862 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.049887 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.049903 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:30Z","lastTransitionTime":"2025-11-27T16:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.153733 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.153821 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.153839 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.153865 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.153882 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:30Z","lastTransitionTime":"2025-11-27T16:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.257979 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.258043 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.258060 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.258085 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.258104 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:30Z","lastTransitionTime":"2025-11-27T16:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.363103 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.363179 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.363198 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.363229 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.363248 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:30Z","lastTransitionTime":"2025-11-27T16:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.466291 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.466363 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.466385 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.466411 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.466435 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:30Z","lastTransitionTime":"2025-11-27T16:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.569748 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.569803 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.569818 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.569841 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.569859 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:30Z","lastTransitionTime":"2025-11-27T16:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.672550 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.672667 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.672696 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.672727 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.672751 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:30Z","lastTransitionTime":"2025-11-27T16:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.777129 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.777196 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.777214 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.777238 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.777255 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:30Z","lastTransitionTime":"2025-11-27T16:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.880824 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.880894 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.880917 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.880947 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.880969 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:30Z","lastTransitionTime":"2025-11-27T16:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.984431 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.984493 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.984507 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.984532 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:30 crc kubenswrapper[4954]: I1127 16:39:30.984548 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:30Z","lastTransitionTime":"2025-11-27T16:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:31 crc kubenswrapper[4954]: I1127 16:39:31.088200 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:31 crc kubenswrapper[4954]: I1127 16:39:31.088252 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:31 crc kubenswrapper[4954]: I1127 16:39:31.088269 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:31 crc kubenswrapper[4954]: I1127 16:39:31.088296 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:31 crc kubenswrapper[4954]: I1127 16:39:31.088314 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:31Z","lastTransitionTime":"2025-11-27T16:39:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:31 crc kubenswrapper[4954]: I1127 16:39:31.192296 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:31 crc kubenswrapper[4954]: I1127 16:39:31.192405 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:31 crc kubenswrapper[4954]: I1127 16:39:31.192431 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:31 crc kubenswrapper[4954]: I1127 16:39:31.192457 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:31 crc kubenswrapper[4954]: I1127 16:39:31.192474 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:31Z","lastTransitionTime":"2025-11-27T16:39:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:31 crc kubenswrapper[4954]: I1127 16:39:31.295161 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:31 crc kubenswrapper[4954]: I1127 16:39:31.295451 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:31 crc kubenswrapper[4954]: I1127 16:39:31.295459 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:31 crc kubenswrapper[4954]: I1127 16:39:31.295474 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:31 crc kubenswrapper[4954]: I1127 16:39:31.295488 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:31Z","lastTransitionTime":"2025-11-27T16:39:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:31 crc kubenswrapper[4954]: I1127 16:39:31.398483 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:31 crc kubenswrapper[4954]: I1127 16:39:31.398612 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:31 crc kubenswrapper[4954]: I1127 16:39:31.398654 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:31 crc kubenswrapper[4954]: I1127 16:39:31.398690 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:31 crc kubenswrapper[4954]: I1127 16:39:31.398717 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:31Z","lastTransitionTime":"2025-11-27T16:39:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Nov 27 16:39:31 crc kubenswrapper[4954]: I1127 16:39:31.661901 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh"
Nov 27 16:39:31 crc kubenswrapper[4954]: I1127 16:39:31.661995 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 27 16:39:31 crc kubenswrapper[4954]: I1127 16:39:31.662023 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 27 16:39:31 crc kubenswrapper[4954]: E1127 16:39:31.662195 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6"
Nov 27 16:39:31 crc kubenswrapper[4954]: I1127 16:39:31.662230 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 27 16:39:31 crc kubenswrapper[4954]: E1127 16:39:31.662396 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 27 16:39:31 crc kubenswrapper[4954]: E1127 16:39:31.662474 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 27 16:39:31 crc kubenswrapper[4954]: E1127 16:39:31.662668 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 27 16:39:31 crc kubenswrapper[4954]: I1127 16:39:31.981043 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 27 16:39:31 crc kubenswrapper[4954]: E1127 16:39:31.981290 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:35.981240403 +0000 UTC m=+147.998680733 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 27 16:39:31 crc kubenswrapper[4954]: I1127 16:39:31.981378 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 27 16:39:31 crc kubenswrapper[4954]: E1127 16:39:31.981720 4954 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Nov 27 16:39:31 crc kubenswrapper[4954]: E1127 16:39:31.981860 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 16:40:35.981822327 +0000 UTC m=+147.999262677 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
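The TearDown failure above is a registration problem rather than a mount problem: kubelet only talks to CSI drivers that have announced themselves through its plugin-registration mechanism, and kubevirt.io.hostpath-provisioner has not (re)registered since this kubelet started. A sketch of what that lookup reduces to, under stated assumptions (/var/lib/kubelet/plugins_registry is the conventional registration-socket directory and is an assumption here, not taken from this log):

    // Sketch: list the plugin-registration sockets kubelet watches and check
    // for the hostpath provisioner; if its socket is absent, TearDownAt fails
    // with "driver name ... not found" exactly as logged above.
    package main

    import (
    	"fmt"
    	"os"
    	"strings"
    )

    func main() {
    	const regDir = "/var/lib/kubelet/plugins_registry" // assumed default path
    	entries, err := os.ReadDir(regDir)
    	if err != nil {
    		fmt.Println("cannot read", regDir, ":", err)
    		return
    	}
    	found := false
    	for _, e := range entries {
    		fmt.Println("registered plugin socket:", e.Name())
    		if strings.Contains(e.Name(), "kubevirt.io.hostpath-provisioner") {
    			found = true
    		}
    	}
    	if !found {
    		fmt.Println("kubevirt.io.hostpath-provisioner not in the list of registered CSI drivers")
    	}
    }

The unmount is retried once the driver pod comes back up and re-creates its registration socket.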
Nov 27 16:39:32 crc kubenswrapper[4954]: I1127 16:39:32.083267 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 27 16:39:32 crc kubenswrapper[4954]: I1127 16:39:32.083354 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 27 16:39:32 crc kubenswrapper[4954]: I1127 16:39:32.083391 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 27 16:39:32 crc kubenswrapper[4954]: E1127 16:39:32.083644 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Nov 27 16:39:32 crc kubenswrapper[4954]: E1127 16:39:32.083675 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Nov 27 16:39:32 crc kubenswrapper[4954]: E1127 16:39:32.083690 4954 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 27 16:39:32 crc kubenswrapper[4954]: E1127 16:39:32.083728 4954 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Nov 27 16:39:32 crc kubenswrapper[4954]: E1127 16:39:32.083787 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Nov 27 16:39:32 crc kubenswrapper[4954]: E1127 16:39:32.083833 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Nov 27 16:39:32 crc kubenswrapper[4954]: E1127 16:39:32.083856 4954 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 27 16:39:32 crc kubenswrapper[4954]: E1127 16:39:32.083764 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-27 16:40:36.08374497 +0000 UTC m=+148.101185270 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 27 16:39:32 crc kubenswrapper[4954]: E1127 16:39:32.083969 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 16:40:36.083918104 +0000 UTC m=+148.101358564 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Nov 27 16:39:32 crc kubenswrapper[4954]: E1127 16:39:32.084040 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-27 16:40:36.084010946 +0000 UTC m=+148.101451506 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
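All three SetUp failures above share one cause: the ConfigMaps and Secret are "not registered" in kubelet's per-pod object cache, which is expected while the pod workers are still blocked on network readiness; the mounts clear once the pods can sync again. The durationBeforeRetry of 1m4s is consistent with an exponential back-off that starts at 500ms and doubles on each failure (500ms x 2^7 = 64s); the base and factor here are assumptions matched to the observed value, not read from kubelet source:

    // Sketch of the doubling retry schedule implied by "durationBeforeRetry 1m4s".
    package main

    import (
    	"fmt"
    	"time"
    )

    func main() {
    	delay := 500 * time.Millisecond // assumed initial delay
    	for failure := 1; failure <= 8; failure++ {
    		fmt.Printf("after failure %d: retry in %v\n", failure, delay)
    		delay *= 2 // doubles on each failed attempt
    	}
    	// after failure 8 the printed delay is 1m4s, matching the log
    }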
Nov 27 16:39:32 crc kubenswrapper[4954]: I1127 16:39:32.663930 4954 scope.go:117] "RemoveContainer" containerID="81bb34f2dce67efd76368e55b902d1cded4cf016e3f638b9c5acaf3f00ca2b60"
Nov 27 16:39:32 crc kubenswrapper[4954]: E1127 16:39:32.664338 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-d5zbp_openshift-ovn-kubernetes(c9c365fc-0cba-4fcf-b721-30de2b908a56)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56"
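This is the root of the whole stall: ovnkube-controller, the container that would write the missing CNI config, is itself crash-looping, and kubelet is throttling its restarts. With the standard kubelet defaults (an initial 10s back-off that doubles up to a 5m cap, assumed here rather than read from this node's config), "back-off 40s" corresponds to the third interval in the sequence, i.e. the container has already failed several times:

    // Sketch of kubelet's crash-loop restart back-off under the assumed
    // stock defaults: the wait doubles 10s, 20s, 40s, 1m20s, 2m40s, then
    // stays capped at 5m. "back-off 40s" is the third interval.
    package main

    import (
    	"fmt"
    	"time"
    )

    func main() {
    	backoff := 10 * time.Second
    	const maxBackoff = 5 * time.Minute
    	for restart := 1; restart <= 6; restart++ {
    		fmt.Printf("restart %d: back-off %v\n", restart, backoff)
    		backoff *= 2
    		if backoff > maxBackoff {
    			backoff = maxBackoff
    		}
    	}
    }

Recovering the node therefore hinges on ovnkube-controller staying up, after which the CNI config appears and the NotReady condition clears.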
Nov 27 16:39:33 crc kubenswrapper[4954]: I1127 16:39:33.661700 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 27 16:39:33 crc kubenswrapper[4954]: I1127 16:39:33.661743 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 27 16:39:33 crc kubenswrapper[4954]: I1127 16:39:33.661725 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh"
Nov 27 16:39:33 crc kubenswrapper[4954]: I1127 16:39:33.661716 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 27 16:39:33 crc kubenswrapper[4954]: E1127 16:39:33.661881 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 27 16:39:33 crc kubenswrapper[4954]: E1127 16:39:33.661972 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 27 16:39:33 crc kubenswrapper[4954]: E1127 16:39:33.662069 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 27 16:39:33 crc kubenswrapper[4954]: E1127 16:39:33.662175 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6"
Has your network provider started?"} Nov 27 16:39:33 crc kubenswrapper[4954]: I1127 16:39:33.880823 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:33 crc kubenswrapper[4954]: I1127 16:39:33.880897 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:33 crc kubenswrapper[4954]: I1127 16:39:33.880915 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:33 crc kubenswrapper[4954]: I1127 16:39:33.880944 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:33 crc kubenswrapper[4954]: I1127 16:39:33.880965 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:33Z","lastTransitionTime":"2025-11-27T16:39:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:33 crc kubenswrapper[4954]: I1127 16:39:33.984878 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:33 crc kubenswrapper[4954]: I1127 16:39:33.984976 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:33 crc kubenswrapper[4954]: I1127 16:39:33.984996 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:33 crc kubenswrapper[4954]: I1127 16:39:33.985027 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:33 crc kubenswrapper[4954]: I1127 16:39:33.985047 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:33Z","lastTransitionTime":"2025-11-27T16:39:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:34 crc kubenswrapper[4954]: I1127 16:39:34.089186 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:34 crc kubenswrapper[4954]: I1127 16:39:34.089251 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:34 crc kubenswrapper[4954]: I1127 16:39:34.089267 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:34 crc kubenswrapper[4954]: I1127 16:39:34.089293 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:34 crc kubenswrapper[4954]: I1127 16:39:34.089313 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:34Z","lastTransitionTime":"2025-11-27T16:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:34 crc kubenswrapper[4954]: I1127 16:39:34.192644 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:34 crc kubenswrapper[4954]: I1127 16:39:34.192721 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:34 crc kubenswrapper[4954]: I1127 16:39:34.192740 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:34 crc kubenswrapper[4954]: I1127 16:39:34.192767 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:34 crc kubenswrapper[4954]: I1127 16:39:34.192784 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:34Z","lastTransitionTime":"2025-11-27T16:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:34 crc kubenswrapper[4954]: I1127 16:39:34.296071 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:34 crc kubenswrapper[4954]: I1127 16:39:34.296159 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:34 crc kubenswrapper[4954]: I1127 16:39:34.296183 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:34 crc kubenswrapper[4954]: I1127 16:39:34.296218 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:34 crc kubenswrapper[4954]: I1127 16:39:34.296244 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:34Z","lastTransitionTime":"2025-11-27T16:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:34 crc kubenswrapper[4954]: I1127 16:39:34.400299 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:34 crc kubenswrapper[4954]: I1127 16:39:34.400390 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:34 crc kubenswrapper[4954]: I1127 16:39:34.400410 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:34 crc kubenswrapper[4954]: I1127 16:39:34.400503 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:34 crc kubenswrapper[4954]: I1127 16:39:34.400530 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:34Z","lastTransitionTime":"2025-11-27T16:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:34 crc kubenswrapper[4954]: I1127 16:39:34.503886 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:34 crc kubenswrapper[4954]: I1127 16:39:34.503964 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:34 crc kubenswrapper[4954]: I1127 16:39:34.503983 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:34 crc kubenswrapper[4954]: I1127 16:39:34.504013 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:34 crc kubenswrapper[4954]: I1127 16:39:34.504036 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:34Z","lastTransitionTime":"2025-11-27T16:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:34 crc kubenswrapper[4954]: I1127 16:39:34.607545 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:34 crc kubenswrapper[4954]: I1127 16:39:34.607670 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:34 crc kubenswrapper[4954]: I1127 16:39:34.607689 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:34 crc kubenswrapper[4954]: I1127 16:39:34.607719 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:34 crc kubenswrapper[4954]: I1127 16:39:34.607742 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:34Z","lastTransitionTime":"2025-11-27T16:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:34 crc kubenswrapper[4954]: I1127 16:39:34.712015 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:34 crc kubenswrapper[4954]: I1127 16:39:34.712123 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:34 crc kubenswrapper[4954]: I1127 16:39:34.712153 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:34 crc kubenswrapper[4954]: I1127 16:39:34.712187 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:34 crc kubenswrapper[4954]: I1127 16:39:34.712222 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:34Z","lastTransitionTime":"2025-11-27T16:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:34 crc kubenswrapper[4954]: I1127 16:39:34.817928 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:34 crc kubenswrapper[4954]: I1127 16:39:34.817983 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:34 crc kubenswrapper[4954]: I1127 16:39:34.817999 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:34 crc kubenswrapper[4954]: I1127 16:39:34.818024 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:34 crc kubenswrapper[4954]: I1127 16:39:34.818042 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:34Z","lastTransitionTime":"2025-11-27T16:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:34 crc kubenswrapper[4954]: I1127 16:39:34.920639 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:34 crc kubenswrapper[4954]: I1127 16:39:34.920693 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:34 crc kubenswrapper[4954]: I1127 16:39:34.920712 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:34 crc kubenswrapper[4954]: I1127 16:39:34.920738 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:34 crc kubenswrapper[4954]: I1127 16:39:34.920756 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:34Z","lastTransitionTime":"2025-11-27T16:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.024402 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.024457 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.024476 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.024502 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.024520 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:35Z","lastTransitionTime":"2025-11-27T16:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.128195 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.128304 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.128323 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.128347 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.128365 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:35Z","lastTransitionTime":"2025-11-27T16:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.231758 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.231840 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.231865 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.231898 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.231923 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:35Z","lastTransitionTime":"2025-11-27T16:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.335186 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.335279 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.335301 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.335332 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.335357 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:35Z","lastTransitionTime":"2025-11-27T16:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.439347 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.439417 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.439435 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.439463 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.439484 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:35Z","lastTransitionTime":"2025-11-27T16:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.542521 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.542650 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.542681 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.542728 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.542751 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:35Z","lastTransitionTime":"2025-11-27T16:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.646807 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.646914 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.646939 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.647003 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.647023 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:35Z","lastTransitionTime":"2025-11-27T16:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.661456 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.661501 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.661545 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.661662 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:39:35 crc kubenswrapper[4954]: E1127 16:39:35.661796 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:39:35 crc kubenswrapper[4954]: E1127 16:39:35.662022 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:39:35 crc kubenswrapper[4954]: E1127 16:39:35.662122 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6" Nov 27 16:39:35 crc kubenswrapper[4954]: E1127 16:39:35.662234 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.750007 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.750214 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.750257 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.750303 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.750334 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:35Z","lastTransitionTime":"2025-11-27T16:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.853957 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.854034 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.854054 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.854079 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.854096 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:35Z","lastTransitionTime":"2025-11-27T16:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.957096 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.957176 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.957200 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.957232 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:35 crc kubenswrapper[4954]: I1127 16:39:35.957251 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:35Z","lastTransitionTime":"2025-11-27T16:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.061566 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.061667 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.061685 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.061712 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.061729 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:36Z","lastTransitionTime":"2025-11-27T16:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.165236 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.165306 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.165324 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.165352 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.165369 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:36Z","lastTransitionTime":"2025-11-27T16:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.269114 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.269183 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.269214 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.269255 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.269278 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:36Z","lastTransitionTime":"2025-11-27T16:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.334549 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.334660 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.334685 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.334714 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.334733 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:36Z","lastTransitionTime":"2025-11-27T16:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:36 crc kubenswrapper[4954]: E1127 16:39:36.356292 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"070a8e98-7cab-4ad3-b09c-67172438041d\\\",\\\"systemUUID\\\":\\\"03003ca2-7417-4e94-98d9-1cf03e475029\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:36Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.361837 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.361915 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.361934 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.361962 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.361980 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:36Z","lastTransitionTime":"2025-11-27T16:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:36 crc kubenswrapper[4954]: E1127 16:39:36.382269 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"070a8e98-7cab-4ad3-b09c-67172438041d\\\",\\\"systemUUID\\\":\\\"03003ca2-7417-4e94-98d9-1cf03e475029\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:36Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.388910 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.388977 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.388997 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.389025 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.389050 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:36Z","lastTransitionTime":"2025-11-27T16:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:36 crc kubenswrapper[4954]: E1127 16:39:36.411325 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"070a8e98-7cab-4ad3-b09c-67172438041d\\\",\\\"systemUUID\\\":\\\"03003ca2-7417-4e94-98d9-1cf03e475029\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:36Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.417507 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.417621 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.417646 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.417676 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.417694 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:36Z","lastTransitionTime":"2025-11-27T16:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:36 crc kubenswrapper[4954]: E1127 16:39:36.441387 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"070a8e98-7cab-4ad3-b09c-67172438041d\\\",\\\"systemUUID\\\":\\\"03003ca2-7417-4e94-98d9-1cf03e475029\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:36Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.447226 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.447289 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.447307 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.447334 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.447351 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:36Z","lastTransitionTime":"2025-11-27T16:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:36 crc kubenswrapper[4954]: E1127 16:39:36.468152 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"070a8e98-7cab-4ad3-b09c-67172438041d\\\",\\\"systemUUID\\\":\\\"03003ca2-7417-4e94-98d9-1cf03e475029\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:36Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:36 crc kubenswrapper[4954]: E1127 16:39:36.468265 4954 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.476005 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
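Why every retry above fails identically: the node-identity webhook at 127.0.0.1:9743 presents a serving certificate whose NotAfter date (2025-08-24T17:21:41Z) is about three months behind the node clock (2025-11-27), so the Go TLS client rejects the handshake before the status patch is ever delivered. The following is a minimal illustrative sketch, not kubelet or webhook source: it reproduces the NotBefore/NotAfter window test that yields the log's exact "x509: certificate has expired or is not yet valid" wording, and dials the webhook endpoint (the only detail taken from the log) to read the certificate's dates.

package main

import (
	"crypto/tls"
	"crypto/x509"
	"fmt"
	"time"
)

// checkValidity mirrors the validity-window test crypto/x509 applies during
// verification; the error strings match the wording seen in the log above.
func checkValidity(cert *x509.Certificate, now time.Time) error {
	if now.After(cert.NotAfter) {
		return fmt.Errorf("x509: certificate has expired or is not yet valid: current time %s is after %s",
			now.UTC().Format(time.RFC3339), cert.NotAfter.UTC().Format(time.RFC3339))
	}
	if now.Before(cert.NotBefore) {
		return fmt.Errorf("x509: certificate has expired or is not yet valid: current time %s is before %s",
			now.UTC().Format(time.RFC3339), cert.NotBefore.UTC().Format(time.RFC3339))
	}
	return nil
}

func main() {
	// InsecureSkipVerify lets the handshake complete despite the expired
	// certificate, purely so the leaf certificate's dates can be inspected.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		fmt.Println("dial:", err)
		return
	}
	defer conn.Close()

	certs := conn.ConnectionState().PeerCertificates
	if len(certs) == 0 {
		fmt.Println("no peer certificates presented")
		return
	}
	leaf := certs[0]
	fmt.Printf("subject=%s notBefore=%s notAfter=%s\n",
		leaf.Subject,
		leaf.NotBefore.UTC().Format(time.RFC3339),
		leaf.NotAfter.UTC().Format(time.RFC3339))
	fmt.Println(checkValidity(leaf, time.Now()))
}

Run against this node it should report notAfter=2025-08-24T17:21:41Z and return the same expired-certificate error logged above; until that certificate is rotated (or the clock skew resolved), every node-status patch will keep failing at the webhook regardless of the patch contents.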
event="NodeHasSufficientMemory" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.476104 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.476125 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.476152 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.476176 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:36Z","lastTransitionTime":"2025-11-27T16:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.579682 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.579736 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.579752 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.579775 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.579791 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:36Z","lastTransitionTime":"2025-11-27T16:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.682766 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.682849 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.682867 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.683303 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.683355 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:36Z","lastTransitionTime":"2025-11-27T16:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.786856 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.786908 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.786926 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.786951 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.786970 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:36Z","lastTransitionTime":"2025-11-27T16:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.890104 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.890155 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.890166 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.890185 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.890198 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:36Z","lastTransitionTime":"2025-11-27T16:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.993172 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.993279 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.993298 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.993321 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:36 crc kubenswrapper[4954]: I1127 16:39:36.993339 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:36Z","lastTransitionTime":"2025-11-27T16:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.096713 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.096810 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.096828 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.096854 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.096874 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:37Z","lastTransitionTime":"2025-11-27T16:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.200419 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.200477 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.200494 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.200525 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.200548 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:37Z","lastTransitionTime":"2025-11-27T16:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.303934 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.303993 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.304008 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.304033 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.304048 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:37Z","lastTransitionTime":"2025-11-27T16:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.407176 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.407236 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.407245 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.407262 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.407272 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:37Z","lastTransitionTime":"2025-11-27T16:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.510236 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.510276 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.510287 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.510306 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.510318 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:37Z","lastTransitionTime":"2025-11-27T16:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.613880 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.613961 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.613987 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.614020 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.614044 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:37Z","lastTransitionTime":"2025-11-27T16:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.661730 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.661792 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.661942 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:39:37 crc kubenswrapper[4954]: E1127 16:39:37.661957 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.662002 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:39:37 crc kubenswrapper[4954]: E1127 16:39:37.662144 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6" Nov 27 16:39:37 crc kubenswrapper[4954]: E1127 16:39:37.662246 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:39:37 crc kubenswrapper[4954]: E1127 16:39:37.662325 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.717333 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.717379 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.717395 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.717421 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.717441 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:37Z","lastTransitionTime":"2025-11-27T16:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.820083 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.820140 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.820158 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.820184 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.820201 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:37Z","lastTransitionTime":"2025-11-27T16:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.923842 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.923913 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.923935 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.923964 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:37 crc kubenswrapper[4954]: I1127 16:39:37.923984 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:37Z","lastTransitionTime":"2025-11-27T16:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.027331 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.027402 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.027424 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.027448 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.027465 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:38Z","lastTransitionTime":"2025-11-27T16:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.131907 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.131994 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.132014 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.132051 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.132075 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:38Z","lastTransitionTime":"2025-11-27T16:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.235859 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.235907 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.235918 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.235940 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.235953 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:38Z","lastTransitionTime":"2025-11-27T16:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.338951 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.338993 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.339004 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.339020 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.339032 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:38Z","lastTransitionTime":"2025-11-27T16:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.443079 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.443138 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.443158 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.443185 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.443205 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:38Z","lastTransitionTime":"2025-11-27T16:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.546572 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.546737 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.546759 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.546790 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.546817 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:38Z","lastTransitionTime":"2025-11-27T16:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.650639 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.650695 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.650711 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.650737 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.650756 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:38Z","lastTransitionTime":"2025-11-27T16:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.680042 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a06b3afb-c8f3-4fc2-aa82-f5b20f275a0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://589ee698e003ae1938fae963deb0288be15549fc6efd55fb72e0d40ee3ca325d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2186713e39ca754bb90eb1f84bc523cef94288510d11244c45267085d2f9918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2186713e39ca754bb90eb1f84bc523cef94288510d11244c45267085d2f9918\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:38Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.702734 4954 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:38Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.719997 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lt9bl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f164460-f6b2-4383-9e5e-f4d0045d9690\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc3d4b30c41f8bbff3623b037109b7faca9e2438dfe7240a4fbf3c8fb8c27bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b56lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lt9bl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:38Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.743269 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9mb96" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcc3a6be3f2d6a2d8da09fab1320b33b7c36e0c403916e155274997bcb03c884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5aabb55ded9f58e618e465b5ef892a9098df73cc03b0d2de615dbcb754cd4d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:39:17Z\\\",\\\"message\\\":\\\"2025-11-27T16:38:31+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_43baba0c-e068-4cf6-a5a0-98de61c3f550\\\\n2025-11-27T16:38:31+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_43baba0c-e068-4cf6-a5a0-98de61c3f550 to /host/opt/cni/bin/\\\\n2025-11-27T16:38:32Z [verbose] multus-daemon started\\\\n2025-11-27T16:38:32Z [verbose] Readiness Indicator file check\\\\n2025-11-27T16:39:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r96jj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9mb96\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:38Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.754143 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.754196 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.754213 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.754240 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.755746 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:38Z","lastTransitionTime":"2025-11-27T16:39:38Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.761539 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j2bxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"474d40a8-ea36-4785-8818-6beb58074208\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711fd0edfdc1fc0465c22fd73cdce98005c371cb4a4662314c051add365cc3fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rcvbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75f0d7911572bda6bd48f347e24cddeea563f23cf84a4abd69f961b576999119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rcvbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j2bxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:38Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.780214 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4942b2dc-bb0b-485a-84d6-eeaaaa834d91\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94982bc29f0ee44235509ce47bb0790994962a450b2e27e418f351a3643d885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28bc02faf2534dbf38fbc116fb6b51a528297719f7de0f40d1c9374199391eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7209ac0080d25aaf9cfaba43b4cb35e5c36f015b52469a211b65f4a53a2dcd23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11dd7fc77c9df494e9dac3fd605b1dc7a342fe3fe853a18260a68d29f82738e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11dd7fc77c9df494e9dac3fd605b1dc7a342fe3fe853a18260a68d29f82738e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:38Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.809303 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b7cd63-bb9a-4c77-b67a-e72adc26393a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:38:22Z\\\",\\\"message\\\":\\\"W1127 16:38:11.939802 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 16:38:11.940051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764261491 cert, and key in /tmp/serving-cert-2393175808/serving-signer.crt, /tmp/serving-cert-2393175808/serving-signer.key\\\\nI1127 16:38:12.073962 1 observer_polling.go:159] Starting file observer\\\\nW1127 16:38:12.077982 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 16:38:12.078373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:38:12.081926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2393175808/tls.crt::/tmp/serving-cert-2393175808/tls.key\\\\\\\"\\\\nF1127 16:38:22.478599 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:38Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.830350 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4633bf6a24c281dffedb23b6efec6dff41b512ca353a31a32c3988b523b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:38Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.843237 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-27v67" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df79f3c-9df0-48a0-980f-10ecadf5efd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80589bef6eb84e30399c60ede88844c7917afc5bc0a051e33ac307de7670ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn2f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-27v67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:38Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.860806 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.860862 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.860882 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.860908 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.860925 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:38Z","lastTransitionTime":"2025-11-27T16:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.861740 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed7ac545-28d1-4c54-9952-4b7845b4a475\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5f6e2fcbd93a30e7357a367e184a6f5c6c1af83f618e0fd0d724e51ba71ea08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dbb0d73cb9bddb6148625592ed1aac95ead1e2349f92fb8aba36ec714ed618e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a1ddaf55a730a8e5a53ecff0eef2afd9786d3f249ac18b7b3e3e6649b65fe45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6a464ca56934b2a1b4e31b921d34c3f57d9aacbd965746db957882d36527e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:38Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.878502 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e11dee9902e47c6d0e972a3b8f86123252f000b875f7dff8af31db48e69503d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:38Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.898477 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bd6ec80896ba1c7117ea88193af1f3b9aec353ab889d6864e0b221e4efdf428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72cc2fd437541de22aaa3130acadd5bd1eacd2e45ef0e12d55ce1877ac1965bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:38Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.914799 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:38Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.937241 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"536fc833-8add-426d-9ed0-b63547d316e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c35f962fb1464be093f6b3cc62d79b47d06468ed4c1885c42c1f3f49b911458b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cz8gx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:38Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.968883 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.968950 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:38 crc 
kubenswrapper[4954]: I1127 16:39:38.968971 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.969000 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.969018 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:38Z","lastTransitionTime":"2025-11-27T16:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.976328 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:38Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:38 crc kubenswrapper[4954]: I1127 16:39:38.997151 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a80574-7c60-4f19-985b-3ee313cb7bcd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3bfedfcafb3316fee81a8d1a6d9e4d8c530b7bbb10193341d5021a5acbbfe4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abf93a27d369fc02df1a4508748705f9bbad044d52db659f35896e60e7a8bdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-699qq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:38Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.022990 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9c365fc-0cba-4fcf-b721-30de2b908a56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb34f2dce67efd76368e55b902d1cded4cf016
e3f638b9c5acaf3f00ca2b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81bb34f2dce67efd76368e55b902d1cded4cf016e3f638b9c5acaf3f00ca2b60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:39:19Z\\\",\\\"message\\\":\\\"861940b962e7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1127 16:39:19.634568 6893 services_controller.go:452] Built service openshift-machine-api/machine-api-operator-webhook per-node LB for network=default: []services.LB{}\\\\nI1127 16:39:19.634517 6893 default_network_controller.go:776] Recording success event on pod openshift-kube-scheduler/openshift-kube-scheduler-crc\\\\nI1127 16:39:19.634598 6893 services_controller.go:453] Built service openshift-machine-api/machine-api-operator-webhook template LB for network=default: []services.LB{}\\\\nI1127 16:39:19.634175 6893 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1127 16:39:19.634607 6893 services_controller.go:454] Service openshift-machine-api/machine-api-operator-webhook for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1127 16:39:19.634157 6893 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nF1127 16:39:19.634674 6893 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initializa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:39:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d5zbp_openshift-ovn-kubernetes(c9c365fc-0cba-4fcf-b721-30de2b908a56)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5zbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:39Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.033365 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgsvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af5183f4-5f46-4d64-8ec4-c7b71530cad6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s6vq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s6vq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgsvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:39Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.072124 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.072206 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.072240 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.072271 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.072315 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:39Z","lastTransitionTime":"2025-11-27T16:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
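Every "Failed to update status" record above carries the pod's entire status patch as a Go-quoted string inside the err field, which is why each JSON quote surfaces as \\\" in the journal. A minimal Python sketch (assuming one complete, unwrapped journal record on stdin; the helper name is illustrative) that strips the two quoting layers and pretty-prints the patch:

```python
import json
import re
import sys

# Matches the Go-quoted patch between \"{ ... }\" in the err="..." field.
PATCH_RE = re.compile(r'failed to patch status \\"(\{.*?\})\\" for pod')

def decode_status_patch(record: str) -> dict:
    """Recover the JSON status patch embedded in one 'Failed to update status' record."""
    m = PATCH_RE.search(record)
    if not m:
        raise ValueError("no embedded status patch found in this record")
    s = m.group(1)
    # Each quoting layer turns " into \" and \ into \\ ; decoding the text as a
    # JSON string literal strips exactly one layer. Two layers wrap the patch here.
    for _ in range(2):
        s = json.loads('"' + s + '"')
    return json.loads(s)

if __name__ == "__main__":
    # Expects one complete journal record on stdin (no hard line wrapping).
    print(json.dumps(decode_status_patch(sys.stdin.read()), indent=2))
```

The decoded object is the status the kubelet's status manager was trying to apply when the webhook call failed.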
Has your network provider started?"} Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.175280 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.175335 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.175345 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.175363 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.175376 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:39Z","lastTransitionTime":"2025-11-27T16:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.278572 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.278700 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.278726 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.278760 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.278783 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:39Z","lastTransitionTime":"2025-11-27T16:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.381702 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.381767 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.381784 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.381835 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.381855 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:39Z","lastTransitionTime":"2025-11-27T16:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.485156 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.485234 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.485256 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.485287 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.485313 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:39Z","lastTransitionTime":"2025-11-27T16:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.588863 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.588930 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.588954 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.588984 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.589005 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:39Z","lastTransitionTime":"2025-11-27T16:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.661051 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.661128 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.661171 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.661430 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:39:39 crc kubenswrapper[4954]: E1127 16:39:39.661420 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6" Nov 27 16:39:39 crc kubenswrapper[4954]: E1127 16:39:39.661648 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:39:39 crc kubenswrapper[4954]: E1127 16:39:39.661763 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:39:39 crc kubenswrapper[4954]: E1127 16:39:39.661897 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.691844 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.691929 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.691956 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.691990 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.692015 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:39Z","lastTransitionTime":"2025-11-27T16:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
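Each patch attempt above dies the same way: the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 presents a certificate whose notAfter (2025-08-24T17:21:41Z, per the error text) predates the node's clock (2025-11-27). A sketch that reproduces the kubelet's verdict by fetching the certificate with verification disabled and comparing its validity window against the current time; it assumes the third-party cryptography package (>= 42 for the *_utc accessors):

```python
import datetime
import socket
import ssl

from cryptography import x509  # third-party: pip install cryptography

HOST, PORT = "127.0.0.1", 9743  # webhook endpoint taken from the log

ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE  # skip verification so an expired cert can still be fetched

with socket.create_connection((HOST, PORT), timeout=5) as sock:
    with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
        der = tls.getpeercert(binary_form=True)

cert = x509.load_der_x509_certificate(der)
now = datetime.datetime.now(datetime.timezone.utc)
print("notAfter:", cert.not_valid_after_utc)
print("expired: ", now > cert.not_valid_after_utc)
```

Disabling verification is what lets the expired certificate be inspected at all; a verifying client aborts the handshake exactly as the kubelet does here.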
Has your network provider started?"} Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.795165 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.795236 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.795253 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.795281 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.795299 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:39Z","lastTransitionTime":"2025-11-27T16:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.898371 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.898439 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.898455 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.898481 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:39 crc kubenswrapper[4954]: I1127 16:39:39.898499 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:39Z","lastTransitionTime":"2025-11-27T16:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.001245 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.001327 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.001346 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.001372 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.001389 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:40Z","lastTransitionTime":"2025-11-27T16:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
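The node-status block (four "Recording event message for node" records plus one "Node became not ready") repeats every ~100 ms for as long as the CNI config is missing. A quick stdlib sketch for tallying those events when skimming a capture like this one (pipe the journal text through stdin):

```python
import collections
import re
import sys

# Matches the structured fields of a "Recording event message for node" record.
EVENT_RE = re.compile(r'"Recording event message for node" node="([^"]+)" event="([^"]+)"')

counts = collections.Counter(
    (node, event)
    for line in sys.stdin
    for node, event in EVENT_RE.findall(line)
)
for (node, event), n in counts.most_common():
    print(f"{n:6d}  node={node}  event={event}")
```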
Has your network provider started?"} Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.105019 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.105132 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.105164 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.105201 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.105227 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:40Z","lastTransitionTime":"2025-11-27T16:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.208018 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.208073 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.208085 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.208103 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.208115 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:40Z","lastTransitionTime":"2025-11-27T16:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.311644 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.311733 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.311757 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.311789 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.311811 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:40Z","lastTransitionTime":"2025-11-27T16:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.420384 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.420447 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.420461 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.420484 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.420504 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:40Z","lastTransitionTime":"2025-11-27T16:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.524793 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.524858 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.524879 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.524905 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.524924 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:40Z","lastTransitionTime":"2025-11-27T16:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.628787 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.628835 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.628848 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.628864 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.628875 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:40Z","lastTransitionTime":"2025-11-27T16:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.731981 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.732054 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.732074 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.732098 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.732118 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:40Z","lastTransitionTime":"2025-11-27T16:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.835042 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.835110 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.835138 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.835172 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.835196 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:40Z","lastTransitionTime":"2025-11-27T16:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.938491 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.938566 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.938655 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.938716 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:40 crc kubenswrapper[4954]: I1127 16:39:40.938763 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:40Z","lastTransitionTime":"2025-11-27T16:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.042465 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.042558 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.042613 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.042651 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.042710 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:41Z","lastTransitionTime":"2025-11-27T16:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.145713 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.145783 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.145805 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.145839 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.145859 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:41Z","lastTransitionTime":"2025-11-27T16:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.248449 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.248500 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.248516 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.248539 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.248556 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:41Z","lastTransitionTime":"2025-11-27T16:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.351840 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.351916 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.351933 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.351962 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.351983 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:41Z","lastTransitionTime":"2025-11-27T16:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.455428 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.455493 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.455515 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.455546 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.455570 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:41Z","lastTransitionTime":"2025-11-27T16:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.558838 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.558916 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.558935 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.558999 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.559022 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:41Z","lastTransitionTime":"2025-11-27T16:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.661092 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.661251 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:39:41 crc kubenswrapper[4954]: E1127 16:39:41.661416 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.661482 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.661534 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:39:41 crc kubenswrapper[4954]: E1127 16:39:41.661722 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6" Nov 27 16:39:41 crc kubenswrapper[4954]: E1127 16:39:41.661829 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:39:41 crc kubenswrapper[4954]: E1127 16:39:41.661950 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.662748 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.662807 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.662827 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.662854 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.662872 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:41Z","lastTransitionTime":"2025-11-27T16:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.766796 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.766867 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.766888 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.766917 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.766935 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:41Z","lastTransitionTime":"2025-11-27T16:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.870800 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.870867 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.870886 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.870913 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.870930 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:41Z","lastTransitionTime":"2025-11-27T16:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.974516 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.974616 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.974634 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.974663 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:41 crc kubenswrapper[4954]: I1127 16:39:41.974684 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:41Z","lastTransitionTime":"2025-11-27T16:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:42 crc kubenswrapper[4954]: I1127 16:39:42.077557 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:42 crc kubenswrapper[4954]: I1127 16:39:42.077663 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:42 crc kubenswrapper[4954]: I1127 16:39:42.077681 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:42 crc kubenswrapper[4954]: I1127 16:39:42.077708 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:42 crc kubenswrapper[4954]: I1127 16:39:42.077733 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:42Z","lastTransitionTime":"2025-11-27T16:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:42 crc kubenswrapper[4954]: I1127 16:39:42.180561 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:42 crc kubenswrapper[4954]: I1127 16:39:42.180671 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:42 crc kubenswrapper[4954]: I1127 16:39:42.180693 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:42 crc kubenswrapper[4954]: I1127 16:39:42.180722 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:42 crc kubenswrapper[4954]: I1127 16:39:42.180741 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:42Z","lastTransitionTime":"2025-11-27T16:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:42 crc kubenswrapper[4954]: I1127 16:39:42.283982 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:42 crc kubenswrapper[4954]: I1127 16:39:42.284163 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:42 crc kubenswrapper[4954]: I1127 16:39:42.284201 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:42 crc kubenswrapper[4954]: I1127 16:39:42.284233 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:42 crc kubenswrapper[4954]: I1127 16:39:42.284253 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:42Z","lastTransitionTime":"2025-11-27T16:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:42 crc kubenswrapper[4954]: I1127 16:39:42.388379 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:42 crc kubenswrapper[4954]: I1127 16:39:42.388455 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:42 crc kubenswrapper[4954]: I1127 16:39:42.388473 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:42 crc kubenswrapper[4954]: I1127 16:39:42.388499 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:42 crc kubenswrapper[4954]: I1127 16:39:42.388522 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:42Z","lastTransitionTime":"2025-11-27T16:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:42 crc kubenswrapper[4954]: I1127 16:39:42.491843 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:42 crc kubenswrapper[4954]: I1127 16:39:42.491899 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:42 crc kubenswrapper[4954]: I1127 16:39:42.491915 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:42 crc kubenswrapper[4954]: I1127 16:39:42.491942 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:42 crc kubenswrapper[4954]: I1127 16:39:42.491963 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:42Z","lastTransitionTime":"2025-11-27T16:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:42 crc kubenswrapper[4954]: I1127 16:39:42.594700 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:42 crc kubenswrapper[4954]: I1127 16:39:42.594767 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:42 crc kubenswrapper[4954]: I1127 16:39:42.594785 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:42 crc kubenswrapper[4954]: I1127 16:39:42.594811 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:42 crc kubenswrapper[4954]: I1127 16:39:42.594828 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:42Z","lastTransitionTime":"2025-11-27T16:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:42 crc kubenswrapper[4954]: I1127 16:39:42.697391 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:42 crc kubenswrapper[4954]: I1127 16:39:42.697451 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:42 crc kubenswrapper[4954]: I1127 16:39:42.697478 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:42 crc kubenswrapper[4954]: I1127 16:39:42.697509 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:42 crc kubenswrapper[4954]: I1127 16:39:42.697531 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:42Z","lastTransitionTime":"2025-11-27T16:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:42 crc kubenswrapper[4954]: I1127 16:39:42.800677 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:42 crc kubenswrapper[4954]: I1127 16:39:42.800745 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:42 crc kubenswrapper[4954]: I1127 16:39:42.800762 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:42 crc kubenswrapper[4954]: I1127 16:39:42.800788 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:42 crc kubenswrapper[4954]: I1127 16:39:42.800808 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:42Z","lastTransitionTime":"2025-11-27T16:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:42 crc kubenswrapper[4954]: I1127 16:39:42.904205 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:42 crc kubenswrapper[4954]: I1127 16:39:42.904263 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:42 crc kubenswrapper[4954]: I1127 16:39:42.904279 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:42 crc kubenswrapper[4954]: I1127 16:39:42.904303 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:42 crc kubenswrapper[4954]: I1127 16:39:42.904321 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:42Z","lastTransitionTime":"2025-11-27T16:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.007332 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.007400 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.007424 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.007454 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.007475 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:43Z","lastTransitionTime":"2025-11-27T16:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.117082 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.117142 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.117161 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.117187 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.117213 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:43Z","lastTransitionTime":"2025-11-27T16:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.220647 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.220699 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.220707 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.220724 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.220734 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:43Z","lastTransitionTime":"2025-11-27T16:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.324322 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.324390 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.324415 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.324445 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.324466 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:43Z","lastTransitionTime":"2025-11-27T16:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.427477 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.427535 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.427547 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.427569 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.427600 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:43Z","lastTransitionTime":"2025-11-27T16:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.531571 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.531708 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.531733 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.531773 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.531799 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:43Z","lastTransitionTime":"2025-11-27T16:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.635032 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.635112 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.635128 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.635165 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.635182 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:43Z","lastTransitionTime":"2025-11-27T16:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.661449 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.661571 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:39:43 crc kubenswrapper[4954]: E1127 16:39:43.661681 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:39:43 crc kubenswrapper[4954]: E1127 16:39:43.661822 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.661943 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.662121 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:39:43 crc kubenswrapper[4954]: E1127 16:39:43.662190 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6" Nov 27 16:39:43 crc kubenswrapper[4954]: E1127 16:39:43.662394 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.738013 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.738056 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.738069 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.738089 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.738101 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:43Z","lastTransitionTime":"2025-11-27T16:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.842282 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.843055 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.843088 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.843122 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.843145 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:43Z","lastTransitionTime":"2025-11-27T16:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.947010 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.947076 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.947095 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.947113 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:43 crc kubenswrapper[4954]: I1127 16:39:43.947125 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:43Z","lastTransitionTime":"2025-11-27T16:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:44 crc kubenswrapper[4954]: I1127 16:39:44.049943 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:44 crc kubenswrapper[4954]: I1127 16:39:44.050008 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:44 crc kubenswrapper[4954]: I1127 16:39:44.050030 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:44 crc kubenswrapper[4954]: I1127 16:39:44.050060 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:44 crc kubenswrapper[4954]: I1127 16:39:44.050083 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:44Z","lastTransitionTime":"2025-11-27T16:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:44 crc kubenswrapper[4954]: I1127 16:39:44.153696 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:44 crc kubenswrapper[4954]: I1127 16:39:44.153762 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:44 crc kubenswrapper[4954]: I1127 16:39:44.153780 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:44 crc kubenswrapper[4954]: I1127 16:39:44.153808 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:44 crc kubenswrapper[4954]: I1127 16:39:44.153826 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:44Z","lastTransitionTime":"2025-11-27T16:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:44 crc kubenswrapper[4954]: I1127 16:39:44.256262 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:44 crc kubenswrapper[4954]: I1127 16:39:44.256304 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:44 crc kubenswrapper[4954]: I1127 16:39:44.256314 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:44 crc kubenswrapper[4954]: I1127 16:39:44.256334 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:44 crc kubenswrapper[4954]: I1127 16:39:44.256344 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:44Z","lastTransitionTime":"2025-11-27T16:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:44 crc kubenswrapper[4954]: I1127 16:39:44.360325 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:44 crc kubenswrapper[4954]: I1127 16:39:44.360414 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:44 crc kubenswrapper[4954]: I1127 16:39:44.360474 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:44 crc kubenswrapper[4954]: I1127 16:39:44.360511 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:44 crc kubenswrapper[4954]: I1127 16:39:44.360535 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:44Z","lastTransitionTime":"2025-11-27T16:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:44 crc kubenswrapper[4954]: I1127 16:39:44.464521 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:44 crc kubenswrapper[4954]: I1127 16:39:44.464616 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:44 crc kubenswrapper[4954]: I1127 16:39:44.464629 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:44 crc kubenswrapper[4954]: I1127 16:39:44.464651 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:44 crc kubenswrapper[4954]: I1127 16:39:44.464665 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:44Z","lastTransitionTime":"2025-11-27T16:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:44 crc kubenswrapper[4954]: I1127 16:39:44.568432 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:44 crc kubenswrapper[4954]: I1127 16:39:44.568505 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:44 crc kubenswrapper[4954]: I1127 16:39:44.568518 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:44 crc kubenswrapper[4954]: I1127 16:39:44.568615 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:44 crc kubenswrapper[4954]: I1127 16:39:44.568639 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:44Z","lastTransitionTime":"2025-11-27T16:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:44 crc kubenswrapper[4954]: I1127 16:39:44.663822 4954 scope.go:117] "RemoveContainer" containerID="81bb34f2dce67efd76368e55b902d1cded4cf016e3f638b9c5acaf3f00ca2b60" Nov 27 16:39:44 crc kubenswrapper[4954]: E1127 16:39:44.664151 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-d5zbp_openshift-ovn-kubernetes(c9c365fc-0cba-4fcf-b721-30de2b908a56)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" Nov 27 16:39:44 crc kubenswrapper[4954]: I1127 16:39:44.671198 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:44 crc kubenswrapper[4954]: I1127 16:39:44.671255 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:44 crc kubenswrapper[4954]: I1127 16:39:44.671268 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:44 crc kubenswrapper[4954]: I1127 16:39:44.671288 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:44 crc kubenswrapper[4954]: I1127 16:39:44.671302 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:44Z","lastTransitionTime":"2025-11-27T16:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Nov 27 16:39:44 crc kubenswrapper[4954]: I1127 16:39:44.671198 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:44 crc kubenswrapper[4954]: I1127 16:39:44.671255 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:44 crc kubenswrapper[4954]: I1127 16:39:44.671268 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:44 crc kubenswrapper[4954]: I1127 16:39:44.671288 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:44 crc kubenswrapper[4954]: I1127 16:39:44.671302 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:44Z","lastTransitionTime":"2025-11-27T16:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:45 crc kubenswrapper[4954]: I1127 16:39:45.614304 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:45 crc kubenswrapper[4954]: I1127 16:39:45.614371 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:45 crc kubenswrapper[4954]: I1127 16:39:45.614390 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:45 crc kubenswrapper[4954]: I1127 16:39:45.614422 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:45 crc kubenswrapper[4954]: I1127 16:39:45.614440 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:45Z","lastTransitionTime":"2025-11-27T16:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
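The condition={...} payload logged by setters.go is plain JSON describing the Node's Ready condition. A sketch of parsing it directly (the message value is abbreviated here; the field names come from the log itself):

```python
# Sketch: the "Node became not ready" condition object is ordinary JSON,
# so the fields the kubelet patches onto the Node can be read back as-is.
import json

raw = '''{"type":"Ready","status":"False",
"lastHeartbeatTime":"2025-11-27T16:39:45Z",
"lastTransitionTime":"2025-11-27T16:39:45Z",
"reason":"KubeletNotReady",
"message":"container runtime network not ready: NetworkReady=false ..."}'''

cond = json.loads(raw)
assert cond["type"] == "Ready" and cond["status"] == "False"
print(f'{cond["reason"]}: Ready went False at {cond["lastTransitionTime"]}')
```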
Nov 27 16:39:45 crc kubenswrapper[4954]: I1127 16:39:45.661667 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh"
Nov 27 16:39:45 crc kubenswrapper[4954]: I1127 16:39:45.661760 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 27 16:39:45 crc kubenswrapper[4954]: E1127 16:39:45.661912 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6"
Nov 27 16:39:45 crc kubenswrapper[4954]: I1127 16:39:45.662216 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 27 16:39:45 crc kubenswrapper[4954]: I1127 16:39:45.662210 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 27 16:39:45 crc kubenswrapper[4954]: E1127 16:39:45.662385 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 27 16:39:45 crc kubenswrapper[4954]: E1127 16:39:45.663816 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 27 16:39:45 crc kubenswrapper[4954]: E1127 16:39:45.664940 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 27 16:39:45 crc kubenswrapper[4954]: I1127 16:39:45.718009 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:45 crc kubenswrapper[4954]: I1127 16:39:45.718078 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:45 crc kubenswrapper[4954]: I1127 16:39:45.718097 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:45 crc kubenswrapper[4954]: I1127 16:39:45.718126 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:45 crc kubenswrapper[4954]: I1127 16:39:45.718145 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:45Z","lastTransitionTime":"2025-11-27T16:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:46 crc kubenswrapper[4954]: I1127 16:39:46.756948 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:46 crc kubenswrapper[4954]: I1127 16:39:46.757048 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:46 crc kubenswrapper[4954]: I1127 16:39:46.757073 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:46 crc kubenswrapper[4954]: I1127 16:39:46.757107 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:46 crc kubenswrapper[4954]: I1127 16:39:46.757133 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:46Z","lastTransitionTime":"2025-11-27T16:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
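The "Error updating node status, will retry" entries that follow are a bounded retry loop around the node-status PATCH. A sketch of that pattern (the 5-attempt budget is an assumption based on kubelet convention, and the stub names here are hypothetical):

```python
# Sketch (not kubelet source) of the "will retry" pattern seen below:
# retry a status PATCH a fixed number of times, then give up.
NODE_STATUS_UPDATE_RETRY = 5  # assumed retry budget

def update_node_status(patch_status) -> None:
    last_err = None
    for _attempt in range(NODE_STATUS_UPDATE_RETRY):
        try:
            patch_status()           # e.g. PATCH /api/v1/nodes/crc/status
            return
        except RuntimeError as err:  # e.g. webhook/TLS failure, as below
            last_err = err
            print(f"Error updating node status, will retry: {err}")
    raise RuntimeError(f"update node status exceeds retry count: {last_err}")

if __name__ == "__main__":
    def failing_patch():  # hypothetical stub that always fails
        raise RuntimeError("failed calling webhook: certificate has expired")
    try:
        update_node_status(failing_patch)
    except RuntimeError as err:
        print(err)
```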
Nov 27 16:39:46 crc kubenswrapper[4954]: E1127 16:39:46.786042 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"070a8e98-7cab-4ad3-b09c-67172438041d\\\",\\\"systemUUID\\\":\\\"03003ca2-7417-4e94-98d9-1cf03e475029\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:46Z is after 2025-08-24T17:21:41Z"
Nov 27 16:39:46 crc kubenswrapper[4954]: I1127 16:39:46.792114 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:46 crc kubenswrapper[4954]: I1127 16:39:46.792177 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:46 crc kubenswrapper[4954]: I1127 16:39:46.792198 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:46 crc kubenswrapper[4954]: I1127 16:39:46.792224 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:46 crc kubenswrapper[4954]: I1127 16:39:46.792247 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:46Z","lastTransitionTime":"2025-11-27T16:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:39:46 crc kubenswrapper[4954]: I1127 16:39:46.821653 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:39:46 crc kubenswrapper[4954]: I1127 16:39:46.821692 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:39:46 crc kubenswrapper[4954]: I1127 16:39:46.821704 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:39:46 crc kubenswrapper[4954]: I1127 16:39:46.821751 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:39:46 crc kubenswrapper[4954]: I1127 16:39:46.821764 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:46Z","lastTransitionTime":"2025-11-27T16:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"070a8e98-7cab-4ad3-b09c-67172438041d\\\",\\\"systemUUID\\\":\\\"03003ca2-7417-4e94-98d9-1cf03e475029\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:46Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:46 crc kubenswrapper[4954]: I1127 16:39:46.844460 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:46 crc kubenswrapper[4954]: I1127 16:39:46.844533 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 27 16:39:46 crc kubenswrapper[4954]: I1127 16:39:46.844553 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:46 crc kubenswrapper[4954]: I1127 16:39:46.844609 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:46 crc kubenswrapper[4954]: I1127 16:39:46.844630 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:46Z","lastTransitionTime":"2025-11-27T16:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:46 crc kubenswrapper[4954]: E1127 16:39:46.866376 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"070a8e98-7cab-4ad3-b09c-67172438041d\\\",\\\"systemUUID\\\":\\\"03003ca2-7417-4e94-98d9-1cf03e475029\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:46Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:46 crc kubenswrapper[4954]: I1127 16:39:46.871238 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:46 crc kubenswrapper[4954]: I1127 16:39:46.871281 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 27 16:39:46 crc kubenswrapper[4954]: I1127 16:39:46.871299 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:46 crc kubenswrapper[4954]: I1127 16:39:46.871321 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:46 crc kubenswrapper[4954]: I1127 16:39:46.871345 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:46Z","lastTransitionTime":"2025-11-27T16:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:46 crc kubenswrapper[4954]: E1127 16:39:46.892651 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"070a8e98-7cab-4ad3-b09c-67172438041d\\\",\\\"systemUUID\\\":\\\"03003ca2-7417-4e94-98d9-1cf03e475029\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:46Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:46 crc kubenswrapper[4954]: E1127 16:39:46.892882 4954 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 27 16:39:46 crc kubenswrapper[4954]: I1127 16:39:46.894760 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 27 16:39:46 crc kubenswrapper[4954]: I1127 16:39:46.894809 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:46 crc kubenswrapper[4954]: I1127 16:39:46.894825 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:46 crc kubenswrapper[4954]: I1127 16:39:46.894845 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:46 crc kubenswrapper[4954]: I1127 16:39:46.894864 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:46Z","lastTransitionTime":"2025-11-27T16:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:46 crc kubenswrapper[4954]: I1127 16:39:46.998166 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:46 crc kubenswrapper[4954]: I1127 16:39:46.998229 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:46 crc kubenswrapper[4954]: I1127 16:39:46.998245 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:46 crc kubenswrapper[4954]: I1127 16:39:46.998273 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:46 crc kubenswrapper[4954]: I1127 16:39:46.998292 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:46Z","lastTransitionTime":"2025-11-27T16:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.101929 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.101984 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.102003 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.102032 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.102053 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:47Z","lastTransitionTime":"2025-11-27T16:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.204636 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.204718 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.204749 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.204787 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.204806 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:47Z","lastTransitionTime":"2025-11-27T16:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.309033 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.309104 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.309123 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.309154 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.309182 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:47Z","lastTransitionTime":"2025-11-27T16:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.412984 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.413062 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.413084 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.413115 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.413138 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:47Z","lastTransitionTime":"2025-11-27T16:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.516501 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.516572 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.516622 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.516657 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.516682 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:47Z","lastTransitionTime":"2025-11-27T16:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.620477 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.620538 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.620548 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.620566 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.620602 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:47Z","lastTransitionTime":"2025-11-27T16:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.661488 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.661499 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:39:47 crc kubenswrapper[4954]: E1127 16:39:47.661773 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.661499 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.661521 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:39:47 crc kubenswrapper[4954]: E1127 16:39:47.661888 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:39:47 crc kubenswrapper[4954]: E1127 16:39:47.662199 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:39:47 crc kubenswrapper[4954]: E1127 16:39:47.662284 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6" Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.723918 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.723972 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.723984 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.724003 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.724017 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:47Z","lastTransitionTime":"2025-11-27T16:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.827128 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.827184 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.827201 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.827229 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.827248 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:47Z","lastTransitionTime":"2025-11-27T16:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.931540 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.931657 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.931679 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.931710 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.931731 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:47Z","lastTransitionTime":"2025-11-27T16:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:47 crc kubenswrapper[4954]: I1127 16:39:47.976894 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/af5183f4-5f46-4d64-8ec4-c7b71530cad6-metrics-certs\") pod \"network-metrics-daemon-hgsvh\" (UID: \"af5183f4-5f46-4d64-8ec4-c7b71530cad6\") " pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:39:47 crc kubenswrapper[4954]: E1127 16:39:47.977160 4954 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 27 16:39:47 crc kubenswrapper[4954]: E1127 16:39:47.977296 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af5183f4-5f46-4d64-8ec4-c7b71530cad6-metrics-certs podName:af5183f4-5f46-4d64-8ec4-c7b71530cad6 nodeName:}" failed. No retries permitted until 2025-11-27 16:40:51.977264079 +0000 UTC m=+163.994704419 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/af5183f4-5f46-4d64-8ec4-c7b71530cad6-metrics-certs") pod "network-metrics-daemon-hgsvh" (UID: "af5183f4-5f46-4d64-8ec4-c7b71530cad6") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.035155 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.035567 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.035733 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.035871 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.035969 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:48Z","lastTransitionTime":"2025-11-27T16:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.139335 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.139451 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.139481 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.139522 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.139553 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:48Z","lastTransitionTime":"2025-11-27T16:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.242405 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.242474 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.242484 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.242502 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.242511 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:48Z","lastTransitionTime":"2025-11-27T16:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.345859 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.345909 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.345921 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.345940 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.345952 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:48Z","lastTransitionTime":"2025-11-27T16:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.449542 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.449699 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.449728 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.449753 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.449769 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:48Z","lastTransitionTime":"2025-11-27T16:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.553874 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.553936 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.553958 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.553982 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.553999 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:48Z","lastTransitionTime":"2025-11-27T16:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.657260 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.657322 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.657334 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.657356 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.657370 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:48Z","lastTransitionTime":"2025-11-27T16:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.682537 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a80574-7c60-4f19-985b-3ee313cb7bcd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3bfedfcafb3316fee81a8d1a6d9e4d8c530b7bbb10193341d5021a5acbbfe4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abf93a27d369fc02df1a4508748705f9bbad044d52db659f35896e60e7a8bdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwzjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-699qq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:48Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.717679 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9c365fc-0cba-4fcf-b721-30de2b908a56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb34f2dce67efd76368e55b902d1cded4cf016e3f638b9c5acaf3f00ca2b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81bb34f2dce67efd76368e55b902d1cded4cf016e3f638b9c5acaf3f00ca2b60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:39:19Z\\\",\\\"message\\\":\\\"861940b962e7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1127 16:39:19.634568 6893 services_controller.go:452] Built service openshift-machine-api/machine-api-operator-webhook per-node LB for network=default: []services.LB{}\\\\nI1127 16:39:19.634517 6893 default_network_controller.go:776] Recording success event on pod openshift-kube-scheduler/openshift-kube-scheduler-crc\\\\nI1127 16:39:19.634598 6893 services_controller.go:453] Built service openshift-machine-api/machine-api-operator-webhook template LB for network=default: []services.LB{}\\\\nI1127 16:39:19.634175 6893 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1127 16:39:19.634607 6893 services_controller.go:454] Service openshift-machine-api/machine-api-operator-webhook for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1127 16:39:19.634157 6893 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nF1127 16:39:19.634674 6893 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller 
initializa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:39:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-d5zbp_openshift-ovn-kubernetes(c9c365fc-0cba-4fcf-b721-30de2b908a56)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27hxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5zbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:48Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.733860 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgsvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af5183f4-5f46-4d64-8ec4-c7b71530cad6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s6vq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s6vq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgsvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:48Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.752776 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:48Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.760191 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.760272 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.760282 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.760321 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.760333 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:48Z","lastTransitionTime":"2025-11-27T16:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.770796 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:48Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.785539 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lt9bl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f164460-f6b2-4383-9e5e-f4d0045d9690\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc3d4b30c41f8bbff3623b037109b7faca9e2438dfe7240a4fbf3c8fb8c27bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b56lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lt9bl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:48Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.800620 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a06b3afb-c8f3-4fc2-aa82-f5b20f275a0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://589ee698e003ae1938fae963deb0288be15549fc6efd55fb72e0d40ee3ca325d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2186713e39ca754bb90eb1f84bc523cef94288510d11244c45267085d2f9918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2186713e39ca754bb90eb1f84bc523cef94288510d11244c45267085d2f9918\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:48Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.822034 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b7cd63-bb9a-4c77-b67a-e72adc26393a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:38:22Z\\\",\\\"message\\\":\\\"W1127 16:38:11.939802 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 16:38:11.940051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764261491 cert, and key in /tmp/serving-cert-2393175808/serving-signer.crt, /tmp/serving-cert-2393175808/serving-signer.key\\\\nI1127 16:38:12.073962 1 observer_polling.go:159] Starting file observer\\\\nW1127 16:38:12.077982 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 16:38:12.078373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:38:12.081926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2393175808/tls.crt::/tmp/serving-cert-2393175808/tls.key\\\\\\\"\\\\nF1127 16:38:22.478599 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:48Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.842852 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4633bf6a24c281dffedb23b6efec6dff41b512ca353a31a32c3988b523b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:48Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.857375 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-27v67" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df79f3c-9df0-48a0-980f-10ecadf5efd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80589bef6eb84e30399c60ede88844c7917afc5bc0a051e33ac307de7670ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn2f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-27v67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:48Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.863289 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.863367 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.863391 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.863422 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.863445 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:48Z","lastTransitionTime":"2025-11-27T16:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.875398 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9mb96" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5bda3ef-ba2c-424a-ba4a-432053d1c40d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcc3a6be3f2d6a2d8da09fab1320b33b7c36e0c403916e155274997bcb03c884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5aabb55ded9f58e618e465b5ef892a9098df73cc03b0d2de615dbcb754cd4d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:39:17Z\\\",\\\"message\\\":\\\"2025-11-27T16:38:31+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_43baba0c-e068-4cf6-a5a0-98de61c3f550\\\\n2025-11-27T16:38:31+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_43baba0c-e068-4cf6-a5a0-98de61c3f550 to /host/opt/cni/bin/\\\\n2025-11-27T16:38:32Z [verbose] multus-daemon started\\\\n2025-11-27T16:38:32Z [verbose] Readiness Indicator file check\\\\n2025-11-27T16:39:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r96jj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9mb96\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:48Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.889198 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j2bxm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"474d40a8-ea36-4785-8818-6beb58074208\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711fd0edfdc1fc0465c22fd73cdce98005c371cb4a4662314c051add365cc3fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rcvbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75f0d7911572bda6bd48f347e24cddeea563f23cf84a4abd69f961b576999119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rcvbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j2bxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:48Z is after 2025-08-24T17:21:41Z" Nov 27 
16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.904186 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4942b2dc-bb0b-485a-84d6-eeaaaa834d91\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94982bc29f0ee44235509ce47bb0790994962a450b2e27e418f351a3643d885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28bc02faf2534dbf38fbc116fb6b51a528297719f7de0f40d1c9374199391eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7209ac0080d25aaf9cfaba43b4cb35e5c36f015b52469a211b65f4a53a2dcd23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11dd7fc77c9df494e9dac3fd605b1dc7a342fe3fe853a18260a68d29f82738e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11dd7fc77c9df494e9dac3fd605b1dc7a342fe3fe853a18260a68d29f82738e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:48Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.923999 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e11dee9902e47c6d0e972a3b8f86123252f000b875f7dff8af31db48e69503d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:48Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.940048 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bd6ec80896ba1c7117ea88193af1f3b9aec353ab889d6864e0b221e4efdf428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72cc2fd437541de22aaa3130acadd5bd1eacd2e45ef0e12d55ce1877ac1965bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:48Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.959316 
4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:48Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.967264 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.967320 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.967364 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.967393 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.967413 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:48Z","lastTransitionTime":"2025-11-27T16:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.979744 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"536fc833-8add-426d-9ed0-b63547d316e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c35f962fb1464be093f6b3cc62d79b47d06468ed4c1885c42c1f3f49b911458b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93b4ad70a59e77b038862c2106a344273d5b450f30d8eed7879ed445edb5004c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de62dc655756c06be57a16b11fd6d9476904fdbdd1125d6e38c58558c591a90f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://112d4c6ada7735a4733d3fc03419b1039365e99d8d043a4ca63883fd430a1623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b01fffd25f777b482b34bdc06ee02b5e5bf567210a84f95a641a3873315ca988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26928f3001d280c15475bbdf4509f98ce6cb12fe3eec6095a36bae800d017e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ba48b4c6374dc6a999db9eb8f55e38d9e20d11be0cd6e74091c751a4afd685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bzf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cz8gx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:48Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:48 crc kubenswrapper[4954]: I1127 16:39:48.999762 4954 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed7ac545-28d1-4c54-9952-4b7845b4a475\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5f6e2fcbd93a30e7357a367e184a6f5c6c1af83f618e0fd0d724e51ba71ea08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dbb0d73cb9bddb6148625592ed1aac95ead1e2349f92fb8aba36ec714ed618e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a1ddaf55a730a8e5a53ecff0eef2afd9786d3f249ac18b7b3e3e6649b65fe45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6a464ca56934b2a1b4e31b921d34c3f57d9aacbd965746db
957882d36527e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:38:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:48Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.070878 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.070997 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.071020 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.071056 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.071082 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:49Z","lastTransitionTime":"2025-11-27T16:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.174356 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.174436 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.174462 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.174492 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.174513 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:49Z","lastTransitionTime":"2025-11-27T16:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.278970 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.279059 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.279078 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.279102 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.279120 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:49Z","lastTransitionTime":"2025-11-27T16:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.382673 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.382734 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.382751 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.382779 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.382800 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:49Z","lastTransitionTime":"2025-11-27T16:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.486243 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.486405 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.486425 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.486455 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.486484 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:49Z","lastTransitionTime":"2025-11-27T16:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.591059 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.591125 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.591137 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.591180 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.591195 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:49Z","lastTransitionTime":"2025-11-27T16:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.661046 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.661243 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:39:49 crc kubenswrapper[4954]: E1127 16:39:49.661460 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.661519 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.661616 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:39:49 crc kubenswrapper[4954]: E1127 16:39:49.661802 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:39:49 crc kubenswrapper[4954]: E1127 16:39:49.661931 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:39:49 crc kubenswrapper[4954]: E1127 16:39:49.662086 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6" Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.694181 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.694245 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.694279 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.694317 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.694341 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:49Z","lastTransitionTime":"2025-11-27T16:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.797808 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.797872 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.797885 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.797907 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.797923 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:49Z","lastTransitionTime":"2025-11-27T16:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.902329 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.902409 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.902427 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.902456 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:49 crc kubenswrapper[4954]: I1127 16:39:49.902472 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:49Z","lastTransitionTime":"2025-11-27T16:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.006831 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.006919 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.006947 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.006982 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.007010 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:50Z","lastTransitionTime":"2025-11-27T16:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.109854 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.109936 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.109953 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.109980 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.109999 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:50Z","lastTransitionTime":"2025-11-27T16:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.213340 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.213417 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.213436 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.213464 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.213485 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:50Z","lastTransitionTime":"2025-11-27T16:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.316809 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.316871 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.316887 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.316908 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.316920 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:50Z","lastTransitionTime":"2025-11-27T16:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.421064 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.421240 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.421260 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.421290 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.421312 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:50Z","lastTransitionTime":"2025-11-27T16:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.524508 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.524618 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.524639 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.524670 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.524691 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:50Z","lastTransitionTime":"2025-11-27T16:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.628518 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.628673 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.628776 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.628877 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.628954 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:50Z","lastTransitionTime":"2025-11-27T16:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.732738 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.732821 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.732841 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.732868 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.732890 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:50Z","lastTransitionTime":"2025-11-27T16:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.837425 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.837521 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.837542 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.837997 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.838380 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:50Z","lastTransitionTime":"2025-11-27T16:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.942171 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.942252 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.942274 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.942304 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:50 crc kubenswrapper[4954]: I1127 16:39:50.942329 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:50Z","lastTransitionTime":"2025-11-27T16:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.047560 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.047690 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.047716 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.047746 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.047766 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:51Z","lastTransitionTime":"2025-11-27T16:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.153014 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.153084 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.153101 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.153130 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.153152 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:51Z","lastTransitionTime":"2025-11-27T16:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.258388 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.258471 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.258494 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.258530 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.258557 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:51Z","lastTransitionTime":"2025-11-27T16:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.363672 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.363728 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.363742 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.363761 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.363775 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:51Z","lastTransitionTime":"2025-11-27T16:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.467819 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.467877 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.467895 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.467923 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.467943 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:51Z","lastTransitionTime":"2025-11-27T16:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.571069 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.571146 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.571163 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.571194 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.571213 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:51Z","lastTransitionTime":"2025-11-27T16:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.661217 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.661330 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh"
Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.661248 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 27 16:39:51 crc kubenswrapper[4954]: E1127 16:39:51.661428 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 27 16:39:51 crc kubenswrapper[4954]: E1127 16:39:51.661615 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6"
Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.661705 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 27 16:39:51 crc kubenswrapper[4954]: E1127 16:39:51.661710 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 27 16:39:51 crc kubenswrapper[4954]: E1127 16:39:51.661954 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
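Four pods stall here for the same reason: no sandbox exists yet, and none can be created while the runtime reports NetworkReady=false. The gate is the directory named in every message. A sketch of that check (path taken from the log; treating .conf, .conflist and .json as the extensions libcni recognizes is an assumption about the runtime's loader):

import pathlib

# Mirror the runtime's readiness gate: NetworkReady stays false until a
# CNI network config appears in this directory (path from the log above).
NET_D = pathlib.Path("/etc/kubernetes/cni/net.d")

configs = sorted(
    p.name
    for p in (NET_D.iterdir() if NET_D.is_dir() else [])
    if p.suffix in {".conf", ".conflist", ".json"}
)
if configs:
    print("CNI config present:", ", ".join(configs))
else:
    print(f"no CNI configuration file in {NET_D}/. Has your network provider started?")

Once the network provider writes its config there, the sandbox creations above stop being skipped.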
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.674446 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.674519 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.674536 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.674555 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.674570 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:51Z","lastTransitionTime":"2025-11-27T16:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.777922 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.777967 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.777984 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.778006 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.778024 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:51Z","lastTransitionTime":"2025-11-27T16:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.881899 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.881989 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.882015 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.882046 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.882065 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:51Z","lastTransitionTime":"2025-11-27T16:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.984650 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.984693 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.984704 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.984722 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:51 crc kubenswrapper[4954]: I1127 16:39:51.984737 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:51Z","lastTransitionTime":"2025-11-27T16:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:52 crc kubenswrapper[4954]: I1127 16:39:52.087489 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:52 crc kubenswrapper[4954]: I1127 16:39:52.087563 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:52 crc kubenswrapper[4954]: I1127 16:39:52.087615 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:52 crc kubenswrapper[4954]: I1127 16:39:52.087643 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:52 crc kubenswrapper[4954]: I1127 16:39:52.087661 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:52Z","lastTransitionTime":"2025-11-27T16:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:52 crc kubenswrapper[4954]: I1127 16:39:52.191460 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:52 crc kubenswrapper[4954]: I1127 16:39:52.191535 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:52 crc kubenswrapper[4954]: I1127 16:39:52.191551 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:52 crc kubenswrapper[4954]: I1127 16:39:52.191613 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:52 crc kubenswrapper[4954]: I1127 16:39:52.191632 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:52Z","lastTransitionTime":"2025-11-27T16:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:52 crc kubenswrapper[4954]: I1127 16:39:52.294508 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:52 crc kubenswrapper[4954]: I1127 16:39:52.294568 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:52 crc kubenswrapper[4954]: I1127 16:39:52.294612 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:52 crc kubenswrapper[4954]: I1127 16:39:52.294636 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:52 crc kubenswrapper[4954]: I1127 16:39:52.294648 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:52Z","lastTransitionTime":"2025-11-27T16:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:52 crc kubenswrapper[4954]: I1127 16:39:52.397300 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:52 crc kubenswrapper[4954]: I1127 16:39:52.397353 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:52 crc kubenswrapper[4954]: I1127 16:39:52.397367 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:52 crc kubenswrapper[4954]: I1127 16:39:52.397392 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:52 crc kubenswrapper[4954]: I1127 16:39:52.397410 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:52Z","lastTransitionTime":"2025-11-27T16:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:52 crc kubenswrapper[4954]: I1127 16:39:52.500549 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:52 crc kubenswrapper[4954]: I1127 16:39:52.500666 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:52 crc kubenswrapper[4954]: I1127 16:39:52.500725 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:52 crc kubenswrapper[4954]: I1127 16:39:52.500754 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:52 crc kubenswrapper[4954]: I1127 16:39:52.500771 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:52Z","lastTransitionTime":"2025-11-27T16:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:52 crc kubenswrapper[4954]: I1127 16:39:52.637316 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:52 crc kubenswrapper[4954]: I1127 16:39:52.637416 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:52 crc kubenswrapper[4954]: I1127 16:39:52.637437 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:52 crc kubenswrapper[4954]: I1127 16:39:52.637464 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:52 crc kubenswrapper[4954]: I1127 16:39:52.637480 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:52Z","lastTransitionTime":"2025-11-27T16:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:52 crc kubenswrapper[4954]: I1127 16:39:52.682109 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Nov 27 16:39:52 crc kubenswrapper[4954]: I1127 16:39:52.741324 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:52 crc kubenswrapper[4954]: I1127 16:39:52.741396 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:52 crc kubenswrapper[4954]: I1127 16:39:52.741418 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:52 crc kubenswrapper[4954]: I1127 16:39:52.741446 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:52 crc kubenswrapper[4954]: I1127 16:39:52.741466 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:52Z","lastTransitionTime":"2025-11-27T16:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:52 crc kubenswrapper[4954]: I1127 16:39:52.845469 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:52 crc kubenswrapper[4954]: I1127 16:39:52.845553 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:52 crc kubenswrapper[4954]: I1127 16:39:52.845574 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:52 crc kubenswrapper[4954]: I1127 16:39:52.845645 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:52 crc kubenswrapper[4954]: I1127 16:39:52.845670 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:52Z","lastTransitionTime":"2025-11-27T16:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:52 crc kubenswrapper[4954]: I1127 16:39:52.948940 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:52 crc kubenswrapper[4954]: I1127 16:39:52.948999 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:52 crc kubenswrapper[4954]: I1127 16:39:52.949012 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:52 crc kubenswrapper[4954]: I1127 16:39:52.949034 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:52 crc kubenswrapper[4954]: I1127 16:39:52.949048 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:52Z","lastTransitionTime":"2025-11-27T16:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.052430 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.052502 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.052522 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.052551 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.052623 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:53Z","lastTransitionTime":"2025-11-27T16:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.156851 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.156937 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.156951 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.156973 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.156986 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:53Z","lastTransitionTime":"2025-11-27T16:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.259983 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.260065 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.260082 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.260110 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.260128 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:53Z","lastTransitionTime":"2025-11-27T16:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.364603 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.364672 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.364685 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.364708 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.364726 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:53Z","lastTransitionTime":"2025-11-27T16:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.476118 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.476208 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.476221 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.476433 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.476445 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:53Z","lastTransitionTime":"2025-11-27T16:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.579337 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.579743 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.579964 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.580195 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.580368 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:53Z","lastTransitionTime":"2025-11-27T16:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.661653 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.661781 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.662164 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:39:53 crc kubenswrapper[4954]: E1127 16:39:53.662465 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.662623 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:39:53 crc kubenswrapper[4954]: E1127 16:39:53.662638 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6" Nov 27 16:39:53 crc kubenswrapper[4954]: E1127 16:39:53.662758 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:39:53 crc kubenswrapper[4954]: E1127 16:39:53.662873 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.683370 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.683651 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.683845 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.683998 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.684141 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:53Z","lastTransitionTime":"2025-11-27T16:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.787673 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.787752 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.787772 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.787798 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.787817 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:53Z","lastTransitionTime":"2025-11-27T16:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.891209 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.891571 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.891675 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.891794 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.891890 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:53Z","lastTransitionTime":"2025-11-27T16:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.994644 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.994715 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.994740 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.994771 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:53 crc kubenswrapper[4954]: I1127 16:39:53.994800 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:53Z","lastTransitionTime":"2025-11-27T16:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:54 crc kubenswrapper[4954]: I1127 16:39:54.098940 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:54 crc kubenswrapper[4954]: I1127 16:39:54.099016 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:54 crc kubenswrapper[4954]: I1127 16:39:54.099043 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:54 crc kubenswrapper[4954]: I1127 16:39:54.099077 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:54 crc kubenswrapper[4954]: I1127 16:39:54.099103 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:54Z","lastTransitionTime":"2025-11-27T16:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:54 crc kubenswrapper[4954]: I1127 16:39:54.202905 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:54 crc kubenswrapper[4954]: I1127 16:39:54.202983 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:54 crc kubenswrapper[4954]: I1127 16:39:54.203003 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:54 crc kubenswrapper[4954]: I1127 16:39:54.203033 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:54 crc kubenswrapper[4954]: I1127 16:39:54.203053 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:54Z","lastTransitionTime":"2025-11-27T16:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:54 crc kubenswrapper[4954]: I1127 16:39:54.306120 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:54 crc kubenswrapper[4954]: I1127 16:39:54.306189 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:54 crc kubenswrapper[4954]: I1127 16:39:54.306201 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:54 crc kubenswrapper[4954]: I1127 16:39:54.306221 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:54 crc kubenswrapper[4954]: I1127 16:39:54.306233 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:54Z","lastTransitionTime":"2025-11-27T16:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:54 crc kubenswrapper[4954]: I1127 16:39:54.409271 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:54 crc kubenswrapper[4954]: I1127 16:39:54.409327 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:54 crc kubenswrapper[4954]: I1127 16:39:54.409340 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:54 crc kubenswrapper[4954]: I1127 16:39:54.409359 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:54 crc kubenswrapper[4954]: I1127 16:39:54.409371 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:54Z","lastTransitionTime":"2025-11-27T16:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:54 crc kubenswrapper[4954]: I1127 16:39:54.512856 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:54 crc kubenswrapper[4954]: I1127 16:39:54.512897 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:54 crc kubenswrapper[4954]: I1127 16:39:54.512907 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:54 crc kubenswrapper[4954]: I1127 16:39:54.512922 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:54 crc kubenswrapper[4954]: I1127 16:39:54.512934 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:54Z","lastTransitionTime":"2025-11-27T16:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:54 crc kubenswrapper[4954]: I1127 16:39:54.616272 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:54 crc kubenswrapper[4954]: I1127 16:39:54.616338 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:54 crc kubenswrapper[4954]: I1127 16:39:54.616358 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:54 crc kubenswrapper[4954]: I1127 16:39:54.616385 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:54 crc kubenswrapper[4954]: I1127 16:39:54.616403 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:54Z","lastTransitionTime":"2025-11-27T16:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:54 crc kubenswrapper[4954]: I1127 16:39:54.720173 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:54 crc kubenswrapper[4954]: I1127 16:39:54.720235 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:54 crc kubenswrapper[4954]: I1127 16:39:54.720256 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:54 crc kubenswrapper[4954]: I1127 16:39:54.720281 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:54 crc kubenswrapper[4954]: I1127 16:39:54.720299 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:54Z","lastTransitionTime":"2025-11-27T16:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:54 crc kubenswrapper[4954]: I1127 16:39:54.823959 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:54 crc kubenswrapper[4954]: I1127 16:39:54.824024 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:54 crc kubenswrapper[4954]: I1127 16:39:54.824041 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:54 crc kubenswrapper[4954]: I1127 16:39:54.824067 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:54 crc kubenswrapper[4954]: I1127 16:39:54.824085 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:54Z","lastTransitionTime":"2025-11-27T16:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:54 crc kubenswrapper[4954]: I1127 16:39:54.927200 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:54 crc kubenswrapper[4954]: I1127 16:39:54.927262 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:54 crc kubenswrapper[4954]: I1127 16:39:54.927280 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:54 crc kubenswrapper[4954]: I1127 16:39:54.927311 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:54 crc kubenswrapper[4954]: I1127 16:39:54.927406 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:54Z","lastTransitionTime":"2025-11-27T16:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.030536 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.030628 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.030649 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.030674 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.030692 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:55Z","lastTransitionTime":"2025-11-27T16:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.133947 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.134045 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.134121 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.134157 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.134183 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:55Z","lastTransitionTime":"2025-11-27T16:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.237528 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.237628 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.237649 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.237670 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.237683 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:55Z","lastTransitionTime":"2025-11-27T16:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.341958 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.342012 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.342020 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.342037 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.342048 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:55Z","lastTransitionTime":"2025-11-27T16:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.445404 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.445478 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.445492 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.445518 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.445534 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:55Z","lastTransitionTime":"2025-11-27T16:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.549059 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.549125 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.549144 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.549173 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.549192 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:55Z","lastTransitionTime":"2025-11-27T16:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.652456 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.652518 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.652536 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.652572 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.652625 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:55Z","lastTransitionTime":"2025-11-27T16:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.662076 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.662108 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.662302 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.662435 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:39:55 crc kubenswrapper[4954]: E1127 16:39:55.662509 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:39:55 crc kubenswrapper[4954]: E1127 16:39:55.662665 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:39:55 crc kubenswrapper[4954]: E1127 16:39:55.662889 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6" Nov 27 16:39:55 crc kubenswrapper[4954]: E1127 16:39:55.663559 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.664028 4954 scope.go:117] "RemoveContainer" containerID="81bb34f2dce67efd76368e55b902d1cded4cf016e3f638b9c5acaf3f00ca2b60" Nov 27 16:39:55 crc kubenswrapper[4954]: E1127 16:39:55.664300 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-d5zbp_openshift-ovn-kubernetes(c9c365fc-0cba-4fcf-b721-30de2b908a56)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.756794 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.756862 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.756880 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.756911 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.756930 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:55Z","lastTransitionTime":"2025-11-27T16:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.860926 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.860978 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.860989 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.861011 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.861028 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:55Z","lastTransitionTime":"2025-11-27T16:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.965098 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.965177 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.965198 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.965229 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:55 crc kubenswrapper[4954]: I1127 16:39:55.965255 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:55Z","lastTransitionTime":"2025-11-27T16:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:56 crc kubenswrapper[4954]: I1127 16:39:56.070232 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:56 crc kubenswrapper[4954]: I1127 16:39:56.070299 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:56 crc kubenswrapper[4954]: I1127 16:39:56.070317 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:56 crc kubenswrapper[4954]: I1127 16:39:56.070345 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:56 crc kubenswrapper[4954]: I1127 16:39:56.070367 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:56Z","lastTransitionTime":"2025-11-27T16:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:56 crc kubenswrapper[4954]: I1127 16:39:56.173918 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:56 crc kubenswrapper[4954]: I1127 16:39:56.173999 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:56 crc kubenswrapper[4954]: I1127 16:39:56.174017 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:56 crc kubenswrapper[4954]: I1127 16:39:56.174045 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:56 crc kubenswrapper[4954]: I1127 16:39:56.174064 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:56Z","lastTransitionTime":"2025-11-27T16:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:56 crc kubenswrapper[4954]: I1127 16:39:56.277341 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:56 crc kubenswrapper[4954]: I1127 16:39:56.277411 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:56 crc kubenswrapper[4954]: I1127 16:39:56.277472 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:56 crc kubenswrapper[4954]: I1127 16:39:56.277501 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:56 crc kubenswrapper[4954]: I1127 16:39:56.277521 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:56Z","lastTransitionTime":"2025-11-27T16:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:56 crc kubenswrapper[4954]: I1127 16:39:56.389341 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:56 crc kubenswrapper[4954]: I1127 16:39:56.389431 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:56 crc kubenswrapper[4954]: I1127 16:39:56.389459 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:56 crc kubenswrapper[4954]: I1127 16:39:56.389501 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:56 crc kubenswrapper[4954]: I1127 16:39:56.389567 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:56Z","lastTransitionTime":"2025-11-27T16:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:56 crc kubenswrapper[4954]: I1127 16:39:56.492968 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:56 crc kubenswrapper[4954]: I1127 16:39:56.493035 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:56 crc kubenswrapper[4954]: I1127 16:39:56.493051 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:56 crc kubenswrapper[4954]: I1127 16:39:56.493075 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:56 crc kubenswrapper[4954]: I1127 16:39:56.493093 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:56Z","lastTransitionTime":"2025-11-27T16:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:56 crc kubenswrapper[4954]: I1127 16:39:56.596197 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:56 crc kubenswrapper[4954]: I1127 16:39:56.596293 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:56 crc kubenswrapper[4954]: I1127 16:39:56.596319 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:56 crc kubenswrapper[4954]: I1127 16:39:56.596354 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:56 crc kubenswrapper[4954]: I1127 16:39:56.596380 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:56Z","lastTransitionTime":"2025-11-27T16:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:56 crc kubenswrapper[4954]: I1127 16:39:56.699092 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:56 crc kubenswrapper[4954]: I1127 16:39:56.699162 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:56 crc kubenswrapper[4954]: I1127 16:39:56.699181 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:56 crc kubenswrapper[4954]: I1127 16:39:56.699207 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:56 crc kubenswrapper[4954]: I1127 16:39:56.699224 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:56Z","lastTransitionTime":"2025-11-27T16:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:56 crc kubenswrapper[4954]: I1127 16:39:56.802211 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:56 crc kubenswrapper[4954]: I1127 16:39:56.802283 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:56 crc kubenswrapper[4954]: I1127 16:39:56.802310 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:56 crc kubenswrapper[4954]: I1127 16:39:56.802342 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:56 crc kubenswrapper[4954]: I1127 16:39:56.802367 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:56Z","lastTransitionTime":"2025-11-27T16:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:56 crc kubenswrapper[4954]: I1127 16:39:56.906057 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:56 crc kubenswrapper[4954]: I1127 16:39:56.906123 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:56 crc kubenswrapper[4954]: I1127 16:39:56.906144 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:56 crc kubenswrapper[4954]: I1127 16:39:56.906178 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:56 crc kubenswrapper[4954]: I1127 16:39:56.906202 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:56Z","lastTransitionTime":"2025-11-27T16:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.009025 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.009090 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.009102 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.009125 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.009139 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:57Z","lastTransitionTime":"2025-11-27T16:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.113501 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.113620 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.113640 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.113672 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.113692 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:57Z","lastTransitionTime":"2025-11-27T16:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.170124 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.170204 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.170223 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.170255 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.170276 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:57Z","lastTransitionTime":"2025-11-27T16:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:57 crc kubenswrapper[4954]: E1127 16:39:57.194575 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"070a8e98-7cab-4ad3-b09c-67172438041d\\\",\\\"systemUUID\\\":\\\"03003ca2-7417-4e94-98d9-1cf03e475029\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:57Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.201680 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.201896 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.202044 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.202186 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.202313 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:57Z","lastTransitionTime":"2025-11-27T16:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:57 crc kubenswrapper[4954]: E1127 16:39:57.219071 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"070a8e98-7cab-4ad3-b09c-67172438041d\\\",\\\"systemUUID\\\":\\\"03003ca2-7417-4e94-98d9-1cf03e475029\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:57Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.224458 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.224809 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
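Note: the "Error updating node status, will retry" entries expose the underlying fault. The status patch is not rejected by the API server's own validation; it dies in the node.network-node-identity.openshift.io admission webhook at https://127.0.0.1:9743/node, whose serving certificate expired on 2025-08-24T17:21:41Z while the node clock reads 2025-11-27. Until that certificate is rotated, every heartbeat patch fails identically, which is why the attempts at 16:39:57.194575 and .219071 above, and the further retries below, carry the same payload. A hedged Go sketch for inspecting the certificate the webhook actually presents; the endpoint is taken from the log, and InsecureSkipVerify is used only so the handshake completes far enough to read the expired certificate:

    package main

    import (
        "crypto/tls"
        "fmt"
        "time"
    )

    func main() {
        // Dial the webhook endpoint named in the log; skip verification so we
        // can inspect the certificate that verification would otherwise reject.
        conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
        if err != nil {
            fmt.Println("dial failed:", err)
            return
        }
        defer conn.Close()

        cert := conn.ConnectionState().PeerCertificates[0]
        fmt.Println("subject:  ", cert.Subject)
        fmt.Println("notBefore:", cert.NotBefore.Format(time.RFC3339))
        fmt.Println("notAfter: ", cert.NotAfter.Format(time.RFC3339))
        if time.Now().After(cert.NotAfter) {
            // The log's exact failure mode: current time 2025-11-27T16:39:57Z
            // is after the certificate's notAfter of 2025-08-24T17:21:41Z.
            fmt.Println("certificate has expired; node status patches will keep failing")
        }
    }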
event="NodeHasNoDiskPressure" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.224967 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.225075 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.225187 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:57Z","lastTransitionTime":"2025-11-27T16:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:57 crc kubenswrapper[4954]: E1127 16:39:57.246461 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"070a8e98-7cab-4ad3-b09c-67172438041d\\\",\\\"systemUUID\\\":\\\"03003ca2-7417-4e94-98d9-1cf03e475029\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:57Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.252311 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.252417 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.252434 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.252458 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.252473 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:57Z","lastTransitionTime":"2025-11-27T16:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:57 crc kubenswrapper[4954]: E1127 16:39:57.269988 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"070a8e98-7cab-4ad3-b09c-67172438041d\\\",\\\"systemUUID\\\":\\\"03003ca2-7417-4e94-98d9-1cf03e475029\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:57Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.275141 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.275209 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.275218 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.275237 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.275248 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:57Z","lastTransitionTime":"2025-11-27T16:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:57 crc kubenswrapper[4954]: E1127 16:39:57.296384 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:39:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:39:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"070a8e98-7cab-4ad3-b09c-67172438041d\\\",\\\"systemUUID\\\":\\\"03003ca2-7417-4e94-98d9-1cf03e475029\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:39:57Z is after 2025-08-24T17:21:41Z" Nov 27 16:39:57 crc kubenswrapper[4954]: E1127 16:39:57.296560 4954 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.299147 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.299241 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.299257 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.299283 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.299302 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:57Z","lastTransitionTime":"2025-11-27T16:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.402496 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.402562 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.402617 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.402655 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.402681 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:57Z","lastTransitionTime":"2025-11-27T16:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.505767 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.506099 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.506188 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.506270 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.506340 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:57Z","lastTransitionTime":"2025-11-27T16:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.609425 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.610286 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.610386 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.610480 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.610592 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:57Z","lastTransitionTime":"2025-11-27T16:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.661786 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.661910 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:39:57 crc kubenswrapper[4954]: E1127 16:39:57.661937 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:39:57 crc kubenswrapper[4954]: E1127 16:39:57.662104 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.662152 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:39:57 crc kubenswrapper[4954]: E1127 16:39:57.662218 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.662519 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:39:57 crc kubenswrapper[4954]: E1127 16:39:57.662572 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.714084 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.714504 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.714707 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.714883 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.715035 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:57Z","lastTransitionTime":"2025-11-27T16:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.818625 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.818710 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.818733 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.818766 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.818789 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:57Z","lastTransitionTime":"2025-11-27T16:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.921911 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.921975 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.921993 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.922016 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:57 crc kubenswrapper[4954]: I1127 16:39:57.922035 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:57Z","lastTransitionTime":"2025-11-27T16:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.025802 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.026266 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.026442 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.026647 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.026799 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:58Z","lastTransitionTime":"2025-11-27T16:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.130669 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.130728 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.130745 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.130772 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.130791 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:58Z","lastTransitionTime":"2025-11-27T16:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.233514 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.233644 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.233671 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.233703 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.233728 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:58Z","lastTransitionTime":"2025-11-27T16:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.337541 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.337611 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.337625 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.337644 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.337657 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:58Z","lastTransitionTime":"2025-11-27T16:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.441700 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.441783 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.441809 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.441842 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.441870 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:58Z","lastTransitionTime":"2025-11-27T16:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.544914 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.544958 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.544973 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.544992 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.545003 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:58Z","lastTransitionTime":"2025-11-27T16:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.648289 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.648380 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.648417 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.648453 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.648477 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:58Z","lastTransitionTime":"2025-11-27T16:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.699626 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=85.699602973 podStartE2EDuration="1m25.699602973s" podCreationTimestamp="2025-11-27 16:38:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:39:58.699438679 +0000 UTC m=+110.716878999" watchObservedRunningTime="2025-11-27 16:39:58.699602973 +0000 UTC m=+110.717043263" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.754144 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.754240 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.754274 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.754297 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.754311 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:58Z","lastTransitionTime":"2025-11-27T16:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.803966 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-cz8gx" podStartSLOduration=90.803944806 podStartE2EDuration="1m30.803944806s" podCreationTimestamp="2025-11-27 16:38:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:39:58.788112153 +0000 UTC m=+110.805552463" watchObservedRunningTime="2025-11-27 16:39:58.803944806 +0000 UTC m=+110.821385106" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.821134 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podStartSLOduration=90.821113721 podStartE2EDuration="1m30.821113721s" podCreationTimestamp="2025-11-27 16:38:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:39:58.820590888 +0000 UTC m=+110.838031188" watchObservedRunningTime="2025-11-27 16:39:58.821113721 +0000 UTC m=+110.838554021" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.857012 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.857048 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.857059 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.857098 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.857111 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:58Z","lastTransitionTime":"2025-11-27T16:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.872408 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=33.87238416 podStartE2EDuration="33.87238416s" podCreationTimestamp="2025-11-27 16:39:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:39:58.871204651 +0000 UTC m=+110.888644971" watchObservedRunningTime="2025-11-27 16:39:58.87238416 +0000 UTC m=+110.889824470" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.911816 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-lt9bl" podStartSLOduration=90.911789845 podStartE2EDuration="1m30.911789845s" podCreationTimestamp="2025-11-27 16:38:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:39:58.897120922 +0000 UTC m=+110.914561252" watchObservedRunningTime="2025-11-27 16:39:58.911789845 +0000 UTC m=+110.929230155" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.938899 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=61.938853045 podStartE2EDuration="1m1.938853045s" podCreationTimestamp="2025-11-27 16:38:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:39:58.91196221 +0000 UTC m=+110.929402520" watchObservedRunningTime="2025-11-27 16:39:58.938853045 +0000 UTC m=+110.956293355" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.954357 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=6.954330348 podStartE2EDuration="6.954330348s" podCreationTimestamp="2025-11-27 16:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:39:58.93948798 +0000 UTC m=+110.956928280" watchObservedRunningTime="2025-11-27 16:39:58.954330348 +0000 UTC m=+110.971770648" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.959193 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.959240 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.959252 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.959284 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.959297 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:58Z","lastTransitionTime":"2025-11-27T16:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.971934 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=91.971910484 podStartE2EDuration="1m31.971910484s" podCreationTimestamp="2025-11-27 16:38:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:39:58.954961164 +0000 UTC m=+110.972401464" watchObservedRunningTime="2025-11-27 16:39:58.971910484 +0000 UTC m=+110.989350784" Nov 27 16:39:58 crc kubenswrapper[4954]: I1127 16:39:58.986036 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-27v67" podStartSLOduration=90.986009383 podStartE2EDuration="1m30.986009383s" podCreationTimestamp="2025-11-27 16:38:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:39:58.985213752 +0000 UTC m=+111.002654062" watchObservedRunningTime="2025-11-27 16:39:58.986009383 +0000 UTC m=+111.003449683" Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.003901 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-9mb96" podStartSLOduration=91.003879914 podStartE2EDuration="1m31.003879914s" podCreationTimestamp="2025-11-27 16:38:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:39:59.003774882 +0000 UTC m=+111.021215192" watchObservedRunningTime="2025-11-27 16:39:59.003879914 +0000 UTC m=+111.021320234" Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.061493 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.061546 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.061555 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.061574 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.061815 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:59Z","lastTransitionTime":"2025-11-27T16:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.169477 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.169543 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.169556 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.169576 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.169631 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:59Z","lastTransitionTime":"2025-11-27T16:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.273277 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.273351 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.273366 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.273391 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.273408 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:59Z","lastTransitionTime":"2025-11-27T16:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.380868 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.380909 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.380920 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.380937 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.380949 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:59Z","lastTransitionTime":"2025-11-27T16:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.484614 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.484667 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.484680 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.484702 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.484715 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:59Z","lastTransitionTime":"2025-11-27T16:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.588312 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.588360 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.588372 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.588388 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.588412 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:59Z","lastTransitionTime":"2025-11-27T16:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.661157 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.661239 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:39:59 crc kubenswrapper[4954]: E1127 16:39:59.661332 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:39:59 crc kubenswrapper[4954]: E1127 16:39:59.661458 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.661576 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.661692 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:39:59 crc kubenswrapper[4954]: E1127 16:39:59.661758 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:39:59 crc kubenswrapper[4954]: E1127 16:39:59.661882 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6" Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.691112 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.691164 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.691178 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.691202 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.691220 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:59Z","lastTransitionTime":"2025-11-27T16:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.794503 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.794631 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.794645 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.794667 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.794680 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:59Z","lastTransitionTime":"2025-11-27T16:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.897753 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.897800 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.897812 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.897829 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:39:59 crc kubenswrapper[4954]: I1127 16:39:59.897841 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:39:59Z","lastTransitionTime":"2025-11-27T16:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.001609 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.001673 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.001691 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.001721 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.001741 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:00Z","lastTransitionTime":"2025-11-27T16:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.104564 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.104641 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.104653 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.104706 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.104718 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:00Z","lastTransitionTime":"2025-11-27T16:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.207983 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.208039 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.208050 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.208068 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.208079 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:00Z","lastTransitionTime":"2025-11-27T16:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.311424 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.311469 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.311480 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.311499 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.311517 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:00Z","lastTransitionTime":"2025-11-27T16:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.413633 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.413702 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.413721 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.413743 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.413757 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:00Z","lastTransitionTime":"2025-11-27T16:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.516041 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.516099 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.516112 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.516136 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.516158 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:00Z","lastTransitionTime":"2025-11-27T16:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.619523 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.619594 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.619612 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.619635 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.619650 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:00Z","lastTransitionTime":"2025-11-27T16:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.722876 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.722955 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.722979 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.723009 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.723028 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:00Z","lastTransitionTime":"2025-11-27T16:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.827415 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.827473 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.827489 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.827513 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.827531 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:00Z","lastTransitionTime":"2025-11-27T16:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.930804 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.930896 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.930933 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.930966 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:00 crc kubenswrapper[4954]: I1127 16:40:00.930988 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:00Z","lastTransitionTime":"2025-11-27T16:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.034969 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.035054 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.035073 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.035100 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.035123 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:01Z","lastTransitionTime":"2025-11-27T16:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.139071 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.139128 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.139140 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.139159 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.139175 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:01Z","lastTransitionTime":"2025-11-27T16:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.242057 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.242170 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.242197 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.242229 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.242253 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:01Z","lastTransitionTime":"2025-11-27T16:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.345296 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.345360 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.345370 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.345385 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.345394 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:01Z","lastTransitionTime":"2025-11-27T16:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.447729 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.447812 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.447837 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.447872 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.447899 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:01Z","lastTransitionTime":"2025-11-27T16:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.552685 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.552798 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.552832 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.552868 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.552909 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:01Z","lastTransitionTime":"2025-11-27T16:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.656361 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.656417 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.656427 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.656446 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.656459 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:01Z","lastTransitionTime":"2025-11-27T16:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.661834 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.661977 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:40:01 crc kubenswrapper[4954]: E1127 16:40:01.662041 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.662046 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.661878 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:40:01 crc kubenswrapper[4954]: E1127 16:40:01.662241 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6" Nov 27 16:40:01 crc kubenswrapper[4954]: E1127 16:40:01.662363 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:40:01 crc kubenswrapper[4954]: E1127 16:40:01.662490 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.760277 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.760341 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.760359 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.760385 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.760404 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:01Z","lastTransitionTime":"2025-11-27T16:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.864509 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.864617 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.864632 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.864651 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.864663 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:01Z","lastTransitionTime":"2025-11-27T16:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.968260 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.968330 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.968347 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.968374 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:01 crc kubenswrapper[4954]: I1127 16:40:01.968393 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:01Z","lastTransitionTime":"2025-11-27T16:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:40:02 crc kubenswrapper[4954]: I1127 16:40:02.072213 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:02 crc kubenswrapper[4954]: I1127 16:40:02.072272 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:02 crc kubenswrapper[4954]: I1127 16:40:02.072286 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:02 crc kubenswrapper[4954]: I1127 16:40:02.072311 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:02 crc kubenswrapper[4954]: I1127 16:40:02.072329 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:02Z","lastTransitionTime":"2025-11-27T16:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:40:02 crc kubenswrapper[4954]: I1127 16:40:02.175428 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:02 crc kubenswrapper[4954]: I1127 16:40:02.175495 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:02 crc kubenswrapper[4954]: I1127 16:40:02.175512 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:02 crc kubenswrapper[4954]: I1127 16:40:02.175539 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:02 crc kubenswrapper[4954]: I1127 16:40:02.175557 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:02Z","lastTransitionTime":"2025-11-27T16:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:40:02 crc kubenswrapper[4954]: I1127 16:40:02.278493 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:02 crc kubenswrapper[4954]: I1127 16:40:02.278540 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:02 crc kubenswrapper[4954]: I1127 16:40:02.278599 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:02 crc kubenswrapper[4954]: I1127 16:40:02.278619 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:02 crc kubenswrapper[4954]: I1127 16:40:02.278631 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:02Z","lastTransitionTime":"2025-11-27T16:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:40:02 crc kubenswrapper[4954]: I1127 16:40:02.381539 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:02 crc kubenswrapper[4954]: I1127 16:40:02.381601 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:02 crc kubenswrapper[4954]: I1127 16:40:02.381615 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:02 crc kubenswrapper[4954]: I1127 16:40:02.381636 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:02 crc kubenswrapper[4954]: I1127 16:40:02.381652 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:02Z","lastTransitionTime":"2025-11-27T16:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:40:02 crc kubenswrapper[4954]: I1127 16:40:02.484656 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:02 crc kubenswrapper[4954]: I1127 16:40:02.484701 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:02 crc kubenswrapper[4954]: I1127 16:40:02.484715 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:02 crc kubenswrapper[4954]: I1127 16:40:02.484736 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:02 crc kubenswrapper[4954]: I1127 16:40:02.484752 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:02Z","lastTransitionTime":"2025-11-27T16:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:40:02 crc kubenswrapper[4954]: I1127 16:40:02.588020 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:02 crc kubenswrapper[4954]: I1127 16:40:02.588074 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:02 crc kubenswrapper[4954]: I1127 16:40:02.588090 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:02 crc kubenswrapper[4954]: I1127 16:40:02.588110 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:02 crc kubenswrapper[4954]: I1127 16:40:02.588124 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:02Z","lastTransitionTime":"2025-11-27T16:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:40:02 crc kubenswrapper[4954]: I1127 16:40:02.690440 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:02 crc kubenswrapper[4954]: I1127 16:40:02.690489 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:02 crc kubenswrapper[4954]: I1127 16:40:02.690504 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:02 crc kubenswrapper[4954]: I1127 16:40:02.690518 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:02 crc kubenswrapper[4954]: I1127 16:40:02.690528 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:02Z","lastTransitionTime":"2025-11-27T16:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:40:02 crc kubenswrapper[4954]: I1127 16:40:02.794420 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:02 crc kubenswrapper[4954]: I1127 16:40:02.794486 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:02 crc kubenswrapper[4954]: I1127 16:40:02.794530 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:02 crc kubenswrapper[4954]: I1127 16:40:02.794571 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:02 crc kubenswrapper[4954]: I1127 16:40:02.794658 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:02Z","lastTransitionTime":"2025-11-27T16:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:40:02 crc kubenswrapper[4954]: I1127 16:40:02.898204 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:02 crc kubenswrapper[4954]: I1127 16:40:02.898295 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:02 crc kubenswrapper[4954]: I1127 16:40:02.898320 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:02 crc kubenswrapper[4954]: I1127 16:40:02.898356 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:02 crc kubenswrapper[4954]: I1127 16:40:02.898381 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:02Z","lastTransitionTime":"2025-11-27T16:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.001894 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.001957 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.001971 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.001997 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.002011 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:03Z","lastTransitionTime":"2025-11-27T16:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.105050 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.105131 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.105148 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.105167 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.105180 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:03Z","lastTransitionTime":"2025-11-27T16:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.208520 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.208636 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.208657 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.208684 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.208703 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:03Z","lastTransitionTime":"2025-11-27T16:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.312281 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.312340 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.312355 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.312376 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.312390 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:03Z","lastTransitionTime":"2025-11-27T16:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.415999 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.416385 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.416922 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.417069 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.417191 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:03Z","lastTransitionTime":"2025-11-27T16:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.425159 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9mb96_c5bda3ef-ba2c-424a-ba4a-432053d1c40d/kube-multus/1.log" Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.426038 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9mb96_c5bda3ef-ba2c-424a-ba4a-432053d1c40d/kube-multus/0.log" Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.426132 4954 generic.go:334] "Generic (PLEG): container finished" podID="c5bda3ef-ba2c-424a-ba4a-432053d1c40d" containerID="bcc3a6be3f2d6a2d8da09fab1320b33b7c36e0c403916e155274997bcb03c884" exitCode=1 Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.426191 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9mb96" event={"ID":"c5bda3ef-ba2c-424a-ba4a-432053d1c40d","Type":"ContainerDied","Data":"bcc3a6be3f2d6a2d8da09fab1320b33b7c36e0c403916e155274997bcb03c884"} Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.426250 4954 scope.go:117] "RemoveContainer" containerID="3d5aabb55ded9f58e618e465b5ef892a9098df73cc03b0d2de615dbcb754cd4d" Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.427251 4954 scope.go:117] "RemoveContainer" containerID="bcc3a6be3f2d6a2d8da09fab1320b33b7c36e0c403916e155274997bcb03c884" Nov 27 16:40:03 crc kubenswrapper[4954]: E1127 16:40:03.427738 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-9mb96_openshift-multus(c5bda3ef-ba2c-424a-ba4a-432053d1c40d)\"" pod="openshift-multus/multus-9mb96" podUID="c5bda3ef-ba2c-424a-ba4a-432053d1c40d" Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.453323 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j2bxm" podStartSLOduration=94.453237355 podStartE2EDuration="1m34.453237355s" podCreationTimestamp="2025-11-27 16:38:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:39:59.017953083 +0000 UTC m=+111.035393383" watchObservedRunningTime="2025-11-27 16:40:03.453237355 +0000 UTC m=+115.470677725" Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.522704 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.522808 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.522847 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.522887 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.522926 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:03Z","lastTransitionTime":"2025-11-27T16:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.626433 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.626623 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.626651 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.626682 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.626711 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:03Z","lastTransitionTime":"2025-11-27T16:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.662048 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.662073 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.662048 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:40:03 crc kubenswrapper[4954]: E1127 16:40:03.662205 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:40:03 crc kubenswrapper[4954]: E1127 16:40:03.662318 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6" Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.662429 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:40:03 crc kubenswrapper[4954]: E1127 16:40:03.662427 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:40:03 crc kubenswrapper[4954]: E1127 16:40:03.662763 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.729966 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.730435 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.730719 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.730973 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.731215 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:03Z","lastTransitionTime":"2025-11-27T16:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.834882 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.834934 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.834945 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.834964 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:03 crc kubenswrapper[4954]: I1127 16:40:03.834976 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:03Z","lastTransitionTime":"2025-11-27T16:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
[... node-status block repeated at 16:40:03.937, 16:40:04.041, 16:40:04.145, 16:40:04.250 and 16:40:04.355 ...]
Nov 27 16:40:04 crc kubenswrapper[4954]: I1127 16:40:04.431677 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9mb96_c5bda3ef-ba2c-424a-ba4a-432053d1c40d/kube-multus/1.log"
[... node-status block repeated roughly every 100 ms from 16:40:04.459 through 16:40:05.596 ...]
Nov 27 16:40:05 crc kubenswrapper[4954]: I1127 16:40:05.661760 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh"
Nov 27 16:40:05 crc kubenswrapper[4954]: I1127 16:40:05.661842 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 27 16:40:05 crc kubenswrapper[4954]: I1127 16:40:05.661844 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 27 16:40:05 crc kubenswrapper[4954]: I1127 16:40:05.661881 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 27 16:40:05 crc kubenswrapper[4954]: E1127 16:40:05.661961 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6"
Nov 27 16:40:05 crc kubenswrapper[4954]: E1127 16:40:05.662181 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 27 16:40:05 crc kubenswrapper[4954]: E1127 16:40:05.662283 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
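All four pods fail for the same reason the node keeps reporting: no CNI configuration file in /etc/kubernetes/cni/net.d/. A small diagnostic sketch one might run on the node to see what that directory actually holds, assuming the .conf/.conflist/.json extensions that CNI config loaders conventionally accept:

```python
import os

CNI_DIR = "/etc/kubernetes/cni/net.d"  # the path from the kubelet message above
CNI_EXTS = (".conf", ".conflist", ".json")  # extensions conventionally accepted by CNI config loaders

try:
    entries = sorted(os.listdir(CNI_DIR))
except FileNotFoundError:
    entries = None

if entries is None:
    print(f"{CNI_DIR} does not exist")
elif not any(e.endswith(CNI_EXTS) for e in entries):
    print(f"{CNI_DIR} exists but holds no CNI config: {entries}")
else:
    print("CNI configs:", [e for e in entries if e.endswith(CNI_EXTS)])
```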
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:40:05 crc kubenswrapper[4954]: E1127 16:40:05.662410 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:40:05 crc kubenswrapper[4954]: I1127 16:40:05.700426 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:05 crc kubenswrapper[4954]: I1127 16:40:05.700480 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:05 crc kubenswrapper[4954]: I1127 16:40:05.700490 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:05 crc kubenswrapper[4954]: I1127 16:40:05.700511 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:05 crc kubenswrapper[4954]: I1127 16:40:05.700523 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:05Z","lastTransitionTime":"2025-11-27T16:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:40:05 crc kubenswrapper[4954]: I1127 16:40:05.804338 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:05 crc kubenswrapper[4954]: I1127 16:40:05.804444 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:05 crc kubenswrapper[4954]: I1127 16:40:05.804468 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:05 crc kubenswrapper[4954]: I1127 16:40:05.804498 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:05 crc kubenswrapper[4954]: I1127 16:40:05.804523 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:05Z","lastTransitionTime":"2025-11-27T16:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:40:05 crc kubenswrapper[4954]: I1127 16:40:05.907152 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:05 crc kubenswrapper[4954]: I1127 16:40:05.907269 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:05 crc kubenswrapper[4954]: I1127 16:40:05.907293 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:05 crc kubenswrapper[4954]: I1127 16:40:05.907330 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:05 crc kubenswrapper[4954]: I1127 16:40:05.907356 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:05Z","lastTransitionTime":"2025-11-27T16:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.011556 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.011683 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.012041 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.012071 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.012088 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:06Z","lastTransitionTime":"2025-11-27T16:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.116065 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.116124 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.116137 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.116160 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.116183 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:06Z","lastTransitionTime":"2025-11-27T16:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.220110 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.220198 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.220221 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.220253 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.220277 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:06Z","lastTransitionTime":"2025-11-27T16:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.324725 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.324804 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.324841 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.324889 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.324927 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:06Z","lastTransitionTime":"2025-11-27T16:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.428229 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.428277 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.428340 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.428367 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.428385 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:06Z","lastTransitionTime":"2025-11-27T16:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.531093 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.531132 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.531143 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.531159 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.531171 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:06Z","lastTransitionTime":"2025-11-27T16:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.634701 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.634766 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.634787 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.634819 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.634836 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:06Z","lastTransitionTime":"2025-11-27T16:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.737850 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.737916 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.737929 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.737952 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.737965 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:06Z","lastTransitionTime":"2025-11-27T16:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.840957 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.841019 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.841031 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.841052 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.841070 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:06Z","lastTransitionTime":"2025-11-27T16:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.944247 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.944330 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.944340 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.944357 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:06 crc kubenswrapper[4954]: I1127 16:40:06.944374 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:06Z","lastTransitionTime":"2025-11-27T16:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:40:07 crc kubenswrapper[4954]: I1127 16:40:07.047070 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:07 crc kubenswrapper[4954]: I1127 16:40:07.047158 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:07 crc kubenswrapper[4954]: I1127 16:40:07.047176 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:07 crc kubenswrapper[4954]: I1127 16:40:07.047205 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:07 crc kubenswrapper[4954]: I1127 16:40:07.047223 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:07Z","lastTransitionTime":"2025-11-27T16:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:40:07 crc kubenswrapper[4954]: I1127 16:40:07.156737 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:07 crc kubenswrapper[4954]: I1127 16:40:07.156792 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:07 crc kubenswrapper[4954]: I1127 16:40:07.156804 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:07 crc kubenswrapper[4954]: I1127 16:40:07.156822 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:07 crc kubenswrapper[4954]: I1127 16:40:07.156834 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:07Z","lastTransitionTime":"2025-11-27T16:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:40:07 crc kubenswrapper[4954]: I1127 16:40:07.260527 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:07 crc kubenswrapper[4954]: I1127 16:40:07.260600 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:07 crc kubenswrapper[4954]: I1127 16:40:07.260611 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:07 crc kubenswrapper[4954]: I1127 16:40:07.260636 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:07 crc kubenswrapper[4954]: I1127 16:40:07.260647 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:07Z","lastTransitionTime":"2025-11-27T16:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:40:07 crc kubenswrapper[4954]: I1127 16:40:07.363650 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:40:07 crc kubenswrapper[4954]: I1127 16:40:07.363787 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:40:07 crc kubenswrapper[4954]: I1127 16:40:07.363821 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:40:07 crc kubenswrapper[4954]: I1127 16:40:07.363858 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:40:07 crc kubenswrapper[4954]: I1127 16:40:07.363885 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:40:07Z","lastTransitionTime":"2025-11-27T16:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
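Between pod-sync attempts the same five node-status lines are re-emitted roughly every 100 ms, differing only in their timestamps and heartbeat fields. A sketch that collapses such re-emissions to a first occurrence plus a repeat count; the normalization regex is illustrative and assumes this journal format:

```python
import re
import sys
from collections import OrderedDict

# Normalize away the fields that change between re-emissions: the syslog and
# glog timestamps and the heartbeat/transition times in the condition JSON.
TS = re.compile(r'\d{2}:\d{2}:\d{2}(?:\.\d+)?'
                r'|"last(?:HeartbeatTime|TransitionTime)":"[^"]*"')

seen = OrderedDict()  # normalized line -> [first raw line, count]
for line in sys.stdin:
    key = TS.sub("<ts>", line)
    if key in seen:
        seen[key][1] += 1
    else:
        seen[key] = [line.rstrip("\n"), 1]

for first, count in seen.values():
    print(first if count == 1 else f"{first}   [repeated {count}x]")
```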
[... node-status block repeated at 16:40:07.466 and 16:40:07.569 ...]
Nov 27 16:40:07 crc kubenswrapper[4954]: I1127 16:40:07.661361 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 27 16:40:07 crc kubenswrapper[4954]: I1127 16:40:07.661456 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh"
Nov 27 16:40:07 crc kubenswrapper[4954]: I1127 16:40:07.661477 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 27 16:40:07 crc kubenswrapper[4954]: I1127 16:40:07.661361 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 27 16:40:07 crc kubenswrapper[4954]: E1127 16:40:07.661710 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 27 16:40:07 crc kubenswrapper[4954]: E1127 16:40:07.661858 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6"
Nov 27 16:40:07 crc kubenswrapper[4954]: E1127 16:40:07.662071 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 27 16:40:07 crc kubenswrapper[4954]: E1127 16:40:07.662291 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
[... node-status block repeated at 16:40:07.673 and 16:40:07.675 ...]
Nov 27 16:40:07 crc kubenswrapper[4954]: I1127 16:40:07.732948 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-rsslp"]
Nov 27 16:40:07 crc kubenswrapper[4954]: I1127 16:40:07.733780 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rsslp"
Nov 27 16:40:07 crc kubenswrapper[4954]: I1127 16:40:07.737674 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Nov 27 16:40:07 crc kubenswrapper[4954]: I1127 16:40:07.737783 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Nov 27 16:40:07 crc kubenswrapper[4954]: I1127 16:40:07.738111 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Nov 27 16:40:07 crc kubenswrapper[4954]: I1127 16:40:07.738183 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Nov 27 16:40:07 crc kubenswrapper[4954]: I1127 16:40:07.831099 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/7fb32518-f7c8-4d4c-a8b2-26dacf6b8b5e-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-rsslp\" (UID: \"7fb32518-f7c8-4d4c-a8b2-26dacf6b8b5e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rsslp"
Nov 27 16:40:07 crc kubenswrapper[4954]: I1127 16:40:07.831238 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/7fb32518-f7c8-4d4c-a8b2-26dacf6b8b5e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-rsslp\" (UID: \"7fb32518-f7c8-4d4c-a8b2-26dacf6b8b5e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rsslp"
Nov 27 16:40:07 crc kubenswrapper[4954]: I1127 16:40:07.831298 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7fb32518-f7c8-4d4c-a8b2-26dacf6b8b5e-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-rsslp\" (UID: \"7fb32518-f7c8-4d4c-a8b2-26dacf6b8b5e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rsslp"
Nov 27 16:40:07 crc kubenswrapper[4954]: I1127 16:40:07.831331 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7fb32518-f7c8-4d4c-a8b2-26dacf6b8b5e-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-rsslp\" (UID: \"7fb32518-f7c8-4d4c-a8b2-26dacf6b8b5e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rsslp"
Nov 27 16:40:07 crc kubenswrapper[4954]: I1127 16:40:07.831355 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7fb32518-f7c8-4d4c-a8b2-26dacf6b8b5e-service-ca\") pod \"cluster-version-operator-5c965bbfc6-rsslp\" (UID: \"7fb32518-f7c8-4d4c-a8b2-26dacf6b8b5e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rsslp"
Nov 27 16:40:07 crc kubenswrapper[4954]: I1127 16:40:07.932633 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/7fb32518-f7c8-4d4c-a8b2-26dacf6b8b5e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-rsslp\" (UID: \"7fb32518-f7c8-4d4c-a8b2-26dacf6b8b5e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rsslp"
Nov 27 16:40:07 crc kubenswrapper[4954]: I1127 16:40:07.932722 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7fb32518-f7c8-4d4c-a8b2-26dacf6b8b5e-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-rsslp\" (UID: \"7fb32518-f7c8-4d4c-a8b2-26dacf6b8b5e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rsslp"
Nov 27 16:40:07 crc kubenswrapper[4954]: I1127 16:40:07.932759 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7fb32518-f7c8-4d4c-a8b2-26dacf6b8b5e-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-rsslp\" (UID: \"7fb32518-f7c8-4d4c-a8b2-26dacf6b8b5e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rsslp"
Nov 27 16:40:07 crc kubenswrapper[4954]: I1127 16:40:07.932819 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7fb32518-f7c8-4d4c-a8b2-26dacf6b8b5e-service-ca\") pod \"cluster-version-operator-5c965bbfc6-rsslp\" (UID: \"7fb32518-f7c8-4d4c-a8b2-26dacf6b8b5e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rsslp"
Nov 27 16:40:07 crc kubenswrapper[4954]: I1127 16:40:07.932832 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/7fb32518-f7c8-4d4c-a8b2-26dacf6b8b5e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-rsslp\" (UID: \"7fb32518-f7c8-4d4c-a8b2-26dacf6b8b5e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rsslp"
Nov 27 16:40:07 crc kubenswrapper[4954]: I1127 16:40:07.932896 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/7fb32518-f7c8-4d4c-a8b2-26dacf6b8b5e-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-rsslp\" (UID: \"7fb32518-f7c8-4d4c-a8b2-26dacf6b8b5e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rsslp"
Nov 27 16:40:07 crc kubenswrapper[4954]: I1127 16:40:07.932973 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/7fb32518-f7c8-4d4c-a8b2-26dacf6b8b5e-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-rsslp\" (UID: \"7fb32518-f7c8-4d4c-a8b2-26dacf6b8b5e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rsslp"
Nov 27 16:40:07 crc kubenswrapper[4954]: I1127 16:40:07.934243 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7fb32518-f7c8-4d4c-a8b2-26dacf6b8b5e-service-ca\") pod \"cluster-version-operator-5c965bbfc6-rsslp\" (UID: \"7fb32518-f7c8-4d4c-a8b2-26dacf6b8b5e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rsslp"
Nov 27 16:40:07 crc kubenswrapper[4954]: I1127 16:40:07.949849 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7fb32518-f7c8-4d4c-a8b2-26dacf6b8b5e-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-rsslp\" (UID: \"7fb32518-f7c8-4d4c-a8b2-26dacf6b8b5e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rsslp"
Nov 27 16:40:07 crc kubenswrapper[4954]: I1127 16:40:07.955101 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7fb32518-f7c8-4d4c-a8b2-26dacf6b8b5e-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-rsslp\" (UID: \"7fb32518-f7c8-4d4c-a8b2-26dacf6b8b5e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rsslp"
Nov 27 16:40:08 crc kubenswrapper[4954]: I1127 16:40:08.058097 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rsslp"
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rsslp" Nov 27 16:40:08 crc kubenswrapper[4954]: I1127 16:40:08.451800 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rsslp" event={"ID":"7fb32518-f7c8-4d4c-a8b2-26dacf6b8b5e","Type":"ContainerStarted","Data":"0a447d71f6c958b4d9be8b6d99ffb6e08d873a51c09403a69bc379da16130e54"} Nov 27 16:40:08 crc kubenswrapper[4954]: I1127 16:40:08.452158 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rsslp" event={"ID":"7fb32518-f7c8-4d4c-a8b2-26dacf6b8b5e","Type":"ContainerStarted","Data":"7933767ecf24c2cb36a50ec7413739a551821e8357c6f90c5af809127e5bf8f1"} Nov 27 16:40:08 crc kubenswrapper[4954]: I1127 16:40:08.470792 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rsslp" podStartSLOduration=100.47076055 podStartE2EDuration="1m40.47076055s" podCreationTimestamp="2025-11-27 16:38:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:40:08.469430597 +0000 UTC m=+120.486870907" watchObservedRunningTime="2025-11-27 16:40:08.47076055 +0000 UTC m=+120.488200890" Nov 27 16:40:08 crc kubenswrapper[4954]: E1127 16:40:08.600610 4954 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Nov 27 16:40:08 crc kubenswrapper[4954]: E1127 16:40:08.816547 4954 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 27 16:40:09 crc kubenswrapper[4954]: I1127 16:40:09.661477 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:40:09 crc kubenswrapper[4954]: I1127 16:40:09.661548 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:40:09 crc kubenswrapper[4954]: I1127 16:40:09.661548 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:40:09 crc kubenswrapper[4954]: E1127 16:40:09.662041 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:40:09 crc kubenswrapper[4954]: I1127 16:40:09.662115 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:40:09 crc kubenswrapper[4954]: E1127 16:40:09.662393 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:40:09 crc kubenswrapper[4954]: E1127 16:40:09.707570 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:40:09 crc kubenswrapper[4954]: E1127 16:40:09.707621 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6" Nov 27 16:40:09 crc kubenswrapper[4954]: I1127 16:40:09.708452 4954 scope.go:117] "RemoveContainer" containerID="81bb34f2dce67efd76368e55b902d1cded4cf016e3f638b9c5acaf3f00ca2b60" Nov 27 16:40:10 crc kubenswrapper[4954]: I1127 16:40:10.462316 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5zbp_c9c365fc-0cba-4fcf-b721-30de2b908a56/ovnkube-controller/3.log" Nov 27 16:40:10 crc kubenswrapper[4954]: I1127 16:40:10.465770 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" event={"ID":"c9c365fc-0cba-4fcf-b721-30de2b908a56","Type":"ContainerStarted","Data":"c247e7205296545100af4336d0c953a09c73efcd735a2e4e9d901ff312eb55a1"} Nov 27 16:40:10 crc kubenswrapper[4954]: I1127 16:40:10.466380 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:40:10 crc kubenswrapper[4954]: I1127 16:40:10.833685 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" podStartSLOduration=101.833654885 podStartE2EDuration="1m41.833654885s" podCreationTimestamp="2025-11-27 16:38:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:40:10.500436307 +0000 UTC m=+122.517876607" watchObservedRunningTime="2025-11-27 16:40:10.833654885 +0000 UTC m=+122.851095195" Nov 27 16:40:10 crc kubenswrapper[4954]: I1127 16:40:10.834023 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-hgsvh"] Nov 27 16:40:10 crc kubenswrapper[4954]: I1127 16:40:10.834144 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:40:10 crc kubenswrapper[4954]: E1127 16:40:10.834265 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6" Nov 27 16:40:11 crc kubenswrapper[4954]: I1127 16:40:11.661863 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:40:11 crc kubenswrapper[4954]: I1127 16:40:11.661882 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:40:11 crc kubenswrapper[4954]: E1127 16:40:11.662347 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:40:11 crc kubenswrapper[4954]: E1127 16:40:11.662472 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:40:11 crc kubenswrapper[4954]: I1127 16:40:11.661917 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:40:11 crc kubenswrapper[4954]: E1127 16:40:11.662650 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:40:12 crc kubenswrapper[4954]: I1127 16:40:12.661560 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:40:12 crc kubenswrapper[4954]: E1127 16:40:12.661703 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6" Nov 27 16:40:13 crc kubenswrapper[4954]: I1127 16:40:13.661258 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:40:13 crc kubenswrapper[4954]: I1127 16:40:13.661346 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:40:13 crc kubenswrapper[4954]: E1127 16:40:13.661435 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:40:13 crc kubenswrapper[4954]: E1127 16:40:13.661518 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:40:13 crc kubenswrapper[4954]: I1127 16:40:13.661654 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:40:13 crc kubenswrapper[4954]: E1127 16:40:13.661879 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:40:13 crc kubenswrapper[4954]: E1127 16:40:13.818662 4954 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 27 16:40:14 crc kubenswrapper[4954]: I1127 16:40:14.661274 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:40:14 crc kubenswrapper[4954]: E1127 16:40:14.661423 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6" Nov 27 16:40:15 crc kubenswrapper[4954]: I1127 16:40:15.661501 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:40:15 crc kubenswrapper[4954]: I1127 16:40:15.661612 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:40:15 crc kubenswrapper[4954]: I1127 16:40:15.661535 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:40:15 crc kubenswrapper[4954]: E1127 16:40:15.661788 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:40:15 crc kubenswrapper[4954]: E1127 16:40:15.661883 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:40:15 crc kubenswrapper[4954]: E1127 16:40:15.662142 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:40:16 crc kubenswrapper[4954]: I1127 16:40:16.661410 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:40:16 crc kubenswrapper[4954]: E1127 16:40:16.661697 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6" Nov 27 16:40:16 crc kubenswrapper[4954]: I1127 16:40:16.662999 4954 scope.go:117] "RemoveContainer" containerID="bcc3a6be3f2d6a2d8da09fab1320b33b7c36e0c403916e155274997bcb03c884" Nov 27 16:40:17 crc kubenswrapper[4954]: I1127 16:40:17.493737 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9mb96_c5bda3ef-ba2c-424a-ba4a-432053d1c40d/kube-multus/1.log" Nov 27 16:40:17 crc kubenswrapper[4954]: I1127 16:40:17.494466 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9mb96" event={"ID":"c5bda3ef-ba2c-424a-ba4a-432053d1c40d","Type":"ContainerStarted","Data":"34f4a3bb92c39c5db5b427259524720518191fb6e9a74d427133a9d815df637d"} Nov 27 16:40:17 crc kubenswrapper[4954]: I1127 16:40:17.662145 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:40:17 crc kubenswrapper[4954]: E1127 16:40:17.662340 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:40:17 crc kubenswrapper[4954]: I1127 16:40:17.662692 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:40:17 crc kubenswrapper[4954]: E1127 16:40:17.662812 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:40:17 crc kubenswrapper[4954]: I1127 16:40:17.662918 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:40:17 crc kubenswrapper[4954]: E1127 16:40:17.663016 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:40:18 crc kubenswrapper[4954]: I1127 16:40:18.661377 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:40:18 crc kubenswrapper[4954]: E1127 16:40:18.663332 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgsvh" podUID="af5183f4-5f46-4d64-8ec4-c7b71530cad6" Nov 27 16:40:19 crc kubenswrapper[4954]: I1127 16:40:19.662112 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:40:19 crc kubenswrapper[4954]: I1127 16:40:19.662153 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:40:19 crc kubenswrapper[4954]: I1127 16:40:19.662827 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:40:19 crc kubenswrapper[4954]: I1127 16:40:19.665430 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Nov 27 16:40:19 crc kubenswrapper[4954]: I1127 16:40:19.665879 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Nov 27 16:40:19 crc kubenswrapper[4954]: I1127 16:40:19.666559 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Nov 27 16:40:19 crc kubenswrapper[4954]: I1127 16:40:19.669972 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Nov 27 16:40:20 crc kubenswrapper[4954]: I1127 16:40:20.661187 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:40:20 crc kubenswrapper[4954]: I1127 16:40:20.664228 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Nov 27 16:40:20 crc kubenswrapper[4954]: I1127 16:40:20.664339 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.189759 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.238699 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-xqdst"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.239224 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xqdst" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.243089 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-7ltms"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.243410 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-f6f2h"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.243634 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-9gfl4"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.243879 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-9gfl4" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.245762 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.245996 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f6f2h" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.246035 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.246058 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ltms" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.246086 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.246144 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-62hrj"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.246195 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.252653 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.254538 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.255307 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-62hrj" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.255710 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.255730 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.262021 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6m2df"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.263668 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-htccg"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.262131 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.262423 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.262827 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.265513 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.267316 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.285326 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-htccg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.288333 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-h48pg"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.289252 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-h48pg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.290258 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.290550 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.290714 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.290891 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.291026 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.291187 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.291389 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.291528 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.291751 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.291614 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.292226 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.292547 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.292747 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.292631 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.292969 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.297387 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.297626 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-6lsxk"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.298351 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mjllc"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.298812 4954 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mn7g4"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.299242 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mn7g4" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.299709 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lsxk" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.299986 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-mjllc" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.301943 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k9v6k"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.302484 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k9v6k" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.303557 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.303982 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.307514 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.305750 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8svq7"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.304015 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.307798 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.304141 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.304219 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.304289 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.304291 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.304334 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.304458 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.304556 4954 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.304620 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.304672 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.308403 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-s8cm2"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.305482 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.306438 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.306442 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.306559 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.308923 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-s8cm2" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.309103 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.306615 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.309309 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8svq7" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.306815 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.309456 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.306858 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.306901 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.306911 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.306944 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.306983 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.309648 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.310054 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.310719 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.310740 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.310740 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.312748 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.313184 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.315952 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bp7nq"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.316176 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.316474 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.316666 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.316772 4954 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.316870 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bp7nq" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.316895 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.316981 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.317058 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.317158 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.317689 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.317773 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.318315 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.322059 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-m78xr"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.322683 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-m78xr" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.323890 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-nngrv"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.334327 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nngrv" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.334851 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.338697 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6h576"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.341097 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.343344 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6h576" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.346153 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.347588 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.347757 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hqqc6"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.350022 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hqqc6" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.350085 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lfnws"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.351644 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lfnws" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.384164 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.384228 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.384351 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.384458 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.385934 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-prlg4"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.386564 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-prlg4" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.392597 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.394935 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.395363 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.395543 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.395698 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d68d684-47a3-490a-bafb-9c8f04f0d3fb-serving-cert\") pod \"apiserver-7bbb656c7d-6lsxk\" (UID: \"5d68d684-47a3-490a-bafb-9c8f04f0d3fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lsxk" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.395747 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e7aebe6-3e4f-498f-a696-5e23f9fe313d-serving-cert\") pod \"apiserver-76f77b778f-htccg\" (UID: \"3e7aebe6-3e4f-498f-a696-5e23f9fe313d\") " pod="openshift-apiserver/apiserver-76f77b778f-htccg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.395768 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6e521fb1-0565-4f66-a6f0-1b78942e408e-client-ca\") pod \"route-controller-manager-6576b87f9c-f6f2h\" (UID: \"6e521fb1-0565-4f66-a6f0-1b78942e408e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f6f2h" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.395828 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-6m2df\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.395846 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d68d684-47a3-490a-bafb-9c8f04f0d3fb-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6lsxk\" (UID: \"5d68d684-47a3-490a-bafb-9c8f04f0d3fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lsxk" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.395864 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3e7aebe6-3e4f-498f-a696-5e23f9fe313d-etcd-serving-ca\") pod \"apiserver-76f77b778f-htccg\" (UID: \"3e7aebe6-3e4f-498f-a696-5e23f9fe313d\") " pod="openshift-apiserver/apiserver-76f77b778f-htccg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.395882 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/0da11e6f-c84f-4d72-83cc-9bb32480b3d2-serving-cert\") pod \"openshift-config-operator-7777fb866f-7ltms\" (UID: \"0da11e6f-c84f-4d72-83cc-9bb32480b3d2\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ltms" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.395898 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-6m2df\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.395917 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-6m2df\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.395938 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/daf9759f-1f7d-4613-b734-a39f4552222e-images\") pod \"machine-api-operator-5694c8668f-h48pg\" (UID: \"daf9759f-1f7d-4613-b734-a39f4552222e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h48pg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.395955 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-6m2df\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.395971 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-6m2df\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.395987 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5d68d684-47a3-490a-bafb-9c8f04f0d3fb-encryption-config\") pod \"apiserver-7bbb656c7d-6lsxk\" (UID: \"5d68d684-47a3-490a-bafb-9c8f04f0d3fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lsxk" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.396006 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6f943860-2a4f-44af-9695-4497a2a8fdd8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-k9v6k\" (UID: \"6f943860-2a4f-44af-9695-4497a2a8fdd8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k9v6k" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.396026 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-566pd\" (UniqueName: \"kubernetes.io/projected/3ddfcd6f-4387-40b4-9933-4e169797f6da-kube-api-access-566pd\") pod \"console-operator-58897d9998-9gfl4\" (UID: \"3ddfcd6f-4387-40b4-9933-4e169797f6da\") " pod="openshift-console-operator/console-operator-58897d9998-9gfl4" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.396043 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e7aebe6-3e4f-498f-a696-5e23f9fe313d-config\") pod \"apiserver-76f77b778f-htccg\" (UID: \"3e7aebe6-3e4f-498f-a696-5e23f9fe313d\") " pod="openshift-apiserver/apiserver-76f77b778f-htccg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.396060 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5d68d684-47a3-490a-bafb-9c8f04f0d3fb-audit-policies\") pod \"apiserver-7bbb656c7d-6lsxk\" (UID: \"5d68d684-47a3-490a-bafb-9c8f04f0d3fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lsxk" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.396076 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3e7aebe6-3e4f-498f-a696-5e23f9fe313d-audit\") pod \"apiserver-76f77b778f-htccg\" (UID: \"3e7aebe6-3e4f-498f-a696-5e23f9fe313d\") " pod="openshift-apiserver/apiserver-76f77b778f-htccg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.396094 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ac17f7ac-8454-45e6-af33-a29113eb0d66-metrics-tls\") pod \"dns-operator-744455d44c-mjllc\" (UID: \"ac17f7ac-8454-45e6-af33-a29113eb0d66\") " pod="openshift-dns-operator/dns-operator-744455d44c-mjllc" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.396111 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r49xd\" (UniqueName: \"kubernetes.io/projected/daf9759f-1f7d-4613-b734-a39f4552222e-kube-api-access-r49xd\") pod \"machine-api-operator-5694c8668f-h48pg\" (UID: \"daf9759f-1f7d-4613-b734-a39f4552222e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h48pg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.396127 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e7aebe6-3e4f-498f-a696-5e23f9fe313d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-htccg\" (UID: \"3e7aebe6-3e4f-498f-a696-5e23f9fe313d\") " pod="openshift-apiserver/apiserver-76f77b778f-htccg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.396144 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6f943860-2a4f-44af-9695-4497a2a8fdd8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-k9v6k\" (UID: \"6f943860-2a4f-44af-9695-4497a2a8fdd8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k9v6k" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.396164 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3e7aebe6-3e4f-498f-a696-5e23f9fe313d-encryption-config\") pod 
\"apiserver-76f77b778f-htccg\" (UID: \"3e7aebe6-3e4f-498f-a696-5e23f9fe313d\") " pod="openshift-apiserver/apiserver-76f77b778f-htccg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.396182 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j46l4\" (UniqueName: \"kubernetes.io/projected/3e7aebe6-3e4f-498f-a696-5e23f9fe313d-kube-api-access-j46l4\") pod \"apiserver-76f77b778f-htccg\" (UID: \"3e7aebe6-3e4f-498f-a696-5e23f9fe313d\") " pod="openshift-apiserver/apiserver-76f77b778f-htccg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.396202 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-6m2df\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.396225 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bd46eeb-25a4-4e67-97ad-96c21224fbcd-config\") pod \"machine-approver-56656f9798-xqdst\" (UID: \"9bd46eeb-25a4-4e67-97ad-96c21224fbcd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xqdst" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.396244 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9c39245-291c-4f46-88ef-80e78b1c7bae-serving-cert\") pod \"authentication-operator-69f744f599-62hrj\" (UID: \"c9c39245-291c-4f46-88ef-80e78b1c7bae\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-62hrj" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.396261 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9c39245-291c-4f46-88ef-80e78b1c7bae-service-ca-bundle\") pod \"authentication-operator-69f744f599-62hrj\" (UID: \"c9c39245-291c-4f46-88ef-80e78b1c7bae\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-62hrj" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.396278 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/9bd46eeb-25a4-4e67-97ad-96c21224fbcd-machine-approver-tls\") pod \"machine-approver-56656f9798-xqdst\" (UID: \"9bd46eeb-25a4-4e67-97ad-96c21224fbcd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xqdst" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.396298 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ddfcd6f-4387-40b4-9933-4e169797f6da-config\") pod \"console-operator-58897d9998-9gfl4\" (UID: \"3ddfcd6f-4387-40b4-9933-4e169797f6da\") " pod="openshift-console-operator/console-operator-58897d9998-9gfl4" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.396329 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-6m2df\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.396347 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-6m2df\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.396366 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z825\" (UniqueName: \"kubernetes.io/projected/ac17f7ac-8454-45e6-af33-a29113eb0d66-kube-api-access-9z825\") pod \"dns-operator-744455d44c-mjllc\" (UID: \"ac17f7ac-8454-45e6-af33-a29113eb0d66\") " pod="openshift-dns-operator/dns-operator-744455d44c-mjllc" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.396384 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6thft\" (UniqueName: \"kubernetes.io/projected/6e521fb1-0565-4f66-a6f0-1b78942e408e-kube-api-access-6thft\") pod \"route-controller-manager-6576b87f9c-f6f2h\" (UID: \"6e521fb1-0565-4f66-a6f0-1b78942e408e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f6f2h" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.396402 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbqgs\" (UniqueName: \"kubernetes.io/projected/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-kube-api-access-gbqgs\") pod \"oauth-openshift-558db77b4-6m2df\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.396421 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5d68d684-47a3-490a-bafb-9c8f04f0d3fb-audit-dir\") pod \"apiserver-7bbb656c7d-6lsxk\" (UID: \"5d68d684-47a3-490a-bafb-9c8f04f0d3fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lsxk" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.396439 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ddfcd6f-4387-40b4-9933-4e169797f6da-serving-cert\") pod \"console-operator-58897d9998-9gfl4\" (UID: \"3ddfcd6f-4387-40b4-9933-4e169797f6da\") " pod="openshift-console-operator/console-operator-58897d9998-9gfl4" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.396457 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3e7aebe6-3e4f-498f-a696-5e23f9fe313d-node-pullsecrets\") pod \"apiserver-76f77b778f-htccg\" (UID: \"3e7aebe6-3e4f-498f-a696-5e23f9fe313d\") " pod="openshift-apiserver/apiserver-76f77b778f-htccg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.396475 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6e521fb1-0565-4f66-a6f0-1b78942e408e-config\") pod \"route-controller-manager-6576b87f9c-f6f2h\" (UID: \"6e521fb1-0565-4f66-a6f0-1b78942e408e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f6f2h" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.396493 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbjmb\" (UniqueName: \"kubernetes.io/projected/c9c39245-291c-4f46-88ef-80e78b1c7bae-kube-api-access-jbjmb\") pod \"authentication-operator-69f744f599-62hrj\" (UID: \"c9c39245-291c-4f46-88ef-80e78b1c7bae\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-62hrj" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.396514 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-audit-policies\") pod \"oauth-openshift-558db77b4-6m2df\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.396542 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9bd46eeb-25a4-4e67-97ad-96c21224fbcd-auth-proxy-config\") pod \"machine-approver-56656f9798-xqdst\" (UID: \"9bd46eeb-25a4-4e67-97ad-96c21224fbcd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xqdst" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.396792 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.396909 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-kpmsg"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.396942 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.397116 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.397609 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.397708 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.398955 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/daf9759f-1f7d-4613-b734-a39f4552222e-config\") pod \"machine-api-operator-5694c8668f-h48pg\" (UID: \"daf9759f-1f7d-4613-b734-a39f4552222e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h48pg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.397774 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.399062 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/c9c39245-291c-4f46-88ef-80e78b1c7bae-config\") pod \"authentication-operator-69f744f599-62hrj\" (UID: \"c9c39245-291c-4f46-88ef-80e78b1c7bae\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-62hrj" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.399102 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5d68d684-47a3-490a-bafb-9c8f04f0d3fb-etcd-client\") pod \"apiserver-7bbb656c7d-6lsxk\" (UID: \"5d68d684-47a3-490a-bafb-9c8f04f0d3fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lsxk" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.399121 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhb94\" (UniqueName: \"kubernetes.io/projected/0da11e6f-c84f-4d72-83cc-9bb32480b3d2-kube-api-access-bhb94\") pod \"openshift-config-operator-7777fb866f-7ltms\" (UID: \"0da11e6f-c84f-4d72-83cc-9bb32480b3d2\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ltms" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.399142 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3e7aebe6-3e4f-498f-a696-5e23f9fe313d-audit-dir\") pod \"apiserver-76f77b778f-htccg\" (UID: \"3e7aebe6-3e4f-498f-a696-5e23f9fe313d\") " pod="openshift-apiserver/apiserver-76f77b778f-htccg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.399161 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e521fb1-0565-4f66-a6f0-1b78942e408e-serving-cert\") pod \"route-controller-manager-6576b87f9c-f6f2h\" (UID: \"6e521fb1-0565-4f66-a6f0-1b78942e408e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f6f2h" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.399192 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fx24\" (UniqueName: \"kubernetes.io/projected/6f943860-2a4f-44af-9695-4497a2a8fdd8-kube-api-access-9fx24\") pod \"cluster-image-registry-operator-dc59b4c8b-k9v6k\" (UID: \"6f943860-2a4f-44af-9695-4497a2a8fdd8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k9v6k" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.399217 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-6m2df\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.399237 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-6m2df\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.399257 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0da11e6f-c84f-4d72-83cc-9bb32480b3d2-available-featuregates\") pod \"openshift-config-operator-7777fb866f-7ltms\" (UID: \"0da11e6f-c84f-4d72-83cc-9bb32480b3d2\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ltms" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.399285 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5d68d684-47a3-490a-bafb-9c8f04f0d3fb-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-6lsxk\" (UID: \"5d68d684-47a3-490a-bafb-9c8f04f0d3fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lsxk" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.399305 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xmpc\" (UniqueName: \"kubernetes.io/projected/56bfbfaa-4d26-4361-87fc-dab870bdff96-kube-api-access-4xmpc\") pod \"cluster-samples-operator-665b6dd947-mn7g4\" (UID: \"56bfbfaa-4d26-4361-87fc-dab870bdff96\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mn7g4" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.399325 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-audit-dir\") pod \"oauth-openshift-558db77b4-6m2df\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.399343 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3e7aebe6-3e4f-498f-a696-5e23f9fe313d-image-import-ca\") pod \"apiserver-76f77b778f-htccg\" (UID: \"3e7aebe6-3e4f-498f-a696-5e23f9fe313d\") " pod="openshift-apiserver/apiserver-76f77b778f-htccg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.399364 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9c39245-291c-4f46-88ef-80e78b1c7bae-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-62hrj\" (UID: \"c9c39245-291c-4f46-88ef-80e78b1c7bae\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-62hrj" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.399384 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6f943860-2a4f-44af-9695-4497a2a8fdd8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-k9v6k\" (UID: \"6f943860-2a4f-44af-9695-4497a2a8fdd8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k9v6k" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.399406 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6slx\" (UniqueName: \"kubernetes.io/projected/9bd46eeb-25a4-4e67-97ad-96c21224fbcd-kube-api-access-x6slx\") pod \"machine-approver-56656f9798-xqdst\" (UID: \"9bd46eeb-25a4-4e67-97ad-96c21224fbcd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xqdst" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.399425 4954 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3e7aebe6-3e4f-498f-a696-5e23f9fe313d-etcd-client\") pod \"apiserver-76f77b778f-htccg\" (UID: \"3e7aebe6-3e4f-498f-a696-5e23f9fe313d\") " pod="openshift-apiserver/apiserver-76f77b778f-htccg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.399451 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-6m2df\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.399474 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/daf9759f-1f7d-4613-b734-a39f4552222e-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-h48pg\" (UID: \"daf9759f-1f7d-4613-b734-a39f4552222e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h48pg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.399522 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3ddfcd6f-4387-40b4-9933-4e169797f6da-trusted-ca\") pod \"console-operator-58897d9998-9gfl4\" (UID: \"3ddfcd6f-4387-40b4-9933-4e169797f6da\") " pod="openshift-console-operator/console-operator-58897d9998-9gfl4" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.399541 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgg9p\" (UniqueName: \"kubernetes.io/projected/5d68d684-47a3-490a-bafb-9c8f04f0d3fb-kube-api-access-kgg9p\") pod \"apiserver-7bbb656c7d-6lsxk\" (UID: \"5d68d684-47a3-490a-bafb-9c8f04f0d3fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lsxk" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.399560 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/56bfbfaa-4d26-4361-87fc-dab870bdff96-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-mn7g4\" (UID: \"56bfbfaa-4d26-4361-87fc-dab870bdff96\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mn7g4" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.400013 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.397899 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.397957 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.402437 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f44h7"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.403023 4954 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m77vv"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.403433 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kpmsg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.403562 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m77vv" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.403812 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f44h7" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.405782 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.405798 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.405816 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.405938 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.406035 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.407547 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.408921 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.408920 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-n2fzm"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.409834 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.410684 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.412602 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.414809 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.416865 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8wlxw"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.417331 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.417838 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-8wlxw" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.422012 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.424165 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-2jvzc"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.425716 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-2jvzc" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.433357 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-59sgd"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.441528 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.444424 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-59sgd" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.446347 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fh26g"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.457010 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fh26g" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.464197 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.464689 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-cc84q"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.466043 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-t8hmz"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.466743 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-t8hmz" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.467168 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cv9bx"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.467179 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-cc84q" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.467967 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cv9bx" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.468986 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qmz7n"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.469795 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qmz7n" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.470425 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-f9qbf"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.471143 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-f9qbf" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.474567 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404350-52bhz"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.477104 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tj896"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.477849 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404350-52bhz" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.477884 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zmv7j"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.478017 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tj896" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.479060 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xnp9p"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.479525 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-9gfl4"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.479874 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-xnp9p" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.480108 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zmv7j" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.481997 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f44h7"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.484537 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.485551 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-s8cm2"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.488641 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-62hrj"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.490217 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lfnws"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.492189 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mn7g4"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.493208 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-prlg4"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.497008 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8svq7"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.497828 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-7ltms"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.499101 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-f6f2h"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.501249 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54811547-c0f2-4b3e-8e07-6b6c878d72ee-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-hqqc6\" (UID: \"54811547-c0f2-4b3e-8e07-6b6c878d72ee\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hqqc6" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.501300 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4bfdabf8-f787-45ba-916e-a40db8dd9561-proxy-tls\") pod \"machine-config-controller-84d6567774-prlg4\" (UID: \"4bfdabf8-f787-45ba-916e-a40db8dd9561\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-prlg4" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.501369 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-566pd\" (UniqueName: \"kubernetes.io/projected/3ddfcd6f-4387-40b4-9933-4e169797f6da-kube-api-access-566pd\") pod \"console-operator-58897d9998-9gfl4\" (UID: \"3ddfcd6f-4387-40b4-9933-4e169797f6da\") " pod="openshift-console-operator/console-operator-58897d9998-9gfl4" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.501431 4954 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6f943860-2a4f-44af-9695-4497a2a8fdd8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-k9v6k\" (UID: \"6f943860-2a4f-44af-9695-4497a2a8fdd8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k9v6k" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.501464 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p942l\" (UniqueName: \"kubernetes.io/projected/6606df87-becb-460d-8579-22c5eb23e71a-kube-api-access-p942l\") pod \"downloads-7954f5f757-m78xr\" (UID: \"6606df87-becb-460d-8579-22c5eb23e71a\") " pod="openshift-console/downloads-7954f5f757-m78xr" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.501514 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58eb1d01-1f82-43fa-8ace-86368d05ec71-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-bp7nq\" (UID: \"58eb1d01-1f82-43fa-8ace-86368d05ec71\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bp7nq" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.501544 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcl54\" (UniqueName: \"kubernetes.io/projected/58eb1d01-1f82-43fa-8ace-86368d05ec71-kube-api-access-hcl54\") pod \"openshift-controller-manager-operator-756b6f6bc6-bp7nq\" (UID: \"58eb1d01-1f82-43fa-8ace-86368d05ec71\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bp7nq" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.501625 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5d68d684-47a3-490a-bafb-9c8f04f0d3fb-audit-policies\") pod \"apiserver-7bbb656c7d-6lsxk\" (UID: \"5d68d684-47a3-490a-bafb-9c8f04f0d3fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lsxk" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.501568 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8wlxw"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.501664 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e7aebe6-3e4f-498f-a696-5e23f9fe313d-config\") pod \"apiserver-76f77b778f-htccg\" (UID: \"3e7aebe6-3e4f-498f-a696-5e23f9fe313d\") " pod="openshift-apiserver/apiserver-76f77b778f-htccg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.501737 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ac17f7ac-8454-45e6-af33-a29113eb0d66-metrics-tls\") pod \"dns-operator-744455d44c-mjllc\" (UID: \"ac17f7ac-8454-45e6-af33-a29113eb0d66\") " pod="openshift-dns-operator/dns-operator-744455d44c-mjllc" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.501785 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/072ec696-c152-40bb-8783-72920846a193-auth-proxy-config\") pod \"machine-config-operator-74547568cd-kpmsg\" (UID: \"072ec696-c152-40bb-8783-72920846a193\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kpmsg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.501824 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r49xd\" (UniqueName: \"kubernetes.io/projected/daf9759f-1f7d-4613-b734-a39f4552222e-kube-api-access-r49xd\") pod \"machine-api-operator-5694c8668f-h48pg\" (UID: \"daf9759f-1f7d-4613-b734-a39f4552222e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h48pg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.501847 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3e7aebe6-3e4f-498f-a696-5e23f9fe313d-audit\") pod \"apiserver-76f77b778f-htccg\" (UID: \"3e7aebe6-3e4f-498f-a696-5e23f9fe313d\") " pod="openshift-apiserver/apiserver-76f77b778f-htccg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.501873 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e7aebe6-3e4f-498f-a696-5e23f9fe313d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-htccg\" (UID: \"3e7aebe6-3e4f-498f-a696-5e23f9fe313d\") " pod="openshift-apiserver/apiserver-76f77b778f-htccg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.501896 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7a3c2a78-4ced-43d5-a3b7-25637f36d2fc-console-serving-cert\") pod \"console-f9d7485db-s8cm2\" (UID: \"7a3c2a78-4ced-43d5-a3b7-25637f36d2fc\") " pod="openshift-console/console-f9d7485db-s8cm2" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.501920 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3e7aebe6-3e4f-498f-a696-5e23f9fe313d-encryption-config\") pod \"apiserver-76f77b778f-htccg\" (UID: \"3e7aebe6-3e4f-498f-a696-5e23f9fe313d\") " pod="openshift-apiserver/apiserver-76f77b778f-htccg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.501941 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j46l4\" (UniqueName: \"kubernetes.io/projected/3e7aebe6-3e4f-498f-a696-5e23f9fe313d-kube-api-access-j46l4\") pod \"apiserver-76f77b778f-htccg\" (UID: \"3e7aebe6-3e4f-498f-a696-5e23f9fe313d\") " pod="openshift-apiserver/apiserver-76f77b778f-htccg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.501966 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-6m2df\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.501988 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bd46eeb-25a4-4e67-97ad-96c21224fbcd-config\") pod \"machine-approver-56656f9798-xqdst\" (UID: \"9bd46eeb-25a4-4e67-97ad-96c21224fbcd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xqdst" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.502010 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/6f943860-2a4f-44af-9695-4497a2a8fdd8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-k9v6k\" (UID: \"6f943860-2a4f-44af-9695-4497a2a8fdd8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k9v6k" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.502029 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9c39245-291c-4f46-88ef-80e78b1c7bae-service-ca-bundle\") pod \"authentication-operator-69f744f599-62hrj\" (UID: \"c9c39245-291c-4f46-88ef-80e78b1c7bae\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-62hrj" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.502046 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/9bd46eeb-25a4-4e67-97ad-96c21224fbcd-machine-approver-tls\") pod \"machine-approver-56656f9798-xqdst\" (UID: \"9bd46eeb-25a4-4e67-97ad-96c21224fbcd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xqdst" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.502066 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ddfcd6f-4387-40b4-9933-4e169797f6da-config\") pod \"console-operator-58897d9998-9gfl4\" (UID: \"3ddfcd6f-4387-40b4-9933-4e169797f6da\") " pod="openshift-console-operator/console-operator-58897d9998-9gfl4" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.502109 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-6m2df\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.502132 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-6m2df\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.502151 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9c39245-291c-4f46-88ef-80e78b1c7bae-serving-cert\") pod \"authentication-operator-69f744f599-62hrj\" (UID: \"c9c39245-291c-4f46-88ef-80e78b1c7bae\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-62hrj" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.502178 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9z825\" (UniqueName: \"kubernetes.io/projected/ac17f7ac-8454-45e6-af33-a29113eb0d66-kube-api-access-9z825\") pod \"dns-operator-744455d44c-mjllc\" (UID: \"ac17f7ac-8454-45e6-af33-a29113eb0d66\") " pod="openshift-dns-operator/dns-operator-744455d44c-mjllc" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.502196 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6thft\" (UniqueName: 
\"kubernetes.io/projected/6e521fb1-0565-4f66-a6f0-1b78942e408e-kube-api-access-6thft\") pod \"route-controller-manager-6576b87f9c-f6f2h\" (UID: \"6e521fb1-0565-4f66-a6f0-1b78942e408e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f6f2h" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.502217 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/54811547-c0f2-4b3e-8e07-6b6c878d72ee-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-hqqc6\" (UID: \"54811547-c0f2-4b3e-8e07-6b6c878d72ee\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hqqc6" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.502240 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbqgs\" (UniqueName: \"kubernetes.io/projected/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-kube-api-access-gbqgs\") pod \"oauth-openshift-558db77b4-6m2df\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.502262 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5d68d684-47a3-490a-bafb-9c8f04f0d3fb-audit-dir\") pod \"apiserver-7bbb656c7d-6lsxk\" (UID: \"5d68d684-47a3-490a-bafb-9c8f04f0d3fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lsxk" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.502279 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ddfcd6f-4387-40b4-9933-4e169797f6da-serving-cert\") pod \"console-operator-58897d9998-9gfl4\" (UID: \"3ddfcd6f-4387-40b4-9933-4e169797f6da\") " pod="openshift-console-operator/console-operator-58897d9998-9gfl4" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.502299 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3e7aebe6-3e4f-498f-a696-5e23f9fe313d-node-pullsecrets\") pod \"apiserver-76f77b778f-htccg\" (UID: \"3e7aebe6-3e4f-498f-a696-5e23f9fe313d\") " pod="openshift-apiserver/apiserver-76f77b778f-htccg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.502320 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmzlm\" (UniqueName: \"kubernetes.io/projected/072ec696-c152-40bb-8783-72920846a193-kube-api-access-gmzlm\") pod \"machine-config-operator-74547568cd-kpmsg\" (UID: \"072ec696-c152-40bb-8783-72920846a193\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kpmsg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.502338 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a6bd5d5a-d026-46f4-8467-993d9a1a3a59-srv-cert\") pod \"olm-operator-6b444d44fb-f44h7\" (UID: \"a6bd5d5a-d026-46f4-8467-993d9a1a3a59\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f44h7" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.502355 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ba3f0ebc-a2f7-42bc-9ad9-89a12081fdcf-bound-sa-token\") pod 
\"ingress-operator-5b745b69d9-nngrv\" (UID: \"ba3f0ebc-a2f7-42bc-9ad9-89a12081fdcf\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nngrv" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.502380 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e521fb1-0565-4f66-a6f0-1b78942e408e-config\") pod \"route-controller-manager-6576b87f9c-f6f2h\" (UID: \"6e521fb1-0565-4f66-a6f0-1b78942e408e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f6f2h" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.502400 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbjmb\" (UniqueName: \"kubernetes.io/projected/c9c39245-291c-4f46-88ef-80e78b1c7bae-kube-api-access-jbjmb\") pod \"authentication-operator-69f744f599-62hrj\" (UID: \"c9c39245-291c-4f46-88ef-80e78b1c7bae\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-62hrj" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.502417 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ba3f0ebc-a2f7-42bc-9ad9-89a12081fdcf-trusted-ca\") pod \"ingress-operator-5b745b69d9-nngrv\" (UID: \"ba3f0ebc-a2f7-42bc-9ad9-89a12081fdcf\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nngrv" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.502436 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-audit-policies\") pod \"oauth-openshift-558db77b4-6m2df\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.502469 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9bd46eeb-25a4-4e67-97ad-96c21224fbcd-auth-proxy-config\") pod \"machine-approver-56656f9798-xqdst\" (UID: \"9bd46eeb-25a4-4e67-97ad-96c21224fbcd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xqdst" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.502491 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/daf9759f-1f7d-4613-b734-a39f4552222e-config\") pod \"machine-api-operator-5694c8668f-h48pg\" (UID: \"daf9759f-1f7d-4613-b734-a39f4552222e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h48pg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.502517 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2fe78cc8-8ce0-4cdf-9dcf-a15624194cab-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-m77vv\" (UID: \"2fe78cc8-8ce0-4cdf-9dcf-a15624194cab\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m77vv" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.502548 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4p9k\" (UniqueName: \"kubernetes.io/projected/2fe78cc8-8ce0-4cdf-9dcf-a15624194cab-kube-api-access-d4p9k\") pod \"package-server-manager-789f6589d5-m77vv\" (UID: 
\"2fe78cc8-8ce0-4cdf-9dcf-a15624194cab\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m77vv" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.502565 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/369df47c-55e0-41da-bb67-b99bb189b870-config\") pod \"kube-controller-manager-operator-78b949d7b-8svq7\" (UID: \"369df47c-55e0-41da-bb67-b99bb189b870\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8svq7" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.502620 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7a3c2a78-4ced-43d5-a3b7-25637f36d2fc-console-config\") pod \"console-f9d7485db-s8cm2\" (UID: \"7a3c2a78-4ced-43d5-a3b7-25637f36d2fc\") " pod="openshift-console/console-f9d7485db-s8cm2" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.502638 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4bfdabf8-f787-45ba-916e-a40db8dd9561-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-prlg4\" (UID: \"4bfdabf8-f787-45ba-916e-a40db8dd9561\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-prlg4" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.502655 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ba3f0ebc-a2f7-42bc-9ad9-89a12081fdcf-metrics-tls\") pod \"ingress-operator-5b745b69d9-nngrv\" (UID: \"ba3f0ebc-a2f7-42bc-9ad9-89a12081fdcf\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nngrv" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.502680 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9c39245-291c-4f46-88ef-80e78b1c7bae-config\") pod \"authentication-operator-69f744f599-62hrj\" (UID: \"c9c39245-291c-4f46-88ef-80e78b1c7bae\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-62hrj" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.502689 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5d68d684-47a3-490a-bafb-9c8f04f0d3fb-audit-policies\") pod \"apiserver-7bbb656c7d-6lsxk\" (UID: \"5d68d684-47a3-490a-bafb-9c8f04f0d3fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lsxk" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.502724 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/369df47c-55e0-41da-bb67-b99bb189b870-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-8svq7\" (UID: \"369df47c-55e0-41da-bb67-b99bb189b870\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8svq7" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.502784 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhb94\" (UniqueName: \"kubernetes.io/projected/0da11e6f-c84f-4d72-83cc-9bb32480b3d2-kube-api-access-bhb94\") pod \"openshift-config-operator-7777fb866f-7ltms\" (UID: 
\"0da11e6f-c84f-4d72-83cc-9bb32480b3d2\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ltms" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.502838 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3e7aebe6-3e4f-498f-a696-5e23f9fe313d-audit\") pod \"apiserver-76f77b778f-htccg\" (UID: \"3e7aebe6-3e4f-498f-a696-5e23f9fe313d\") " pod="openshift-apiserver/apiserver-76f77b778f-htccg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.503220 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e7aebe6-3e4f-498f-a696-5e23f9fe313d-config\") pod \"apiserver-76f77b778f-htccg\" (UID: \"3e7aebe6-3e4f-498f-a696-5e23f9fe313d\") " pod="openshift-apiserver/apiserver-76f77b778f-htccg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.503315 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3e7aebe6-3e4f-498f-a696-5e23f9fe313d-audit-dir\") pod \"apiserver-76f77b778f-htccg\" (UID: \"3e7aebe6-3e4f-498f-a696-5e23f9fe313d\") " pod="openshift-apiserver/apiserver-76f77b778f-htccg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.503345 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e521fb1-0565-4f66-a6f0-1b78942e408e-serving-cert\") pod \"route-controller-manager-6576b87f9c-f6f2h\" (UID: \"6e521fb1-0565-4f66-a6f0-1b78942e408e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f6f2h" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.503523 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6f943860-2a4f-44af-9695-4497a2a8fdd8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-k9v6k\" (UID: \"6f943860-2a4f-44af-9695-4497a2a8fdd8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k9v6k" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.503748 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ddfcd6f-4387-40b4-9933-4e169797f6da-config\") pod \"console-operator-58897d9998-9gfl4\" (UID: \"3ddfcd6f-4387-40b4-9933-4e169797f6da\") " pod="openshift-console-operator/console-operator-58897d9998-9gfl4" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.503815 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-htccg"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.503878 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-h48pg"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.503319 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.503819 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e7aebe6-3e4f-498f-a696-5e23f9fe313d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-htccg\" (UID: \"3e7aebe6-3e4f-498f-a696-5e23f9fe313d\") " pod="openshift-apiserver/apiserver-76f77b778f-htccg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.504069 4954 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bd46eeb-25a4-4e67-97ad-96c21224fbcd-config\") pod \"machine-approver-56656f9798-xqdst\" (UID: \"9bd46eeb-25a4-4e67-97ad-96c21224fbcd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xqdst" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.504300 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9c39245-291c-4f46-88ef-80e78b1c7bae-service-ca-bundle\") pod \"authentication-operator-69f744f599-62hrj\" (UID: \"c9c39245-291c-4f46-88ef-80e78b1c7bae\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-62hrj" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.504666 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6m2df"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.505204 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3e7aebe6-3e4f-498f-a696-5e23f9fe313d-node-pullsecrets\") pod \"apiserver-76f77b778f-htccg\" (UID: \"3e7aebe6-3e4f-498f-a696-5e23f9fe313d\") " pod="openshift-apiserver/apiserver-76f77b778f-htccg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.505259 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-6m2df\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.505655 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5d68d684-47a3-490a-bafb-9c8f04f0d3fb-audit-dir\") pod \"apiserver-7bbb656c7d-6lsxk\" (UID: \"5d68d684-47a3-490a-bafb-9c8f04f0d3fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lsxk" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.506594 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5d68d684-47a3-490a-bafb-9c8f04f0d3fb-etcd-client\") pod \"apiserver-7bbb656c7d-6lsxk\" (UID: \"5d68d684-47a3-490a-bafb-9c8f04f0d3fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lsxk" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.506625 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54811547-c0f2-4b3e-8e07-6b6c878d72ee-config\") pod \"kube-apiserver-operator-766d6c64bb-hqqc6\" (UID: \"54811547-c0f2-4b3e-8e07-6b6c878d72ee\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hqqc6" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.506654 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7a3c2a78-4ced-43d5-a3b7-25637f36d2fc-service-ca\") pod \"console-f9d7485db-s8cm2\" (UID: \"7a3c2a78-4ced-43d5-a3b7-25637f36d2fc\") " pod="openshift-console/console-f9d7485db-s8cm2" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.506677 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" 
(UniqueName: \"kubernetes.io/configmap/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-6m2df\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.506707 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9c39245-291c-4f46-88ef-80e78b1c7bae-config\") pod \"authentication-operator-69f744f599-62hrj\" (UID: \"c9c39245-291c-4f46-88ef-80e78b1c7bae\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-62hrj" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.506730 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-6m2df\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.507339 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9bd46eeb-25a4-4e67-97ad-96c21224fbcd-auth-proxy-config\") pod \"machine-approver-56656f9798-xqdst\" (UID: \"9bd46eeb-25a4-4e67-97ad-96c21224fbcd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xqdst" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.507400 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0da11e6f-c84f-4d72-83cc-9bb32480b3d2-available-featuregates\") pod \"openshift-config-operator-7777fb866f-7ltms\" (UID: \"0da11e6f-c84f-4d72-83cc-9bb32480b3d2\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ltms" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.507635 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-n2fzm"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.507650 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fx24\" (UniqueName: \"kubernetes.io/projected/6f943860-2a4f-44af-9695-4497a2a8fdd8-kube-api-access-9fx24\") pod \"cluster-image-registry-operator-dc59b4c8b-k9v6k\" (UID: \"6f943860-2a4f-44af-9695-4497a2a8fdd8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k9v6k" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.507847 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqgwl\" (UniqueName: \"kubernetes.io/projected/58db67ba-0f90-4190-8beb-02489a6e6a1a-kube-api-access-rqgwl\") pod \"migrator-59844c95c7-6h576\" (UID: \"58db67ba-0f90-4190-8beb-02489a6e6a1a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6h576" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.507887 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xmpc\" (UniqueName: \"kubernetes.io/projected/56bfbfaa-4d26-4361-87fc-dab870bdff96-kube-api-access-4xmpc\") pod \"cluster-samples-operator-665b6dd947-mn7g4\" (UID: \"56bfbfaa-4d26-4361-87fc-dab870bdff96\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mn7g4" Nov 27 16:40:28 crc 
kubenswrapper[4954]: I1127 16:40:28.507911 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a3c2a78-4ced-43d5-a3b7-25637f36d2fc-trusted-ca-bundle\") pod \"console-f9d7485db-s8cm2\" (UID: \"7a3c2a78-4ced-43d5-a3b7-25637f36d2fc\") " pod="openshift-console/console-f9d7485db-s8cm2" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.507946 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b335decc-f67f-47e1-bee9-8d3033151b92-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-lfnws\" (UID: \"b335decc-f67f-47e1-bee9-8d3033151b92\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lfnws" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.507990 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-audit-dir\") pod \"oauth-openshift-558db77b4-6m2df\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.508030 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5d68d684-47a3-490a-bafb-9c8f04f0d3fb-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-6lsxk\" (UID: \"5d68d684-47a3-490a-bafb-9c8f04f0d3fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lsxk" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.508058 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4svv\" (UniqueName: \"kubernetes.io/projected/a6bd5d5a-d026-46f4-8467-993d9a1a3a59-kube-api-access-c4svv\") pod \"olm-operator-6b444d44fb-f44h7\" (UID: \"a6bd5d5a-d026-46f4-8467-993d9a1a3a59\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f44h7" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.508114 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3e7aebe6-3e4f-498f-a696-5e23f9fe313d-audit-dir\") pod \"apiserver-76f77b778f-htccg\" (UID: \"3e7aebe6-3e4f-498f-a696-5e23f9fe313d\") " pod="openshift-apiserver/apiserver-76f77b778f-htccg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.508187 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-audit-dir\") pod \"oauth-openshift-558db77b4-6m2df\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.508190 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3e7aebe6-3e4f-498f-a696-5e23f9fe313d-image-import-ca\") pod \"apiserver-76f77b778f-htccg\" (UID: \"3e7aebe6-3e4f-498f-a696-5e23f9fe313d\") " pod="openshift-apiserver/apiserver-76f77b778f-htccg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.508301 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/daf9759f-1f7d-4613-b734-a39f4552222e-config\") pod 
\"machine-api-operator-5694c8668f-h48pg\" (UID: \"daf9759f-1f7d-4613-b734-a39f4552222e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h48pg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.508384 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e521fb1-0565-4f66-a6f0-1b78942e408e-config\") pod \"route-controller-manager-6576b87f9c-f6f2h\" (UID: \"6e521fb1-0565-4f66-a6f0-1b78942e408e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f6f2h" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.508478 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pjxp\" (UniqueName: \"kubernetes.io/projected/ba3f0ebc-a2f7-42bc-9ad9-89a12081fdcf-kube-api-access-4pjxp\") pod \"ingress-operator-5b745b69d9-nngrv\" (UID: \"ba3f0ebc-a2f7-42bc-9ad9-89a12081fdcf\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nngrv" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.508607 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9c39245-291c-4f46-88ef-80e78b1c7bae-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-62hrj\" (UID: \"c9c39245-291c-4f46-88ef-80e78b1c7bae\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-62hrj" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.508655 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6f943860-2a4f-44af-9695-4497a2a8fdd8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-k9v6k\" (UID: \"6f943860-2a4f-44af-9695-4497a2a8fdd8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k9v6k" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.508778 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-6lsxk"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.508848 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5d68d684-47a3-490a-bafb-9c8f04f0d3fb-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-6lsxk\" (UID: \"5d68d684-47a3-490a-bafb-9c8f04f0d3fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lsxk" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.508914 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-audit-policies\") pod \"oauth-openshift-558db77b4-6m2df\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.509015 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6slx\" (UniqueName: \"kubernetes.io/projected/9bd46eeb-25a4-4e67-97ad-96c21224fbcd-kube-api-access-x6slx\") pod \"machine-approver-56656f9798-xqdst\" (UID: \"9bd46eeb-25a4-4e67-97ad-96c21224fbcd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xqdst" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.509320 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/0da11e6f-c84f-4d72-83cc-9bb32480b3d2-available-featuregates\") pod \"openshift-config-operator-7777fb866f-7ltms\" (UID: \"0da11e6f-c84f-4d72-83cc-9bb32480b3d2\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ltms" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.509419 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99ggq\" (UniqueName: \"kubernetes.io/projected/7a3c2a78-4ced-43d5-a3b7-25637f36d2fc-kube-api-access-99ggq\") pod \"console-f9d7485db-s8cm2\" (UID: \"7a3c2a78-4ced-43d5-a3b7-25637f36d2fc\") " pod="openshift-console/console-f9d7485db-s8cm2" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.509482 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58eb1d01-1f82-43fa-8ace-86368d05ec71-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-bp7nq\" (UID: \"58eb1d01-1f82-43fa-8ace-86368d05ec71\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bp7nq" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.509442 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3e7aebe6-3e4f-498f-a696-5e23f9fe313d-image-import-ca\") pod \"apiserver-76f77b778f-htccg\" (UID: \"3e7aebe6-3e4f-498f-a696-5e23f9fe313d\") " pod="openshift-apiserver/apiserver-76f77b778f-htccg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.509574 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3e7aebe6-3e4f-498f-a696-5e23f9fe313d-etcd-client\") pod \"apiserver-76f77b778f-htccg\" (UID: \"3e7aebe6-3e4f-498f-a696-5e23f9fe313d\") " pod="openshift-apiserver/apiserver-76f77b778f-htccg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.509633 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-6m2df\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.509667 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/daf9759f-1f7d-4613-b734-a39f4552222e-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-h48pg\" (UID: \"daf9759f-1f7d-4613-b734-a39f4552222e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h48pg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.509719 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3ddfcd6f-4387-40b4-9933-4e169797f6da-trusted-ca\") pod \"console-operator-58897d9998-9gfl4\" (UID: \"3ddfcd6f-4387-40b4-9933-4e169797f6da\") " pod="openshift-console-operator/console-operator-58897d9998-9gfl4" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.509746 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/369df47c-55e0-41da-bb67-b99bb189b870-kube-api-access\") pod 
\"kube-controller-manager-operator-78b949d7b-8svq7\" (UID: \"369df47c-55e0-41da-bb67-b99bb189b870\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8svq7" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.509884 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgwwt\" (UniqueName: \"kubernetes.io/projected/b335decc-f67f-47e1-bee9-8d3033151b92-kube-api-access-fgwwt\") pod \"kube-storage-version-migrator-operator-b67b599dd-lfnws\" (UID: \"b335decc-f67f-47e1-bee9-8d3033151b92\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lfnws" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.509917 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/56bfbfaa-4d26-4361-87fc-dab870bdff96-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-mn7g4\" (UID: \"56bfbfaa-4d26-4361-87fc-dab870bdff96\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mn7g4" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.509944 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7a3c2a78-4ced-43d5-a3b7-25637f36d2fc-console-oauth-config\") pod \"console-f9d7485db-s8cm2\" (UID: \"7a3c2a78-4ced-43d5-a3b7-25637f36d2fc\") " pod="openshift-console/console-f9d7485db-s8cm2" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.509967 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7a3c2a78-4ced-43d5-a3b7-25637f36d2fc-oauth-serving-cert\") pod \"console-f9d7485db-s8cm2\" (UID: \"7a3c2a78-4ced-43d5-a3b7-25637f36d2fc\") " pod="openshift-console/console-f9d7485db-s8cm2" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.510003 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgg9p\" (UniqueName: \"kubernetes.io/projected/5d68d684-47a3-490a-bafb-9c8f04f0d3fb-kube-api-access-kgg9p\") pod \"apiserver-7bbb656c7d-6lsxk\" (UID: \"5d68d684-47a3-490a-bafb-9c8f04f0d3fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lsxk" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.510032 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d68d684-47a3-490a-bafb-9c8f04f0d3fb-serving-cert\") pod \"apiserver-7bbb656c7d-6lsxk\" (UID: \"5d68d684-47a3-490a-bafb-9c8f04f0d3fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lsxk" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.510056 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e7aebe6-3e4f-498f-a696-5e23f9fe313d-serving-cert\") pod \"apiserver-76f77b778f-htccg\" (UID: \"3e7aebe6-3e4f-498f-a696-5e23f9fe313d\") " pod="openshift-apiserver/apiserver-76f77b778f-htccg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.510081 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/072ec696-c152-40bb-8783-72920846a193-proxy-tls\") pod \"machine-config-operator-74547568cd-kpmsg\" (UID: \"072ec696-c152-40bb-8783-72920846a193\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kpmsg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.510112 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-6m2df\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.510139 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d68d684-47a3-490a-bafb-9c8f04f0d3fb-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6lsxk\" (UID: \"5d68d684-47a3-490a-bafb-9c8f04f0d3fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lsxk" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.510136 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-6m2df\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.510178 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3e7aebe6-3e4f-498f-a696-5e23f9fe313d-etcd-serving-ca\") pod \"apiserver-76f77b778f-htccg\" (UID: \"3e7aebe6-3e4f-498f-a696-5e23f9fe313d\") " pod="openshift-apiserver/apiserver-76f77b778f-htccg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.510206 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6e521fb1-0565-4f66-a6f0-1b78942e408e-client-ca\") pod \"route-controller-manager-6576b87f9c-f6f2h\" (UID: \"6e521fb1-0565-4f66-a6f0-1b78942e408e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f6f2h" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.510212 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9c39245-291c-4f46-88ef-80e78b1c7bae-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-62hrj\" (UID: \"c9c39245-291c-4f46-88ef-80e78b1c7bae\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-62hrj" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.510236 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a6bd5d5a-d026-46f4-8467-993d9a1a3a59-profile-collector-cert\") pod \"olm-operator-6b444d44fb-f44h7\" (UID: \"a6bd5d5a-d026-46f4-8467-993d9a1a3a59\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f44h7" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.510727 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0da11e6f-c84f-4d72-83cc-9bb32480b3d2-serving-cert\") pod \"openshift-config-operator-7777fb866f-7ltms\" (UID: \"0da11e6f-c84f-4d72-83cc-9bb32480b3d2\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ltms" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.510877 4954 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-6m2df\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.510979 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3e7aebe6-3e4f-498f-a696-5e23f9fe313d-etcd-serving-ca\") pod \"apiserver-76f77b778f-htccg\" (UID: \"3e7aebe6-3e4f-498f-a696-5e23f9fe313d\") " pod="openshift-apiserver/apiserver-76f77b778f-htccg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.511000 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3e7aebe6-3e4f-498f-a696-5e23f9fe313d-encryption-config\") pod \"apiserver-76f77b778f-htccg\" (UID: \"3e7aebe6-3e4f-498f-a696-5e23f9fe313d\") " pod="openshift-apiserver/apiserver-76f77b778f-htccg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.511328 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-6m2df\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.511375 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/daf9759f-1f7d-4613-b734-a39f4552222e-images\") pod \"machine-api-operator-5694c8668f-h48pg\" (UID: \"daf9759f-1f7d-4613-b734-a39f4552222e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h48pg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.511341 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d68d684-47a3-490a-bafb-9c8f04f0d3fb-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6lsxk\" (UID: \"5d68d684-47a3-490a-bafb-9c8f04f0d3fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lsxk" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.511475 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmklk\" (UniqueName: \"kubernetes.io/projected/4bfdabf8-f787-45ba-916e-a40db8dd9561-kube-api-access-nmklk\") pod \"machine-config-controller-84d6567774-prlg4\" (UID: \"4bfdabf8-f787-45ba-916e-a40db8dd9561\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-prlg4" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.511522 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-6m2df\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.511596 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-6m2df\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.511629 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5d68d684-47a3-490a-bafb-9c8f04f0d3fb-encryption-config\") pod \"apiserver-7bbb656c7d-6lsxk\" (UID: \"5d68d684-47a3-490a-bafb-9c8f04f0d3fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lsxk" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.511727 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/072ec696-c152-40bb-8783-72920846a193-images\") pod \"machine-config-operator-74547568cd-kpmsg\" (UID: \"072ec696-c152-40bb-8783-72920846a193\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kpmsg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.511762 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b335decc-f67f-47e1-bee9-8d3033151b92-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-lfnws\" (UID: \"b335decc-f67f-47e1-bee9-8d3033151b92\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lfnws" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.511894 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ac17f7ac-8454-45e6-af33-a29113eb0d66-metrics-tls\") pod \"dns-operator-744455d44c-mjllc\" (UID: \"ac17f7ac-8454-45e6-af33-a29113eb0d66\") " pod="openshift-dns-operator/dns-operator-744455d44c-mjllc" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.512353 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3ddfcd6f-4387-40b4-9933-4e169797f6da-trusted-ca\") pod \"console-operator-58897d9998-9gfl4\" (UID: \"3ddfcd6f-4387-40b4-9933-4e169797f6da\") " pod="openshift-console-operator/console-operator-58897d9998-9gfl4" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.512375 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6e521fb1-0565-4f66-a6f0-1b78942e408e-client-ca\") pod \"route-controller-manager-6576b87f9c-f6f2h\" (UID: \"6e521fb1-0565-4f66-a6f0-1b78942e408e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f6f2h" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.513573 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6f943860-2a4f-44af-9695-4497a2a8fdd8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-k9v6k\" (UID: \"6f943860-2a4f-44af-9695-4497a2a8fdd8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k9v6k" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.513654 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/daf9759f-1f7d-4613-b734-a39f4552222e-images\") pod \"machine-api-operator-5694c8668f-h48pg\" (UID: 
\"daf9759f-1f7d-4613-b734-a39f4552222e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h48pg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.513711 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-6m2df\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.513991 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-6m2df\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.514014 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-6m2df\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.514062 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/9bd46eeb-25a4-4e67-97ad-96c21224fbcd-machine-approver-tls\") pod \"machine-approver-56656f9798-xqdst\" (UID: \"9bd46eeb-25a4-4e67-97ad-96c21224fbcd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xqdst" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.515341 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/56bfbfaa-4d26-4361-87fc-dab870bdff96-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-mn7g4\" (UID: \"56bfbfaa-4d26-4361-87fc-dab870bdff96\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mn7g4" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.519453 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-6m2df\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.519634 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e7aebe6-3e4f-498f-a696-5e23f9fe313d-serving-cert\") pod \"apiserver-76f77b778f-htccg\" (UID: \"3e7aebe6-3e4f-498f-a696-5e23f9fe313d\") " pod="openshift-apiserver/apiserver-76f77b778f-htccg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.519817 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ddfcd6f-4387-40b4-9933-4e169797f6da-serving-cert\") pod \"console-operator-58897d9998-9gfl4\" (UID: \"3ddfcd6f-4387-40b4-9933-4e169797f6da\") " pod="openshift-console-operator/console-operator-58897d9998-9gfl4" Nov 27 16:40:28 crc 
kubenswrapper[4954]: I1127 16:40:28.519932 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d68d684-47a3-490a-bafb-9c8f04f0d3fb-serving-cert\") pod \"apiserver-7bbb656c7d-6lsxk\" (UID: \"5d68d684-47a3-490a-bafb-9c8f04f0d3fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lsxk" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.520173 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5d68d684-47a3-490a-bafb-9c8f04f0d3fb-encryption-config\") pod \"apiserver-7bbb656c7d-6lsxk\" (UID: \"5d68d684-47a3-490a-bafb-9c8f04f0d3fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lsxk" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.520195 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3e7aebe6-3e4f-498f-a696-5e23f9fe313d-etcd-client\") pod \"apiserver-76f77b778f-htccg\" (UID: \"3e7aebe6-3e4f-498f-a696-5e23f9fe313d\") " pod="openshift-apiserver/apiserver-76f77b778f-htccg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.520362 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-6m2df\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.520659 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/daf9759f-1f7d-4613-b734-a39f4552222e-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-h48pg\" (UID: \"daf9759f-1f7d-4613-b734-a39f4552222e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h48pg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.520722 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-6m2df\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.521007 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-6m2df\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.521081 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0da11e6f-c84f-4d72-83cc-9bb32480b3d2-serving-cert\") pod \"openshift-config-operator-7777fb866f-7ltms\" (UID: \"0da11e6f-c84f-4d72-83cc-9bb32480b3d2\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ltms" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.521330 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-6m2df\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.521846 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e521fb1-0565-4f66-a6f0-1b78942e408e-serving-cert\") pod \"route-controller-manager-6576b87f9c-f6f2h\" (UID: \"6e521fb1-0565-4f66-a6f0-1b78942e408e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f6f2h" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.522213 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-6m2df\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.522260 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-xl5dh"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.525105 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6h576"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.525322 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-xl5dh" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.525764 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.527227 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-cc84q"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.528976 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-kpmsg"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.531435 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bp7nq"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.533251 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5d68d684-47a3-490a-bafb-9c8f04f0d3fb-etcd-client\") pod \"apiserver-7bbb656c7d-6lsxk\" (UID: \"5d68d684-47a3-490a-bafb-9c8f04f0d3fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lsxk" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.534238 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m77vv"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.535465 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fh26g"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.536554 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mjllc"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.537689 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["hostpath-provisioner/csi-hostpathplugin-xl5dh"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.540323 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-qm5wz"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.541060 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-4rd2p"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.541169 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-qm5wz" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.541734 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4rd2p" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.541782 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-m78xr"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.542810 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hqqc6"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.543067 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.543986 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k9v6k"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.545706 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cv9bx"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.545970 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qmz7n"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.547156 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-59sgd"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.548744 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tj896"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.549978 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-nngrv"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.551485 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-f9qbf"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.552085 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4rd2p"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.553156 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xnp9p"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.554534 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404350-52bhz"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.555386 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zmv7j"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 
16:40:28.556552 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-t8hmz"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.557649 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-5clct"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.558736 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5clct" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.559765 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5clct"] Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.562853 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.583454 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.601939 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.612428 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25fce237-9917-431e-b345-ef24a715fd12-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-tj896\" (UID: \"25fce237-9917-431e-b345-ef24a715fd12\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tj896" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.612473 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2e80b2c6-58fd-466e-a5fe-d16bf1f4c7bd-signing-cabundle\") pod \"service-ca-9c57cc56f-t8hmz\" (UID: \"2e80b2c6-58fd-466e-a5fe-d16bf1f4c7bd\") " pod="openshift-service-ca/service-ca-9c57cc56f-t8hmz" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.612496 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ac0eaee-22bb-4194-bb62-0622b65c778b-config\") pod \"service-ca-operator-777779d784-f9qbf\" (UID: \"7ac0eaee-22bb-4194-bb62-0622b65c778b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-f9qbf" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.612520 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a6bd5d5a-d026-46f4-8467-993d9a1a3a59-srv-cert\") pod \"olm-operator-6b444d44fb-f44h7\" (UID: \"a6bd5d5a-d026-46f4-8467-993d9a1a3a59\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f44h7" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.612539 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ba3f0ebc-a2f7-42bc-9ad9-89a12081fdcf-bound-sa-token\") pod \"ingress-operator-5b745b69d9-nngrv\" (UID: \"ba3f0ebc-a2f7-42bc-9ad9-89a12081fdcf\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nngrv" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.612573 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/369df47c-55e0-41da-bb67-b99bb189b870-config\") pod \"kube-controller-manager-operator-78b949d7b-8svq7\" (UID: \"369df47c-55e0-41da-bb67-b99bb189b870\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8svq7" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.612614 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7a3c2a78-4ced-43d5-a3b7-25637f36d2fc-console-config\") pod \"console-f9d7485db-s8cm2\" (UID: \"7a3c2a78-4ced-43d5-a3b7-25637f36d2fc\") " pod="openshift-console/console-f9d7485db-s8cm2" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.612633 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/369df47c-55e0-41da-bb67-b99bb189b870-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-8svq7\" (UID: \"369df47c-55e0-41da-bb67-b99bb189b870\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8svq7" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.612657 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a3c2a78-4ced-43d5-a3b7-25637f36d2fc-trusted-ca-bundle\") pod \"console-f9d7485db-s8cm2\" (UID: \"7a3c2a78-4ced-43d5-a3b7-25637f36d2fc\") " pod="openshift-console/console-f9d7485db-s8cm2" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.612674 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4svv\" (UniqueName: \"kubernetes.io/projected/a6bd5d5a-d026-46f4-8467-993d9a1a3a59-kube-api-access-c4svv\") pod \"olm-operator-6b444d44fb-f44h7\" (UID: \"a6bd5d5a-d026-46f4-8467-993d9a1a3a59\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f44h7" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.612692 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnfx7\" (UniqueName: \"kubernetes.io/projected/7ac0eaee-22bb-4194-bb62-0622b65c778b-kube-api-access-vnfx7\") pod \"service-ca-operator-777779d784-f9qbf\" (UID: \"7ac0eaee-22bb-4194-bb62-0622b65c778b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-f9qbf" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.612727 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/80debf9d-f71d-491f-b914-82597c9d3162-default-certificate\") pod \"router-default-5444994796-2jvzc\" (UID: \"80debf9d-f71d-491f-b914-82597c9d3162\") " pod="openshift-ingress/router-default-5444994796-2jvzc" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.612750 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7a3c2a78-4ced-43d5-a3b7-25637f36d2fc-oauth-serving-cert\") pod \"console-f9d7485db-s8cm2\" (UID: \"7a3c2a78-4ced-43d5-a3b7-25637f36d2fc\") " pod="openshift-console/console-f9d7485db-s8cm2" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.612777 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/072ec696-c152-40bb-8783-72920846a193-proxy-tls\") pod \"machine-config-operator-74547568cd-kpmsg\" (UID: 
\"072ec696-c152-40bb-8783-72920846a193\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kpmsg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.612796 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6k8l\" (UniqueName: \"kubernetes.io/projected/2e80b2c6-58fd-466e-a5fe-d16bf1f4c7bd-kube-api-access-w6k8l\") pod \"service-ca-9c57cc56f-t8hmz\" (UID: \"2e80b2c6-58fd-466e-a5fe-d16bf1f4c7bd\") " pod="openshift-service-ca/service-ca-9c57cc56f-t8hmz" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.612814 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a6bd5d5a-d026-46f4-8467-993d9a1a3a59-profile-collector-cert\") pod \"olm-operator-6b444d44fb-f44h7\" (UID: \"a6bd5d5a-d026-46f4-8467-993d9a1a3a59\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f44h7" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.612833 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ce757d32-af85-4142-9beb-95ac115d61d7-apiservice-cert\") pod \"packageserver-d55dfcdfc-cv9bx\" (UID: \"ce757d32-af85-4142-9beb-95ac115d61d7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cv9bx" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.612851 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnd8x\" (UniqueName: \"kubernetes.io/projected/a08ef380-6670-415c-9861-71c9161f1a4c-kube-api-access-lnd8x\") pod \"marketplace-operator-79b997595-qmz7n\" (UID: \"a08ef380-6670-415c-9861-71c9161f1a4c\") " pod="openshift-marketplace/marketplace-operator-79b997595-qmz7n" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.612870 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80debf9d-f71d-491f-b914-82597c9d3162-metrics-certs\") pod \"router-default-5444994796-2jvzc\" (UID: \"80debf9d-f71d-491f-b914-82597c9d3162\") " pod="openshift-ingress/router-default-5444994796-2jvzc" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.612888 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb2nw\" (UniqueName: \"kubernetes.io/projected/ce757d32-af85-4142-9beb-95ac115d61d7-kube-api-access-xb2nw\") pod \"packageserver-d55dfcdfc-cv9bx\" (UID: \"ce757d32-af85-4142-9beb-95ac115d61d7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cv9bx" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.612906 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/072ec696-c152-40bb-8783-72920846a193-images\") pod \"machine-config-operator-74547568cd-kpmsg\" (UID: \"072ec696-c152-40bb-8783-72920846a193\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kpmsg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.612926 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54811547-c0f2-4b3e-8e07-6b6c878d72ee-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-hqqc6\" (UID: \"54811547-c0f2-4b3e-8e07-6b6c878d72ee\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hqqc6" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.612967 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9jck\" (UniqueName: \"kubernetes.io/projected/25fce237-9917-431e-b345-ef24a715fd12-kube-api-access-h9jck\") pod \"openshift-apiserver-operator-796bbdcf4f-tj896\" (UID: \"25fce237-9917-431e-b345-ef24a715fd12\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tj896" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.612988 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/072ec696-c152-40bb-8783-72920846a193-auth-proxy-config\") pod \"machine-config-operator-74547568cd-kpmsg\" (UID: \"072ec696-c152-40bb-8783-72920846a193\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kpmsg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.613054 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ce757d32-af85-4142-9beb-95ac115d61d7-webhook-cert\") pod \"packageserver-d55dfcdfc-cv9bx\" (UID: \"ce757d32-af85-4142-9beb-95ac115d61d7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cv9bx" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.613082 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/54811547-c0f2-4b3e-8e07-6b6c878d72ee-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-hqqc6\" (UID: \"54811547-c0f2-4b3e-8e07-6b6c878d72ee\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hqqc6" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.613105 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a08ef380-6670-415c-9861-71c9161f1a4c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qmz7n\" (UID: \"a08ef380-6670-415c-9861-71c9161f1a4c\") " pod="openshift-marketplace/marketplace-operator-79b997595-qmz7n" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.613123 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25fce237-9917-431e-b345-ef24a715fd12-config\") pod \"openshift-apiserver-operator-796bbdcf4f-tj896\" (UID: \"25fce237-9917-431e-b345-ef24a715fd12\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tj896" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.613149 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmzlm\" (UniqueName: \"kubernetes.io/projected/072ec696-c152-40bb-8783-72920846a193-kube-api-access-gmzlm\") pod \"machine-config-operator-74547568cd-kpmsg\" (UID: \"072ec696-c152-40bb-8783-72920846a193\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kpmsg" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.613168 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ba3f0ebc-a2f7-42bc-9ad9-89a12081fdcf-trusted-ca\") pod \"ingress-operator-5b745b69d9-nngrv\" (UID: \"ba3f0ebc-a2f7-42bc-9ad9-89a12081fdcf\") 
" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nngrv" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.613210 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2fe78cc8-8ce0-4cdf-9dcf-a15624194cab-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-m77vv\" (UID: \"2fe78cc8-8ce0-4cdf-9dcf-a15624194cab\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m77vv" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.613229 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4p9k\" (UniqueName: \"kubernetes.io/projected/2fe78cc8-8ce0-4cdf-9dcf-a15624194cab-kube-api-access-d4p9k\") pod \"package-server-manager-789f6589d5-m77vv\" (UID: \"2fe78cc8-8ce0-4cdf-9dcf-a15624194cab\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m77vv" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.613247 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4bfdabf8-f787-45ba-916e-a40db8dd9561-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-prlg4\" (UID: \"4bfdabf8-f787-45ba-916e-a40db8dd9561\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-prlg4" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.613265 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ba3f0ebc-a2f7-42bc-9ad9-89a12081fdcf-metrics-tls\") pod \"ingress-operator-5b745b69d9-nngrv\" (UID: \"ba3f0ebc-a2f7-42bc-9ad9-89a12081fdcf\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nngrv" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.613283 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2e80b2c6-58fd-466e-a5fe-d16bf1f4c7bd-signing-key\") pod \"service-ca-9c57cc56f-t8hmz\" (UID: \"2e80b2c6-58fd-466e-a5fe-d16bf1f4c7bd\") " pod="openshift-service-ca/service-ca-9c57cc56f-t8hmz" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.613314 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54811547-c0f2-4b3e-8e07-6b6c878d72ee-config\") pod \"kube-apiserver-operator-766d6c64bb-hqqc6\" (UID: \"54811547-c0f2-4b3e-8e07-6b6c878d72ee\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hqqc6" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.613330 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7a3c2a78-4ced-43d5-a3b7-25637f36d2fc-service-ca\") pod \"console-f9d7485db-s8cm2\" (UID: \"7a3c2a78-4ced-43d5-a3b7-25637f36d2fc\") " pod="openshift-console/console-f9d7485db-s8cm2" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.613350 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqgwl\" (UniqueName: \"kubernetes.io/projected/58db67ba-0f90-4190-8beb-02489a6e6a1a-kube-api-access-rqgwl\") pod \"migrator-59844c95c7-6h576\" (UID: \"58db67ba-0f90-4190-8beb-02489a6e6a1a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6h576" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 
16:40:28.613380 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b335decc-f67f-47e1-bee9-8d3033151b92-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-lfnws\" (UID: \"b335decc-f67f-47e1-bee9-8d3033151b92\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lfnws" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.613397 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkkss\" (UniqueName: \"kubernetes.io/projected/80debf9d-f71d-491f-b914-82597c9d3162-kube-api-access-nkkss\") pod \"router-default-5444994796-2jvzc\" (UID: \"80debf9d-f71d-491f-b914-82597c9d3162\") " pod="openshift-ingress/router-default-5444994796-2jvzc" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.613424 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pjxp\" (UniqueName: \"kubernetes.io/projected/ba3f0ebc-a2f7-42bc-9ad9-89a12081fdcf-kube-api-access-4pjxp\") pod \"ingress-operator-5b745b69d9-nngrv\" (UID: \"ba3f0ebc-a2f7-42bc-9ad9-89a12081fdcf\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nngrv" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.613442 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a08ef380-6670-415c-9861-71c9161f1a4c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qmz7n\" (UID: \"a08ef380-6670-415c-9861-71c9161f1a4c\") " pod="openshift-marketplace/marketplace-operator-79b997595-qmz7n" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.613463 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99ggq\" (UniqueName: \"kubernetes.io/projected/7a3c2a78-4ced-43d5-a3b7-25637f36d2fc-kube-api-access-99ggq\") pod \"console-f9d7485db-s8cm2\" (UID: \"7a3c2a78-4ced-43d5-a3b7-25637f36d2fc\") " pod="openshift-console/console-f9d7485db-s8cm2" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.613481 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58eb1d01-1f82-43fa-8ace-86368d05ec71-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-bp7nq\" (UID: \"58eb1d01-1f82-43fa-8ace-86368d05ec71\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bp7nq" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.613510 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/369df47c-55e0-41da-bb67-b99bb189b870-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-8svq7\" (UID: \"369df47c-55e0-41da-bb67-b99bb189b870\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8svq7" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.613528 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgwwt\" (UniqueName: \"kubernetes.io/projected/b335decc-f67f-47e1-bee9-8d3033151b92-kube-api-access-fgwwt\") pod \"kube-storage-version-migrator-operator-b67b599dd-lfnws\" (UID: \"b335decc-f67f-47e1-bee9-8d3033151b92\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lfnws" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.613545 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7a3c2a78-4ced-43d5-a3b7-25637f36d2fc-console-oauth-config\") pod \"console-f9d7485db-s8cm2\" (UID: \"7a3c2a78-4ced-43d5-a3b7-25637f36d2fc\") " pod="openshift-console/console-f9d7485db-s8cm2" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.613604 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80debf9d-f71d-491f-b914-82597c9d3162-service-ca-bundle\") pod \"router-default-5444994796-2jvzc\" (UID: \"80debf9d-f71d-491f-b914-82597c9d3162\") " pod="openshift-ingress/router-default-5444994796-2jvzc" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.613625 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmklk\" (UniqueName: \"kubernetes.io/projected/4bfdabf8-f787-45ba-916e-a40db8dd9561-kube-api-access-nmklk\") pod \"machine-config-controller-84d6567774-prlg4\" (UID: \"4bfdabf8-f787-45ba-916e-a40db8dd9561\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-prlg4" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.613643 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ce757d32-af85-4142-9beb-95ac115d61d7-tmpfs\") pod \"packageserver-d55dfcdfc-cv9bx\" (UID: \"ce757d32-af85-4142-9beb-95ac115d61d7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cv9bx" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.613664 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b335decc-f67f-47e1-bee9-8d3033151b92-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-lfnws\" (UID: \"b335decc-f67f-47e1-bee9-8d3033151b92\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lfnws" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.613682 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ac0eaee-22bb-4194-bb62-0622b65c778b-serving-cert\") pod \"service-ca-operator-777779d784-f9qbf\" (UID: \"7ac0eaee-22bb-4194-bb62-0622b65c778b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-f9qbf" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.613709 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4bfdabf8-f787-45ba-916e-a40db8dd9561-proxy-tls\") pod \"machine-config-controller-84d6567774-prlg4\" (UID: \"4bfdabf8-f787-45ba-916e-a40db8dd9561\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-prlg4" Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.613730 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p942l\" (UniqueName: \"kubernetes.io/projected/6606df87-becb-460d-8579-22c5eb23e71a-kube-api-access-p942l\") pod \"downloads-7954f5f757-m78xr\" (UID: \"6606df87-becb-460d-8579-22c5eb23e71a\") " pod="openshift-console/downloads-7954f5f757-m78xr" 
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.613750 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58eb1d01-1f82-43fa-8ace-86368d05ec71-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-bp7nq\" (UID: \"58eb1d01-1f82-43fa-8ace-86368d05ec71\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bp7nq"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.613770 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcl54\" (UniqueName: \"kubernetes.io/projected/58eb1d01-1f82-43fa-8ace-86368d05ec71-kube-api-access-hcl54\") pod \"openshift-controller-manager-operator-756b6f6bc6-bp7nq\" (UID: \"58eb1d01-1f82-43fa-8ace-86368d05ec71\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bp7nq"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.613798 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7a3c2a78-4ced-43d5-a3b7-25637f36d2fc-console-serving-cert\") pod \"console-f9d7485db-s8cm2\" (UID: \"7a3c2a78-4ced-43d5-a3b7-25637f36d2fc\") " pod="openshift-console/console-f9d7485db-s8cm2"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.613818 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/80debf9d-f71d-491f-b914-82597c9d3162-stats-auth\") pod \"router-default-5444994796-2jvzc\" (UID: \"80debf9d-f71d-491f-b914-82597c9d3162\") " pod="openshift-ingress/router-default-5444994796-2jvzc"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.614701 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/369df47c-55e0-41da-bb67-b99bb189b870-config\") pod \"kube-controller-manager-operator-78b949d7b-8svq7\" (UID: \"369df47c-55e0-41da-bb67-b99bb189b870\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8svq7"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.615238 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7a3c2a78-4ced-43d5-a3b7-25637f36d2fc-console-config\") pod \"console-f9d7485db-s8cm2\" (UID: \"7a3c2a78-4ced-43d5-a3b7-25637f36d2fc\") " pod="openshift-console/console-f9d7485db-s8cm2"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.616116 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54811547-c0f2-4b3e-8e07-6b6c878d72ee-config\") pod \"kube-apiserver-operator-766d6c64bb-hqqc6\" (UID: \"54811547-c0f2-4b3e-8e07-6b6c878d72ee\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hqqc6"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.616814 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7a3c2a78-4ced-43d5-a3b7-25637f36d2fc-service-ca\") pod \"console-f9d7485db-s8cm2\" (UID: \"7a3c2a78-4ced-43d5-a3b7-25637f36d2fc\") " pod="openshift-console/console-f9d7485db-s8cm2"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.616995 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/072ec696-c152-40bb-8783-72920846a193-auth-proxy-config\") pod \"machine-config-operator-74547568cd-kpmsg\" (UID: \"072ec696-c152-40bb-8783-72920846a193\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kpmsg"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.617534 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a3c2a78-4ced-43d5-a3b7-25637f36d2fc-trusted-ca-bundle\") pod \"console-f9d7485db-s8cm2\" (UID: \"7a3c2a78-4ced-43d5-a3b7-25637f36d2fc\") " pod="openshift-console/console-f9d7485db-s8cm2"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.618437 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7a3c2a78-4ced-43d5-a3b7-25637f36d2fc-oauth-serving-cert\") pod \"console-f9d7485db-s8cm2\" (UID: \"7a3c2a78-4ced-43d5-a3b7-25637f36d2fc\") " pod="openshift-console/console-f9d7485db-s8cm2"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.618517 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ba3f0ebc-a2f7-42bc-9ad9-89a12081fdcf-trusted-ca\") pod \"ingress-operator-5b745b69d9-nngrv\" (UID: \"ba3f0ebc-a2f7-42bc-9ad9-89a12081fdcf\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nngrv"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.623098 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58eb1d01-1f82-43fa-8ace-86368d05ec71-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-bp7nq\" (UID: \"58eb1d01-1f82-43fa-8ace-86368d05ec71\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bp7nq"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.623976 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58eb1d01-1f82-43fa-8ace-86368d05ec71-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-bp7nq\" (UID: \"58eb1d01-1f82-43fa-8ace-86368d05ec71\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bp7nq"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.626913 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b335decc-f67f-47e1-bee9-8d3033151b92-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-lfnws\" (UID: \"b335decc-f67f-47e1-bee9-8d3033151b92\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lfnws"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.628499 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7a3c2a78-4ced-43d5-a3b7-25637f36d2fc-console-oauth-config\") pod \"console-f9d7485db-s8cm2\" (UID: \"7a3c2a78-4ced-43d5-a3b7-25637f36d2fc\") " pod="openshift-console/console-f9d7485db-s8cm2"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.629222 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7a3c2a78-4ced-43d5-a3b7-25637f36d2fc-console-serving-cert\") pod \"console-f9d7485db-s8cm2\" (UID: \"7a3c2a78-4ced-43d5-a3b7-25637f36d2fc\") " pod="openshift-console/console-f9d7485db-s8cm2"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.631048 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54811547-c0f2-4b3e-8e07-6b6c878d72ee-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-hqqc6\" (UID: \"54811547-c0f2-4b3e-8e07-6b6c878d72ee\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hqqc6"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.636009 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.655312 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/369df47c-55e0-41da-bb67-b99bb189b870-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-8svq7\" (UID: \"369df47c-55e0-41da-bb67-b99bb189b870\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8svq7"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.655856 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ba3f0ebc-a2f7-42bc-9ad9-89a12081fdcf-metrics-tls\") pod \"ingress-operator-5b745b69d9-nngrv\" (UID: \"ba3f0ebc-a2f7-42bc-9ad9-89a12081fdcf\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nngrv"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.656553 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.660004 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b335decc-f67f-47e1-bee9-8d3033151b92-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-lfnws\" (UID: \"b335decc-f67f-47e1-bee9-8d3033151b92\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lfnws"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.661502 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.681563 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.691278 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4bfdabf8-f787-45ba-916e-a40db8dd9561-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-prlg4\" (UID: \"4bfdabf8-f787-45ba-916e-a40db8dd9561\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-prlg4"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.692912 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9c39245-291c-4f46-88ef-80e78b1c7bae-serving-cert\") pod \"authentication-operator-69f744f599-62hrj\" (UID: \"c9c39245-291c-4f46-88ef-80e78b1c7bae\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-62hrj"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.694184 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4bfdabf8-f787-45ba-916e-a40db8dd9561-proxy-tls\") pod \"machine-config-controller-84d6567774-prlg4\" (UID: \"4bfdabf8-f787-45ba-916e-a40db8dd9561\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-prlg4"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.715363 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xb2nw\" (UniqueName: \"kubernetes.io/projected/ce757d32-af85-4142-9beb-95ac115d61d7-kube-api-access-xb2nw\") pod \"packageserver-d55dfcdfc-cv9bx\" (UID: \"ce757d32-af85-4142-9beb-95ac115d61d7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cv9bx"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.715423 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9jck\" (UniqueName: \"kubernetes.io/projected/25fce237-9917-431e-b345-ef24a715fd12-kube-api-access-h9jck\") pod \"openshift-apiserver-operator-796bbdcf4f-tj896\" (UID: \"25fce237-9917-431e-b345-ef24a715fd12\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tj896"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.715483 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ce757d32-af85-4142-9beb-95ac115d61d7-webhook-cert\") pod \"packageserver-d55dfcdfc-cv9bx\" (UID: \"ce757d32-af85-4142-9beb-95ac115d61d7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cv9bx"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.715515 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25fce237-9917-431e-b345-ef24a715fd12-config\") pod \"openshift-apiserver-operator-796bbdcf4f-tj896\" (UID: \"25fce237-9917-431e-b345-ef24a715fd12\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tj896"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.715543 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a08ef380-6670-415c-9861-71c9161f1a4c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qmz7n\" (UID: \"a08ef380-6670-415c-9861-71c9161f1a4c\") " pod="openshift-marketplace/marketplace-operator-79b997595-qmz7n"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.715651 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2e80b2c6-58fd-466e-a5fe-d16bf1f4c7bd-signing-key\") pod \"service-ca-9c57cc56f-t8hmz\" (UID: \"2e80b2c6-58fd-466e-a5fe-d16bf1f4c7bd\") " pod="openshift-service-ca/service-ca-9c57cc56f-t8hmz"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.715739 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkkss\" (UniqueName: \"kubernetes.io/projected/80debf9d-f71d-491f-b914-82597c9d3162-kube-api-access-nkkss\") pod \"router-default-5444994796-2jvzc\" (UID: \"80debf9d-f71d-491f-b914-82597c9d3162\") " pod="openshift-ingress/router-default-5444994796-2jvzc"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.715770 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a08ef380-6670-415c-9861-71c9161f1a4c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qmz7n\" (UID: \"a08ef380-6670-415c-9861-71c9161f1a4c\") " pod="openshift-marketplace/marketplace-operator-79b997595-qmz7n"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.715830 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80debf9d-f71d-491f-b914-82597c9d3162-service-ca-bundle\") pod \"router-default-5444994796-2jvzc\" (UID: \"80debf9d-f71d-491f-b914-82597c9d3162\") " pod="openshift-ingress/router-default-5444994796-2jvzc"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.715862 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ce757d32-af85-4142-9beb-95ac115d61d7-tmpfs\") pod \"packageserver-d55dfcdfc-cv9bx\" (UID: \"ce757d32-af85-4142-9beb-95ac115d61d7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cv9bx"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.715883 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ac0eaee-22bb-4194-bb62-0622b65c778b-serving-cert\") pod \"service-ca-operator-777779d784-f9qbf\" (UID: \"7ac0eaee-22bb-4194-bb62-0622b65c778b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-f9qbf"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.715931 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/80debf9d-f71d-491f-b914-82597c9d3162-stats-auth\") pod \"router-default-5444994796-2jvzc\" (UID: \"80debf9d-f71d-491f-b914-82597c9d3162\") " pod="openshift-ingress/router-default-5444994796-2jvzc"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.715976 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25fce237-9917-431e-b345-ef24a715fd12-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-tj896\" (UID: \"25fce237-9917-431e-b345-ef24a715fd12\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tj896"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.715995 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ac0eaee-22bb-4194-bb62-0622b65c778b-config\") pod \"service-ca-operator-777779d784-f9qbf\" (UID: \"7ac0eaee-22bb-4194-bb62-0622b65c778b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-f9qbf"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.716015 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2e80b2c6-58fd-466e-a5fe-d16bf1f4c7bd-signing-cabundle\") pod \"service-ca-9c57cc56f-t8hmz\" (UID: \"2e80b2c6-58fd-466e-a5fe-d16bf1f4c7bd\") " pod="openshift-service-ca/service-ca-9c57cc56f-t8hmz"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.716086 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnfx7\" (UniqueName: \"kubernetes.io/projected/7ac0eaee-22bb-4194-bb62-0622b65c778b-kube-api-access-vnfx7\") pod \"service-ca-operator-777779d784-f9qbf\" (UID: \"7ac0eaee-22bb-4194-bb62-0622b65c778b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-f9qbf"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.716114 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/80debf9d-f71d-491f-b914-82597c9d3162-default-certificate\") pod \"router-default-5444994796-2jvzc\" (UID: \"80debf9d-f71d-491f-b914-82597c9d3162\") " pod="openshift-ingress/router-default-5444994796-2jvzc"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.716151 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ce757d32-af85-4142-9beb-95ac115d61d7-apiservice-cert\") pod \"packageserver-d55dfcdfc-cv9bx\" (UID: \"ce757d32-af85-4142-9beb-95ac115d61d7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cv9bx"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.716171 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6k8l\" (UniqueName: \"kubernetes.io/projected/2e80b2c6-58fd-466e-a5fe-d16bf1f4c7bd-kube-api-access-w6k8l\") pod \"service-ca-9c57cc56f-t8hmz\" (UID: \"2e80b2c6-58fd-466e-a5fe-d16bf1f4c7bd\") " pod="openshift-service-ca/service-ca-9c57cc56f-t8hmz"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.716197 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnd8x\" (UniqueName: \"kubernetes.io/projected/a08ef380-6670-415c-9861-71c9161f1a4c-kube-api-access-lnd8x\") pod \"marketplace-operator-79b997595-qmz7n\" (UID: \"a08ef380-6670-415c-9861-71c9161f1a4c\") " pod="openshift-marketplace/marketplace-operator-79b997595-qmz7n"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.716218 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80debf9d-f71d-491f-b914-82597c9d3162-metrics-certs\") pod \"router-default-5444994796-2jvzc\" (UID: \"80debf9d-f71d-491f-b914-82597c9d3162\") " pod="openshift-ingress/router-default-5444994796-2jvzc"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.716311 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ce757d32-af85-4142-9beb-95ac115d61d7-tmpfs\") pod \"packageserver-d55dfcdfc-cv9bx\" (UID: \"ce757d32-af85-4142-9beb-95ac115d61d7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cv9bx"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.721276 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.723780 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/072ec696-c152-40bb-8783-72920846a193-images\") pod \"machine-config-operator-74547568cd-kpmsg\" (UID: \"072ec696-c152-40bb-8783-72920846a193\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kpmsg"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.742868 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.761838 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.782136 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.792503 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2fe78cc8-8ce0-4cdf-9dcf-a15624194cab-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-m77vv\" (UID: \"2fe78cc8-8ce0-4cdf-9dcf-a15624194cab\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m77vv"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.801697 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.823041 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.832462 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/072ec696-c152-40bb-8783-72920846a193-proxy-tls\") pod \"machine-config-operator-74547568cd-kpmsg\" (UID: \"072ec696-c152-40bb-8783-72920846a193\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kpmsg"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.842084 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.861455 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.873288 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a6bd5d5a-d026-46f4-8467-993d9a1a3a59-profile-collector-cert\") pod \"olm-operator-6b444d44fb-f44h7\" (UID: \"a6bd5d5a-d026-46f4-8467-993d9a1a3a59\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f44h7"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.881654 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.886664 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a6bd5d5a-d026-46f4-8467-993d9a1a3a59-srv-cert\") pod \"olm-operator-6b444d44fb-f44h7\" (UID: \"a6bd5d5a-d026-46f4-8467-993d9a1a3a59\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f44h7"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.901633 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.921759 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.943148 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.961943 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Nov 27 16:40:28 crc kubenswrapper[4954]: I1127 16:40:28.984237 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.003012 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.022868 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.042282 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.073048 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.081755 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.102301 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.108113 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80debf9d-f71d-491f-b914-82597c9d3162-service-ca-bundle\") pod \"router-default-5444994796-2jvzc\" (UID: \"80debf9d-f71d-491f-b914-82597c9d3162\") " pod="openshift-ingress/router-default-5444994796-2jvzc"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.121758 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.151184 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.162287 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/80debf9d-f71d-491f-b914-82597c9d3162-default-certificate\") pod \"router-default-5444994796-2jvzc\" (UID: \"80debf9d-f71d-491f-b914-82597c9d3162\") " pod="openshift-ingress/router-default-5444994796-2jvzc"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.162907 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.171698 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/80debf9d-f71d-491f-b914-82597c9d3162-stats-auth\") pod \"router-default-5444994796-2jvzc\" (UID: \"80debf9d-f71d-491f-b914-82597c9d3162\") " pod="openshift-ingress/router-default-5444994796-2jvzc"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.183159 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.190971 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80debf9d-f71d-491f-b914-82597c9d3162-metrics-certs\") pod \"router-default-5444994796-2jvzc\" (UID: \"80debf9d-f71d-491f-b914-82597c9d3162\") " pod="openshift-ingress/router-default-5444994796-2jvzc"
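The reflector.go:368 "Caches populated" records interleaved above come from client-go reflectors: for each Secret and ConfigMap a pod references, the kubelet runs a list/watch and serves later reads from the synced local cache instead of querying the API server on every mount. The sketch below shows the same list-watch-then-WaitForCacheSync pattern using a shared informer; it is illustrative client-go usage under an assumed kubeconfig, not the kubelet's own specialized per-object manager.

```go
// cachesync.go: a minimal client-go sketch (not kubelet code) of the pattern
// behind the "Caches populated" records: watch Secrets in one namespace and
// only serve reads after the initial cache sync completes.
package main

import (
	"context"
	"fmt"
	"time"

	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/cache"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumes a standard kubeconfig; the kubelet itself uses node credentials.
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	factory := informers.NewSharedInformerFactoryWithOptions(
		client, 10*time.Minute,
		informers.WithNamespace("openshift-service-ca"))
	secrets := factory.Core().V1().Secrets()
	informer := secrets.Informer()

	ctx, cancel := context.WithTimeout(context.Background(), 30*time.Second)
	defer cancel()
	factory.Start(ctx.Done())

	// The equivalent of a "Caches populated" record: block until the initial
	// List/Watch has filled the local store.
	if !cache.WaitForCacheSync(ctx.Done(), informer.HasSynced) {
		panic("timed out waiting for the condition") // same failure mode seen below
	}
	s, err := secrets.Lister().Secrets("openshift-service-ca").Get("signing-key")
	if err != nil {
		panic(err)
	}
	fmt.Println("served from cache:", s.Name)
}
```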
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.203627 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.222942 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.242803 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.262404 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.282472 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.302322 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.322420 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.342313 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.349223 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2e80b2c6-58fd-466e-a5fe-d16bf1f4c7bd-signing-key\") pod \"service-ca-9c57cc56f-t8hmz\" (UID: \"2e80b2c6-58fd-466e-a5fe-d16bf1f4c7bd\") " pod="openshift-service-ca/service-ca-9c57cc56f-t8hmz"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.361345 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.381953 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.387425 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2e80b2c6-58fd-466e-a5fe-d16bf1f4c7bd-signing-cabundle\") pod \"service-ca-9c57cc56f-t8hmz\" (UID: \"2e80b2c6-58fd-466e-a5fe-d16bf1f4c7bd\") " pod="openshift-service-ca/service-ca-9c57cc56f-t8hmz"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.405243 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.422734 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.442341 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.463346 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.480158 4954 request.go:700] Waited for 1.011901644s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dpackageserver-service-cert&limit=500&resourceVersion=0
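The request.go:700 record above is client-go's client-side throttling: the kubelet's own token-bucket rate limiter, not server-side API Priority and Fairness, held this GET back for about 1.01s because the burst of cache-priming requests had drained the QPS budget. A sketch of that token-bucket behavior with golang.org/x/time/rate follows; the QPS and burst values are illustrative assumptions, since the kubelet's actual budget comes from its kubeAPIQPS/kubeAPIBurst settings.

```go
// throttle.go: a sketch of client-side throttling using the same token-bucket
// primitive client-go builds on. QPS=5 and burst=10 are assumed illustration
// values, not necessarily this kubelet's configuration.
package main

import (
	"context"
	"fmt"
	"time"

	"golang.org/x/time/rate"
)

func main() {
	limiter := rate.NewLimiter(rate.Limit(5), 10) // 5 req/s steady state, burst of 10
	ctx := context.Background()
	start := time.Now()
	for i := 1; i <= 15; i++ {
		// Wait blocks until a token is available; once the burst is spent,
		// requests queue up, producing delays like the "Waited for
		// 1.011901644s due to client-side throttling" record above.
		if err := limiter.Wait(ctx); err != nil {
			panic(err)
		}
		fmt.Printf("request %2d sent at +%v\n", i, time.Since(start).Round(10*time.Millisecond))
	}
}
```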
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.482687 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.490003 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ce757d32-af85-4142-9beb-95ac115d61d7-webhook-cert\") pod \"packageserver-d55dfcdfc-cv9bx\" (UID: \"ce757d32-af85-4142-9beb-95ac115d61d7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cv9bx"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.492151 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ce757d32-af85-4142-9beb-95ac115d61d7-apiservice-cert\") pod \"packageserver-d55dfcdfc-cv9bx\" (UID: \"ce757d32-af85-4142-9beb-95ac115d61d7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cv9bx"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.503204 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.523024 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a08ef380-6670-415c-9861-71c9161f1a4c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qmz7n\" (UID: \"a08ef380-6670-415c-9861-71c9161f1a4c\") " pod="openshift-marketplace/marketplace-operator-79b997595-qmz7n"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.523555 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.552995 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.558548 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a08ef380-6670-415c-9861-71c9161f1a4c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qmz7n\" (UID: \"a08ef380-6670-415c-9861-71c9161f1a4c\") " pod="openshift-marketplace/marketplace-operator-79b997595-qmz7n"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.568481 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.582319 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.601522 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.607334 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ac0eaee-22bb-4194-bb62-0622b65c778b-config\") pod \"service-ca-operator-777779d784-f9qbf\" (UID: \"7ac0eaee-22bb-4194-bb62-0622b65c778b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-f9qbf"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.622202 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.641766 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.651092 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ac0eaee-22bb-4194-bb62-0622b65c778b-serving-cert\") pod \"service-ca-operator-777779d784-f9qbf\" (UID: \"7ac0eaee-22bb-4194-bb62-0622b65c778b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-f9qbf"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.662684 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.681675 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Nov 27 16:40:29 crc kubenswrapper[4954]: I1127 16:40:29.701511 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Nov 27 16:40:29 crc kubenswrapper[4954]: E1127 16:40:29.716754 4954 secret.go:188] Couldn't get secret openshift-apiserver-operator/openshift-apiserver-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition
Nov 27 16:40:29 crc kubenswrapper[4954]: E1127 16:40:29.716806 4954 configmap.go:193] Couldn't get configMap openshift-apiserver-operator/openshift-apiserver-operator-config: failed to sync configmap cache: timed out waiting for the condition
Nov 27 16:40:29 crc kubenswrapper[4954]: E1127 16:40:29.716932 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25fce237-9917-431e-b345-ef24a715fd12-serving-cert podName:25fce237-9917-431e-b345-ef24a715fd12 nodeName:}" failed. No retries permitted until 2025-11-27 16:40:30.21689875 +0000 UTC m=+142.234339070 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/25fce237-9917-431e-b345-ef24a715fd12-serving-cert") pod "openshift-apiserver-operator-796bbdcf4f-tj896" (UID: "25fce237-9917-431e-b345-ef24a715fd12") : failed to sync secret cache: timed out waiting for the condition
Nov 27 16:40:29 crc kubenswrapper[4954]: E1127 16:40:29.717029 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/25fce237-9917-431e-b345-ef24a715fd12-config podName:25fce237-9917-431e-b345-ef24a715fd12 nodeName:}" failed. No retries permitted until 2025-11-27 16:40:30.217007213 +0000 UTC m=+142.234447523 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/25fce237-9917-431e-b345-ef24a715fd12-config") pod "openshift-apiserver-operator-796bbdcf4f-tj896" (UID: "25fce237-9917-431e-b345-ef24a715fd12") : failed to sync configmap cache: timed out waiting for the condition
\"kubernetes.io/projected/3ddfcd6f-4387-40b4-9933-4e169797f6da-kube-api-access-566pd\") pod \"console-operator-58897d9998-9gfl4\" (UID: \"3ddfcd6f-4387-40b4-9933-4e169797f6da\") " pod="openshift-console-operator/console-operator-58897d9998-9gfl4" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.078994 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r49xd\" (UniqueName: \"kubernetes.io/projected/daf9759f-1f7d-4613-b734-a39f4552222e-kube-api-access-r49xd\") pod \"machine-api-operator-5694c8668f-h48pg\" (UID: \"daf9759f-1f7d-4613-b734-a39f4552222e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h48pg" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.097946 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhb94\" (UniqueName: \"kubernetes.io/projected/0da11e6f-c84f-4d72-83cc-9bb32480b3d2-kube-api-access-bhb94\") pod \"openshift-config-operator-7777fb866f-7ltms\" (UID: \"0da11e6f-c84f-4d72-83cc-9bb32480b3d2\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ltms" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.110737 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-9gfl4" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.130752 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j46l4\" (UniqueName: \"kubernetes.io/projected/3e7aebe6-3e4f-498f-a696-5e23f9fe313d-kube-api-access-j46l4\") pod \"apiserver-76f77b778f-htccg\" (UID: \"3e7aebe6-3e4f-498f-a696-5e23f9fe313d\") " pod="openshift-apiserver/apiserver-76f77b778f-htccg" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.135336 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbqgs\" (UniqueName: \"kubernetes.io/projected/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-kube-api-access-gbqgs\") pod \"oauth-openshift-558db77b4-6m2df\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.158890 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z825\" (UniqueName: \"kubernetes.io/projected/ac17f7ac-8454-45e6-af33-a29113eb0d66-kube-api-access-9z825\") pod \"dns-operator-744455d44c-mjllc\" (UID: \"ac17f7ac-8454-45e6-af33-a29113eb0d66\") " pod="openshift-dns-operator/dns-operator-744455d44c-mjllc" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.173256 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ltms" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.178422 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6thft\" (UniqueName: \"kubernetes.io/projected/6e521fb1-0565-4f66-a6f0-1b78942e408e-kube-api-access-6thft\") pod \"route-controller-manager-6576b87f9c-f6f2h\" (UID: \"6e521fb1-0565-4f66-a6f0-1b78942e408e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f6f2h" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.195837 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6f943860-2a4f-44af-9695-4497a2a8fdd8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-k9v6k\" (UID: \"6f943860-2a4f-44af-9695-4497a2a8fdd8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k9v6k" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.230157 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbjmb\" (UniqueName: \"kubernetes.io/projected/c9c39245-291c-4f46-88ef-80e78b1c7bae-kube-api-access-jbjmb\") pod \"authentication-operator-69f744f599-62hrj\" (UID: \"c9c39245-291c-4f46-88ef-80e78b1c7bae\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-62hrj" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.241713 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.242511 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25fce237-9917-431e-b345-ef24a715fd12-config\") pod \"openshift-apiserver-operator-796bbdcf4f-tj896\" (UID: \"25fce237-9917-431e-b345-ef24a715fd12\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tj896" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.242830 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25fce237-9917-431e-b345-ef24a715fd12-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-tj896\" (UID: \"25fce237-9917-431e-b345-ef24a715fd12\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tj896" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.243436 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25fce237-9917-431e-b345-ef24a715fd12-config\") pod \"openshift-apiserver-operator-796bbdcf4f-tj896\" (UID: \"25fce237-9917-431e-b345-ef24a715fd12\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tj896" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.244249 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-htccg" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.245781 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25fce237-9917-431e-b345-ef24a715fd12-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-tj896\" (UID: \"25fce237-9917-431e-b345-ef24a715fd12\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tj896" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.255610 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fx24\" (UniqueName: \"kubernetes.io/projected/6f943860-2a4f-44af-9695-4497a2a8fdd8-kube-api-access-9fx24\") pod \"cluster-image-registry-operator-dc59b4c8b-k9v6k\" (UID: \"6f943860-2a4f-44af-9695-4497a2a8fdd8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k9v6k" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.258462 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xmpc\" (UniqueName: \"kubernetes.io/projected/56bfbfaa-4d26-4361-87fc-dab870bdff96-kube-api-access-4xmpc\") pod \"cluster-samples-operator-665b6dd947-mn7g4\" (UID: \"56bfbfaa-4d26-4361-87fc-dab870bdff96\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mn7g4" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.260161 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-h48pg" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.268440 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mn7g4" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.282287 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6slx\" (UniqueName: \"kubernetes.io/projected/9bd46eeb-25a4-4e67-97ad-96c21224fbcd-kube-api-access-x6slx\") pod \"machine-approver-56656f9798-xqdst\" (UID: \"9bd46eeb-25a4-4e67-97ad-96c21224fbcd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xqdst" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.294035 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-mjllc" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.301818 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k9v6k" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.304781 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgg9p\" (UniqueName: \"kubernetes.io/projected/5d68d684-47a3-490a-bafb-9c8f04f0d3fb-kube-api-access-kgg9p\") pod \"apiserver-7bbb656c7d-6lsxk\" (UID: \"5d68d684-47a3-490a-bafb-9c8f04f0d3fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lsxk" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.322233 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.342306 4954 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.362093 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.381851 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.403381 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.412860 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xqdst" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.419445 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-7ltms"] Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.422119 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.441816 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f6f2h" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.441849 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.452420 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-9gfl4"] Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.462420 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Nov 27 16:40:30 crc kubenswrapper[4954]: W1127 16:40:30.471508 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bd46eeb_25a4_4e67_97ad_96c21224fbcd.slice/crio-8db283fc46200bf311f8d200b7fdaa926fd031f29243d16fca090b9d4b7b97ae WatchSource:0}: Error finding container 8db283fc46200bf311f8d200b7fdaa926fd031f29243d16fca090b9d4b7b97ae: Status 404 returned error can't find the container with id 8db283fc46200bf311f8d200b7fdaa926fd031f29243d16fca090b9d4b7b97ae Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.481692 4954 request.go:700] Waited for 1.93971686s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/secrets?fieldSelector=metadata.name%3Dcanary-serving-cert&limit=500&resourceVersion=0 Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.486211 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.504941 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.524469 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.524657 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-62hrj" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.543658 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.549826 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xqdst" event={"ID":"9bd46eeb-25a4-4e67-97ad-96c21224fbcd","Type":"ContainerStarted","Data":"8db283fc46200bf311f8d200b7fdaa926fd031f29243d16fca090b9d4b7b97ae"} Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.550486 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ltms" event={"ID":"0da11e6f-c84f-4d72-83cc-9bb32480b3d2","Type":"ContainerStarted","Data":"9be8777fb791ed7d3ba9055f6379319357f9c9f4f5e1062effc6261ab27bf61e"} Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.563704 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.584994 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lsxk" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.611829 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ba3f0ebc-a2f7-42bc-9ad9-89a12081fdcf-bound-sa-token\") pod \"ingress-operator-5b745b69d9-nngrv\" (UID: \"ba3f0ebc-a2f7-42bc-9ad9-89a12081fdcf\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nngrv" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.630802 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4p9k\" (UniqueName: \"kubernetes.io/projected/2fe78cc8-8ce0-4cdf-9dcf-a15624194cab-kube-api-access-d4p9k\") pod \"package-server-manager-789f6589d5-m77vv\" (UID: \"2fe78cc8-8ce0-4cdf-9dcf-a15624194cab\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m77vv" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.647361 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/54811547-c0f2-4b3e-8e07-6b6c878d72ee-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-hqqc6\" (UID: \"54811547-c0f2-4b3e-8e07-6b6c878d72ee\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hqqc6" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.683131 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4svv\" (UniqueName: \"kubernetes.io/projected/a6bd5d5a-d026-46f4-8467-993d9a1a3a59-kube-api-access-c4svv\") pod \"olm-operator-6b444d44fb-f44h7\" (UID: \"a6bd5d5a-d026-46f4-8467-993d9a1a3a59\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f44h7" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.683342 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmzlm\" (UniqueName: \"kubernetes.io/projected/072ec696-c152-40bb-8783-72920846a193-kube-api-access-gmzlm\") pod \"machine-config-operator-74547568cd-kpmsg\" (UID: \"072ec696-c152-40bb-8783-72920846a193\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kpmsg" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.703835 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kpmsg" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.710756 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m77vv" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.718204 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99ggq\" (UniqueName: \"kubernetes.io/projected/7a3c2a78-4ced-43d5-a3b7-25637f36d2fc-kube-api-access-99ggq\") pod \"console-f9d7485db-s8cm2\" (UID: \"7a3c2a78-4ced-43d5-a3b7-25637f36d2fc\") " pod="openshift-console/console-f9d7485db-s8cm2" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.726840 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f44h7" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.729025 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pjxp\" (UniqueName: \"kubernetes.io/projected/ba3f0ebc-a2f7-42bc-9ad9-89a12081fdcf-kube-api-access-4pjxp\") pod \"ingress-operator-5b745b69d9-nngrv\" (UID: \"ba3f0ebc-a2f7-42bc-9ad9-89a12081fdcf\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nngrv" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.731241 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6m2df"] Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.748695 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgwwt\" (UniqueName: \"kubernetes.io/projected/b335decc-f67f-47e1-bee9-8d3033151b92-kube-api-access-fgwwt\") pod \"kube-storage-version-migrator-operator-b67b599dd-lfnws\" (UID: \"b335decc-f67f-47e1-bee9-8d3033151b92\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lfnws" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.764345 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/369df47c-55e0-41da-bb67-b99bb189b870-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-8svq7\" (UID: \"369df47c-55e0-41da-bb67-b99bb189b870\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8svq7" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.778825 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmklk\" (UniqueName: \"kubernetes.io/projected/4bfdabf8-f787-45ba-916e-a40db8dd9561-kube-api-access-nmklk\") pod \"machine-config-controller-84d6567774-prlg4\" (UID: \"4bfdabf8-f787-45ba-916e-a40db8dd9561\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-prlg4" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.791080 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-prlg4" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.797445 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hqqc6" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.803619 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqgwl\" (UniqueName: \"kubernetes.io/projected/58db67ba-0f90-4190-8beb-02489a6e6a1a-kube-api-access-rqgwl\") pod \"migrator-59844c95c7-6h576\" (UID: \"58db67ba-0f90-4190-8beb-02489a6e6a1a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6h576" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.828859 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcl54\" (UniqueName: \"kubernetes.io/projected/58eb1d01-1f82-43fa-8ace-86368d05ec71-kube-api-access-hcl54\") pod \"openshift-controller-manager-operator-756b6f6bc6-bp7nq\" (UID: \"58eb1d01-1f82-43fa-8ace-86368d05ec71\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bp7nq" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.846766 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p942l\" (UniqueName: \"kubernetes.io/projected/6606df87-becb-460d-8579-22c5eb23e71a-kube-api-access-p942l\") pod \"downloads-7954f5f757-m78xr\" (UID: \"6606df87-becb-460d-8579-22c5eb23e71a\") " pod="openshift-console/downloads-7954f5f757-m78xr" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.854749 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mn7g4"] Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.856715 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-h48pg"] Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.861327 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-htccg"] Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.870531 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k9v6k"] Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.883510 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb2nw\" (UniqueName: \"kubernetes.io/projected/ce757d32-af85-4142-9beb-95ac115d61d7-kube-api-access-xb2nw\") pod \"packageserver-d55dfcdfc-cv9bx\" (UID: \"ce757d32-af85-4142-9beb-95ac115d61d7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cv9bx" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.900545 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-62hrj"] Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.910714 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9jck\" (UniqueName: \"kubernetes.io/projected/25fce237-9917-431e-b345-ef24a715fd12-kube-api-access-h9jck\") pod \"openshift-apiserver-operator-796bbdcf4f-tj896\" (UID: \"25fce237-9917-431e-b345-ef24a715fd12\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tj896" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.922418 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkkss\" (UniqueName: \"kubernetes.io/projected/80debf9d-f71d-491f-b914-82597c9d3162-kube-api-access-nkkss\") pod \"router-default-5444994796-2jvzc\" (UID: 
\"80debf9d-f71d-491f-b914-82597c9d3162\") " pod="openshift-ingress/router-default-5444994796-2jvzc" Nov 27 16:40:30 crc kubenswrapper[4954]: W1127 16:40:30.934198 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e7aebe6_3e4f_498f_a696_5e23f9fe313d.slice/crio-a96a5a532909543c14b61978a73590410918b7573e697e011a582ddd5a739ed7 WatchSource:0}: Error finding container a96a5a532909543c14b61978a73590410918b7573e697e011a582ddd5a739ed7: Status 404 returned error can't find the container with id a96a5a532909543c14b61978a73590410918b7573e697e011a582ddd5a739ed7 Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.956841 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8svq7" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.957000 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnfx7\" (UniqueName: \"kubernetes.io/projected/7ac0eaee-22bb-4194-bb62-0622b65c778b-kube-api-access-vnfx7\") pod \"service-ca-operator-777779d784-f9qbf\" (UID: \"7ac0eaee-22bb-4194-bb62-0622b65c778b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-f9qbf" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.957353 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-s8cm2" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.960051 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6k8l\" (UniqueName: \"kubernetes.io/projected/2e80b2c6-58fd-466e-a5fe-d16bf1f4c7bd-kube-api-access-w6k8l\") pod \"service-ca-9c57cc56f-t8hmz\" (UID: \"2e80b2c6-58fd-466e-a5fe-d16bf1f4c7bd\") " pod="openshift-service-ca/service-ca-9c57cc56f-t8hmz" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.964003 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bp7nq" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.974786 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-m78xr" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.977744 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nngrv" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.978549 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnd8x\" (UniqueName: \"kubernetes.io/projected/a08ef380-6670-415c-9861-71c9161f1a4c-kube-api-access-lnd8x\") pod \"marketplace-operator-79b997595-qmz7n\" (UID: \"a08ef380-6670-415c-9861-71c9161f1a4c\") " pod="openshift-marketplace/marketplace-operator-79b997595-qmz7n" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.983035 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6h576" Nov 27 16:40:30 crc kubenswrapper[4954]: I1127 16:40:30.992596 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lfnws" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.029240 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mjllc"] Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.056177 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-f6f2h"] Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.074144 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl4vr\" (UniqueName: \"kubernetes.io/projected/478fecf8-ff15-468e-a5f3-4b49e3e28654-kube-api-access-zl4vr\") pod \"etcd-operator-b45778765-xnp9p\" (UID: \"478fecf8-ff15-468e-a5f3-4b49e3e28654\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xnp9p" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.074189 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a282fc0-c31c-440e-ae60-555e7e9aea66-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fh26g\" (UID: \"9a282fc0-c31c-440e-ae60-555e7e9aea66\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fh26g" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.074221 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx25j\" (UniqueName: \"kubernetes.io/projected/4f13cd59-b0f9-4562-a20b-d3d8f4bca5bb-kube-api-access-gx25j\") pod \"control-plane-machine-set-operator-78cbb6b69f-zmv7j\" (UID: \"4f13cd59-b0f9-4562-a20b-d3d8f4bca5bb\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zmv7j" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.074253 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd-installation-pull-secrets\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.074274 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/4f13cd59-b0f9-4562-a20b-d3d8f4bca5bb-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-zmv7j\" (UID: \"4f13cd59-b0f9-4562-a20b-d3d8f4bca5bb\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zmv7j" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.074344 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd-ca-trust-extracted\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.074398 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/478fecf8-ff15-468e-a5f3-4b49e3e28654-etcd-client\") pod \"etcd-operator-b45778765-xnp9p\" (UID: \"478fecf8-ff15-468e-a5f3-4b49e3e28654\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xnp9p" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.074433 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa-config-volume\") pod \"collect-profiles-29404350-52bhz\" (UID: \"d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404350-52bhz" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.074451 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm7jb\" (UniqueName: \"kubernetes.io/projected/d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa-kube-api-access-hm7jb\") pod \"collect-profiles-29404350-52bhz\" (UID: \"d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404350-52bhz" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.074465 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd-bound-sa-token\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.074494 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd-registry-certificates\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.074637 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.074659 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd-trusted-ca\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.074699 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68w5b\" (UniqueName: \"kubernetes.io/projected/24f553f1-7b7b-4d3e-addf-2b5d1039f176-kube-api-access-68w5b\") pod \"controller-manager-879f6c89f-8wlxw\" (UID: \"24f553f1-7b7b-4d3e-addf-2b5d1039f176\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8wlxw" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.074716 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: 
\"kubernetes.io/configmap/478fecf8-ff15-468e-a5f3-4b49e3e28654-etcd-ca\") pod \"etcd-operator-b45778765-xnp9p\" (UID: \"478fecf8-ff15-468e-a5f3-4b49e3e28654\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xnp9p" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.074762 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qmbx\" (UniqueName: \"kubernetes.io/projected/4442fdf4-4257-4f63-a247-5b2926cc5924-kube-api-access-9qmbx\") pod \"catalog-operator-68c6474976-59sgd\" (UID: \"4442fdf4-4257-4f63-a247-5b2926cc5924\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-59sgd" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.074791 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd-registry-tls\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.074860 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4442fdf4-4257-4f63-a247-5b2926cc5924-profile-collector-cert\") pod \"catalog-operator-68c6474976-59sgd\" (UID: \"4442fdf4-4257-4f63-a247-5b2926cc5924\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-59sgd" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.075102 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24f553f1-7b7b-4d3e-addf-2b5d1039f176-client-ca\") pod \"controller-manager-879f6c89f-8wlxw\" (UID: \"24f553f1-7b7b-4d3e-addf-2b5d1039f176\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8wlxw" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.075122 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4442fdf4-4257-4f63-a247-5b2926cc5924-srv-cert\") pod \"catalog-operator-68c6474976-59sgd\" (UID: \"4442fdf4-4257-4f63-a247-5b2926cc5924\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-59sgd" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.075155 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/24f553f1-7b7b-4d3e-addf-2b5d1039f176-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-8wlxw\" (UID: \"24f553f1-7b7b-4d3e-addf-2b5d1039f176\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8wlxw" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.075180 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/478fecf8-ff15-468e-a5f3-4b49e3e28654-config\") pod \"etcd-operator-b45778765-xnp9p\" (UID: \"478fecf8-ff15-468e-a5f3-4b49e3e28654\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xnp9p" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.075197 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kx4k\" (UniqueName: 
\"kubernetes.io/projected/34886065-3f55-42b2-820f-13b4d921fb85-kube-api-access-5kx4k\") pod \"multus-admission-controller-857f4d67dd-cc84q\" (UID: \"34886065-3f55-42b2-820f-13b4d921fb85\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-cc84q" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.075215 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9a282fc0-c31c-440e-ae60-555e7e9aea66-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fh26g\" (UID: \"9a282fc0-c31c-440e-ae60-555e7e9aea66\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fh26g" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.075231 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/34886065-3f55-42b2-820f-13b4d921fb85-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-cc84q\" (UID: \"34886065-3f55-42b2-820f-13b4d921fb85\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-cc84q" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.075253 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/478fecf8-ff15-468e-a5f3-4b49e3e28654-etcd-service-ca\") pod \"etcd-operator-b45778765-xnp9p\" (UID: \"478fecf8-ff15-468e-a5f3-4b49e3e28654\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xnp9p" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.075324 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24f553f1-7b7b-4d3e-addf-2b5d1039f176-serving-cert\") pod \"controller-manager-879f6c89f-8wlxw\" (UID: \"24f553f1-7b7b-4d3e-addf-2b5d1039f176\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8wlxw" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.075364 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24f553f1-7b7b-4d3e-addf-2b5d1039f176-config\") pod \"controller-manager-879f6c89f-8wlxw\" (UID: \"24f553f1-7b7b-4d3e-addf-2b5d1039f176\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8wlxw" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.075383 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/478fecf8-ff15-468e-a5f3-4b49e3e28654-serving-cert\") pod \"etcd-operator-b45778765-xnp9p\" (UID: \"478fecf8-ff15-468e-a5f3-4b49e3e28654\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xnp9p" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.075416 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a282fc0-c31c-440e-ae60-555e7e9aea66-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fh26g\" (UID: \"9a282fc0-c31c-440e-ae60-555e7e9aea66\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fh26g" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.075435 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjttn\" (UniqueName: 
\"kubernetes.io/projected/7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd-kube-api-access-xjttn\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.075462 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa-secret-volume\") pod \"collect-profiles-29404350-52bhz\" (UID: \"d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404350-52bhz" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.076841 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-2jvzc" Nov 27 16:40:31 crc kubenswrapper[4954]: E1127 16:40:31.077847 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:31.577830563 +0000 UTC m=+143.595270863 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.103174 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-t8hmz" Nov 27 16:40:31 crc kubenswrapper[4954]: W1127 16:40:31.115853 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac17f7ac_8454_45e6_af33_a29113eb0d66.slice/crio-0be82764f4232639284059e3be7284e3dc9a0b20a91868e2adc317d8bcb64d3d WatchSource:0}: Error finding container 0be82764f4232639284059e3be7284e3dc9a0b20a91868e2adc317d8bcb64d3d: Status 404 returned error can't find the container with id 0be82764f4232639284059e3be7284e3dc9a0b20a91868e2adc317d8bcb64d3d Nov 27 16:40:31 crc kubenswrapper[4954]: W1127 16:40:31.118181 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e521fb1_0565_4f66_a6f0_1b78942e408e.slice/crio-2fc534f8705309f2620166ee7fc3142efa6f4e04a29d482671b009aec4225021 WatchSource:0}: Error finding container 2fc534f8705309f2620166ee7fc3142efa6f4e04a29d482671b009aec4225021: Status 404 returned error can't find the container with id 2fc534f8705309f2620166ee7fc3142efa6f4e04a29d482671b009aec4225021 Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.121599 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cv9bx" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.130810 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qmz7n" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.143763 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-f9qbf" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.160396 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tj896" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.176326 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.176551 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4442fdf4-4257-4f63-a247-5b2926cc5924-profile-collector-cert\") pod \"catalog-operator-68c6474976-59sgd\" (UID: \"4442fdf4-4257-4f63-a247-5b2926cc5924\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-59sgd" Nov 27 16:40:31 crc kubenswrapper[4954]: E1127 16:40:31.176621 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:31.676573978 +0000 UTC m=+143.694014278 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.176673 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v22df\" (UniqueName: \"kubernetes.io/projected/cc597c79-70a7-489d-b636-5876aa5a0e45-kube-api-access-v22df\") pod \"dns-default-5clct\" (UID: \"cc597c79-70a7-489d-b636-5876aa5a0e45\") " pod="openshift-dns/dns-default-5clct" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.176721 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24f553f1-7b7b-4d3e-addf-2b5d1039f176-client-ca\") pod \"controller-manager-879f6c89f-8wlxw\" (UID: \"24f553f1-7b7b-4d3e-addf-2b5d1039f176\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8wlxw" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.176741 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4442fdf4-4257-4f63-a247-5b2926cc5924-srv-cert\") pod \"catalog-operator-68c6474976-59sgd\" (UID: \"4442fdf4-4257-4f63-a247-5b2926cc5924\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-59sgd" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.176795 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/24f553f1-7b7b-4d3e-addf-2b5d1039f176-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-8wlxw\" (UID: 
\"24f553f1-7b7b-4d3e-addf-2b5d1039f176\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8wlxw" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.176818 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/478fecf8-ff15-468e-a5f3-4b49e3e28654-config\") pod \"etcd-operator-b45778765-xnp9p\" (UID: \"478fecf8-ff15-468e-a5f3-4b49e3e28654\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xnp9p" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.178038 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kx4k\" (UniqueName: \"kubernetes.io/projected/34886065-3f55-42b2-820f-13b4d921fb85-kube-api-access-5kx4k\") pod \"multus-admission-controller-857f4d67dd-cc84q\" (UID: \"34886065-3f55-42b2-820f-13b4d921fb85\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-cc84q" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.178091 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9a282fc0-c31c-440e-ae60-555e7e9aea66-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fh26g\" (UID: \"9a282fc0-c31c-440e-ae60-555e7e9aea66\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fh26g" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.178127 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/478fecf8-ff15-468e-a5f3-4b49e3e28654-etcd-service-ca\") pod \"etcd-operator-b45778765-xnp9p\" (UID: \"478fecf8-ff15-468e-a5f3-4b49e3e28654\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xnp9p" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.178153 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/34886065-3f55-42b2-820f-13b4d921fb85-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-cc84q\" (UID: \"34886065-3f55-42b2-820f-13b4d921fb85\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-cc84q" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.178198 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4a989c69-cca2-452c-987a-ecf2aba3b26d-node-bootstrap-token\") pod \"machine-config-server-qm5wz\" (UID: \"4a989c69-cca2-452c-987a-ecf2aba3b26d\") " pod="openshift-machine-config-operator/machine-config-server-qm5wz" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.178248 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9c31e969-aba0-4496-8891-283b9f639973-registration-dir\") pod \"csi-hostpathplugin-xl5dh\" (UID: \"9c31e969-aba0-4496-8891-283b9f639973\") " pod="hostpath-provisioner/csi-hostpathplugin-xl5dh" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.178273 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24f553f1-7b7b-4d3e-addf-2b5d1039f176-serving-cert\") pod \"controller-manager-879f6c89f-8wlxw\" (UID: \"24f553f1-7b7b-4d3e-addf-2b5d1039f176\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8wlxw" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 
16:40:31.178296 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/24f553f1-7b7b-4d3e-addf-2b5d1039f176-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-8wlxw\" (UID: \"24f553f1-7b7b-4d3e-addf-2b5d1039f176\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8wlxw" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.178359 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24f553f1-7b7b-4d3e-addf-2b5d1039f176-config\") pod \"controller-manager-879f6c89f-8wlxw\" (UID: \"24f553f1-7b7b-4d3e-addf-2b5d1039f176\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8wlxw" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.180046 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/478fecf8-ff15-468e-a5f3-4b49e3e28654-serving-cert\") pod \"etcd-operator-b45778765-xnp9p\" (UID: \"478fecf8-ff15-468e-a5f3-4b49e3e28654\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xnp9p" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.180090 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a282fc0-c31c-440e-ae60-555e7e9aea66-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fh26g\" (UID: \"9a282fc0-c31c-440e-ae60-555e7e9aea66\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fh26g" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.180118 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjttn\" (UniqueName: \"kubernetes.io/projected/7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd-kube-api-access-xjttn\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.180117 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24f553f1-7b7b-4d3e-addf-2b5d1039f176-client-ca\") pod \"controller-manager-879f6c89f-8wlxw\" (UID: \"24f553f1-7b7b-4d3e-addf-2b5d1039f176\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8wlxw" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.180524 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa-secret-volume\") pod \"collect-profiles-29404350-52bhz\" (UID: \"d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404350-52bhz" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.181654 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvp78\" (UniqueName: \"kubernetes.io/projected/9c31e969-aba0-4496-8891-283b9f639973-kube-api-access-mvp78\") pod \"csi-hostpathplugin-xl5dh\" (UID: \"9c31e969-aba0-4496-8891-283b9f639973\") " pod="hostpath-provisioner/csi-hostpathplugin-xl5dh" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.181730 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: 
\"kubernetes.io/host-path/9c31e969-aba0-4496-8891-283b9f639973-csi-data-dir\") pod \"csi-hostpathplugin-xl5dh\" (UID: \"9c31e969-aba0-4496-8891-283b9f639973\") " pod="hostpath-provisioner/csi-hostpathplugin-xl5dh" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.181783 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4a989c69-cca2-452c-987a-ecf2aba3b26d-certs\") pod \"machine-config-server-qm5wz\" (UID: \"4a989c69-cca2-452c-987a-ecf2aba3b26d\") " pod="openshift-machine-config-operator/machine-config-server-qm5wz" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.181828 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl4vr\" (UniqueName: \"kubernetes.io/projected/478fecf8-ff15-468e-a5f3-4b49e3e28654-kube-api-access-zl4vr\") pod \"etcd-operator-b45778765-xnp9p\" (UID: \"478fecf8-ff15-468e-a5f3-4b49e3e28654\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xnp9p" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.181870 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a282fc0-c31c-440e-ae60-555e7e9aea66-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fh26g\" (UID: \"9a282fc0-c31c-440e-ae60-555e7e9aea66\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fh26g" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.181915 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gx25j\" (UniqueName: \"kubernetes.io/projected/4f13cd59-b0f9-4562-a20b-d3d8f4bca5bb-kube-api-access-gx25j\") pod \"control-plane-machine-set-operator-78cbb6b69f-zmv7j\" (UID: \"4f13cd59-b0f9-4562-a20b-d3d8f4bca5bb\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zmv7j" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.181958 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd-installation-pull-secrets\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.181977 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/4f13cd59-b0f9-4562-a20b-d3d8f4bca5bb-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-zmv7j\" (UID: \"4f13cd59-b0f9-4562-a20b-d3d8f4bca5bb\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zmv7j" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.182016 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jqnq\" (UniqueName: \"kubernetes.io/projected/4a989c69-cca2-452c-987a-ecf2aba3b26d-kube-api-access-9jqnq\") pod \"machine-config-server-qm5wz\" (UID: \"4a989c69-cca2-452c-987a-ecf2aba3b26d\") " pod="openshift-machine-config-operator/machine-config-server-qm5wz" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.182092 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd-ca-trust-extracted\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.182174 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/478fecf8-ff15-468e-a5f3-4b49e3e28654-etcd-client\") pod \"etcd-operator-b45778765-xnp9p\" (UID: \"478fecf8-ff15-468e-a5f3-4b49e3e28654\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xnp9p" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.182228 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa-config-volume\") pod \"collect-profiles-29404350-52bhz\" (UID: \"d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404350-52bhz" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.182247 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hm7jb\" (UniqueName: \"kubernetes.io/projected/d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa-kube-api-access-hm7jb\") pod \"collect-profiles-29404350-52bhz\" (UID: \"d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404350-52bhz" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.182265 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd-bound-sa-token\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.182343 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd-registry-certificates\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.182365 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9c31e969-aba0-4496-8891-283b9f639973-mountpoint-dir\") pod \"csi-hostpathplugin-xl5dh\" (UID: \"9c31e969-aba0-4496-8891-283b9f639973\") " pod="hostpath-provisioner/csi-hostpathplugin-xl5dh" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.182383 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9c31e969-aba0-4496-8891-283b9f639973-plugins-dir\") pod \"csi-hostpathplugin-xl5dh\" (UID: \"9c31e969-aba0-4496-8891-283b9f639973\") " pod="hostpath-provisioner/csi-hostpathplugin-xl5dh" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.182444 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.182464 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/98dff839-676a-4268-b3cc-cb4163fb1874-cert\") pod \"ingress-canary-4rd2p\" (UID: \"98dff839-676a-4268-b3cc-cb4163fb1874\") " pod="openshift-ingress-canary/ingress-canary-4rd2p" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.182485 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd-trusted-ca\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.182556 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/478fecf8-ff15-468e-a5f3-4b49e3e28654-etcd-ca\") pod \"etcd-operator-b45778765-xnp9p\" (UID: \"478fecf8-ff15-468e-a5f3-4b49e3e28654\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xnp9p" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.182573 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9c31e969-aba0-4496-8891-283b9f639973-socket-dir\") pod \"csi-hostpathplugin-xl5dh\" (UID: \"9c31e969-aba0-4496-8891-283b9f639973\") " pod="hostpath-provisioner/csi-hostpathplugin-xl5dh" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.182664 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68w5b\" (UniqueName: \"kubernetes.io/projected/24f553f1-7b7b-4d3e-addf-2b5d1039f176-kube-api-access-68w5b\") pod \"controller-manager-879f6c89f-8wlxw\" (UID: \"24f553f1-7b7b-4d3e-addf-2b5d1039f176\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8wlxw" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.182706 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qmbx\" (UniqueName: \"kubernetes.io/projected/4442fdf4-4257-4f63-a247-5b2926cc5924-kube-api-access-9qmbx\") pod \"catalog-operator-68c6474976-59sgd\" (UID: \"4442fdf4-4257-4f63-a247-5b2926cc5924\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-59sgd" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.182742 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd-registry-tls\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.182764 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42zw8\" (UniqueName: \"kubernetes.io/projected/98dff839-676a-4268-b3cc-cb4163fb1874-kube-api-access-42zw8\") pod \"ingress-canary-4rd2p\" (UID: \"98dff839-676a-4268-b3cc-cb4163fb1874\") " pod="openshift-ingress-canary/ingress-canary-4rd2p" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.182846 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/cc597c79-70a7-489d-b636-5876aa5a0e45-config-volume\") pod \"dns-default-5clct\" (UID: \"cc597c79-70a7-489d-b636-5876aa5a0e45\") " pod="openshift-dns/dns-default-5clct" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.182864 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cc597c79-70a7-489d-b636-5876aa5a0e45-metrics-tls\") pod \"dns-default-5clct\" (UID: \"cc597c79-70a7-489d-b636-5876aa5a0e45\") " pod="openshift-dns/dns-default-5clct" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.203218 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/478fecf8-ff15-468e-a5f3-4b49e3e28654-etcd-service-ca\") pod \"etcd-operator-b45778765-xnp9p\" (UID: \"478fecf8-ff15-468e-a5f3-4b49e3e28654\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xnp9p" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.204391 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa-secret-volume\") pod \"collect-profiles-29404350-52bhz\" (UID: \"d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404350-52bhz" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.204950 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a282fc0-c31c-440e-ae60-555e7e9aea66-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fh26g\" (UID: \"9a282fc0-c31c-440e-ae60-555e7e9aea66\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fh26g" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.205421 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4442fdf4-4257-4f63-a247-5b2926cc5924-profile-collector-cert\") pod \"catalog-operator-68c6474976-59sgd\" (UID: \"4442fdf4-4257-4f63-a247-5b2926cc5924\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-59sgd" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.205885 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24f553f1-7b7b-4d3e-addf-2b5d1039f176-config\") pod \"controller-manager-879f6c89f-8wlxw\" (UID: \"24f553f1-7b7b-4d3e-addf-2b5d1039f176\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8wlxw" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.206123 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4442fdf4-4257-4f63-a247-5b2926cc5924-srv-cert\") pod \"catalog-operator-68c6474976-59sgd\" (UID: \"4442fdf4-4257-4f63-a247-5b2926cc5924\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-59sgd" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.206248 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/34886065-3f55-42b2-820f-13b4d921fb85-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-cc84q\" (UID: \"34886065-3f55-42b2-820f-13b4d921fb85\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-cc84q" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.206498 4954 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/478fecf8-ff15-468e-a5f3-4b49e3e28654-config\") pod \"etcd-operator-b45778765-xnp9p\" (UID: \"478fecf8-ff15-468e-a5f3-4b49e3e28654\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xnp9p" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.209114 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a282fc0-c31c-440e-ae60-555e7e9aea66-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fh26g\" (UID: \"9a282fc0-c31c-440e-ae60-555e7e9aea66\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fh26g" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.211599 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa-config-volume\") pod \"collect-profiles-29404350-52bhz\" (UID: \"d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404350-52bhz" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.206741 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24f553f1-7b7b-4d3e-addf-2b5d1039f176-serving-cert\") pod \"controller-manager-879f6c89f-8wlxw\" (UID: \"24f553f1-7b7b-4d3e-addf-2b5d1039f176\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8wlxw" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.212525 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/478fecf8-ff15-468e-a5f3-4b49e3e28654-etcd-ca\") pod \"etcd-operator-b45778765-xnp9p\" (UID: \"478fecf8-ff15-468e-a5f3-4b49e3e28654\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xnp9p" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.216887 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd-ca-trust-extracted\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:31 crc kubenswrapper[4954]: E1127 16:40:31.217552 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:31.717532905 +0000 UTC m=+143.734973205 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.217555 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd-registry-certificates\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.219421 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjttn\" (UniqueName: \"kubernetes.io/projected/7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd-kube-api-access-xjttn\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.221382 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd-trusted-ca\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.233648 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd-registry-tls\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.233608 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-prlg4"] Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.240381 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/4f13cd59-b0f9-4562-a20b-d3d8f4bca5bb-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-zmv7j\" (UID: \"4f13cd59-b0f9-4562-a20b-d3d8f4bca5bb\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zmv7j" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.250766 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd-installation-pull-secrets\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:31 crc kubenswrapper[4954]: W1127 16:40:31.256192 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4bfdabf8_f787_45ba_916e_a40db8dd9561.slice/crio-b9c5c4d4713429351197b1b05ead45630b6c28f68d8e2973647da218dd068545 WatchSource:0}: Error finding container 
b9c5c4d4713429351197b1b05ead45630b6c28f68d8e2973647da218dd068545: Status 404 returned error can't find the container with id b9c5c4d4713429351197b1b05ead45630b6c28f68d8e2973647da218dd068545 Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.258030 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kx4k\" (UniqueName: \"kubernetes.io/projected/34886065-3f55-42b2-820f-13b4d921fb85-kube-api-access-5kx4k\") pod \"multus-admission-controller-857f4d67dd-cc84q\" (UID: \"34886065-3f55-42b2-820f-13b4d921fb85\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-cc84q" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.260249 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hqqc6"] Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.271376 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9a282fc0-c31c-440e-ae60-555e7e9aea66-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fh26g\" (UID: \"9a282fc0-c31c-440e-ae60-555e7e9aea66\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fh26g" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.271442 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-6lsxk"] Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.274374 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-kpmsg"] Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.290308 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.290778 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvp78\" (UniqueName: \"kubernetes.io/projected/9c31e969-aba0-4496-8891-283b9f639973-kube-api-access-mvp78\") pod \"csi-hostpathplugin-xl5dh\" (UID: \"9c31e969-aba0-4496-8891-283b9f639973\") " pod="hostpath-provisioner/csi-hostpathplugin-xl5dh" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.290819 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/9c31e969-aba0-4496-8891-283b9f639973-csi-data-dir\") pod \"csi-hostpathplugin-xl5dh\" (UID: \"9c31e969-aba0-4496-8891-283b9f639973\") " pod="hostpath-provisioner/csi-hostpathplugin-xl5dh" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.290844 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4a989c69-cca2-452c-987a-ecf2aba3b26d-certs\") pod \"machine-config-server-qm5wz\" (UID: \"4a989c69-cca2-452c-987a-ecf2aba3b26d\") " pod="openshift-machine-config-operator/machine-config-server-qm5wz" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.290914 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jqnq\" (UniqueName: \"kubernetes.io/projected/4a989c69-cca2-452c-987a-ecf2aba3b26d-kube-api-access-9jqnq\") pod \"machine-config-server-qm5wz\" (UID: 
\"4a989c69-cca2-452c-987a-ecf2aba3b26d\") " pod="openshift-machine-config-operator/machine-config-server-qm5wz" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.291021 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9c31e969-aba0-4496-8891-283b9f639973-mountpoint-dir\") pod \"csi-hostpathplugin-xl5dh\" (UID: \"9c31e969-aba0-4496-8891-283b9f639973\") " pod="hostpath-provisioner/csi-hostpathplugin-xl5dh" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.291038 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9c31e969-aba0-4496-8891-283b9f639973-plugins-dir\") pod \"csi-hostpathplugin-xl5dh\" (UID: \"9c31e969-aba0-4496-8891-283b9f639973\") " pod="hostpath-provisioner/csi-hostpathplugin-xl5dh" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.291094 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/98dff839-676a-4268-b3cc-cb4163fb1874-cert\") pod \"ingress-canary-4rd2p\" (UID: \"98dff839-676a-4268-b3cc-cb4163fb1874\") " pod="openshift-ingress-canary/ingress-canary-4rd2p" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.291795 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9c31e969-aba0-4496-8891-283b9f639973-mountpoint-dir\") pod \"csi-hostpathplugin-xl5dh\" (UID: \"9c31e969-aba0-4496-8891-283b9f639973\") " pod="hostpath-provisioner/csi-hostpathplugin-xl5dh" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.292203 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9c31e969-aba0-4496-8891-283b9f639973-plugins-dir\") pod \"csi-hostpathplugin-xl5dh\" (UID: \"9c31e969-aba0-4496-8891-283b9f639973\") " pod="hostpath-provisioner/csi-hostpathplugin-xl5dh" Nov 27 16:40:31 crc kubenswrapper[4954]: E1127 16:40:31.292340 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:31.792320189 +0000 UTC m=+143.809760489 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.292404 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/9c31e969-aba0-4496-8891-283b9f639973-csi-data-dir\") pod \"csi-hostpathplugin-xl5dh\" (UID: \"9c31e969-aba0-4496-8891-283b9f639973\") " pod="hostpath-provisioner/csi-hostpathplugin-xl5dh" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.292769 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/478fecf8-ff15-468e-a5f3-4b49e3e28654-serving-cert\") pod \"etcd-operator-b45778765-xnp9p\" (UID: \"478fecf8-ff15-468e-a5f3-4b49e3e28654\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xnp9p" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.293278 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9c31e969-aba0-4496-8891-283b9f639973-socket-dir\") pod \"csi-hostpathplugin-xl5dh\" (UID: \"9c31e969-aba0-4496-8891-283b9f639973\") " pod="hostpath-provisioner/csi-hostpathplugin-xl5dh" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.293333 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42zw8\" (UniqueName: \"kubernetes.io/projected/98dff839-676a-4268-b3cc-cb4163fb1874-kube-api-access-42zw8\") pod \"ingress-canary-4rd2p\" (UID: \"98dff839-676a-4268-b3cc-cb4163fb1874\") " pod="openshift-ingress-canary/ingress-canary-4rd2p" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.293377 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc597c79-70a7-489d-b636-5876aa5a0e45-config-volume\") pod \"dns-default-5clct\" (UID: \"cc597c79-70a7-489d-b636-5876aa5a0e45\") " pod="openshift-dns/dns-default-5clct" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.293395 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cc597c79-70a7-489d-b636-5876aa5a0e45-metrics-tls\") pod \"dns-default-5clct\" (UID: \"cc597c79-70a7-489d-b636-5876aa5a0e45\") " pod="openshift-dns/dns-default-5clct" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.293421 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v22df\" (UniqueName: \"kubernetes.io/projected/cc597c79-70a7-489d-b636-5876aa5a0e45-kube-api-access-v22df\") pod \"dns-default-5clct\" (UID: \"cc597c79-70a7-489d-b636-5876aa5a0e45\") " pod="openshift-dns/dns-default-5clct" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.293472 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4a989c69-cca2-452c-987a-ecf2aba3b26d-node-bootstrap-token\") pod \"machine-config-server-qm5wz\" (UID: \"4a989c69-cca2-452c-987a-ecf2aba3b26d\") " pod="openshift-machine-config-operator/machine-config-server-qm5wz" Nov 27 
16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.293502 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9c31e969-aba0-4496-8891-283b9f639973-registration-dir\") pod \"csi-hostpathplugin-xl5dh\" (UID: \"9c31e969-aba0-4496-8891-283b9f639973\") " pod="hostpath-provisioner/csi-hostpathplugin-xl5dh" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.293620 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9c31e969-aba0-4496-8891-283b9f639973-registration-dir\") pod \"csi-hostpathplugin-xl5dh\" (UID: \"9c31e969-aba0-4496-8891-283b9f639973\") " pod="hostpath-provisioner/csi-hostpathplugin-xl5dh" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.294046 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9c31e969-aba0-4496-8891-283b9f639973-socket-dir\") pod \"csi-hostpathplugin-xl5dh\" (UID: \"9c31e969-aba0-4496-8891-283b9f639973\") " pod="hostpath-provisioner/csi-hostpathplugin-xl5dh" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.294554 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc597c79-70a7-489d-b636-5876aa5a0e45-config-volume\") pod \"dns-default-5clct\" (UID: \"cc597c79-70a7-489d-b636-5876aa5a0e45\") " pod="openshift-dns/dns-default-5clct" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.297399 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/98dff839-676a-4268-b3cc-cb4163fb1874-cert\") pod \"ingress-canary-4rd2p\" (UID: \"98dff839-676a-4268-b3cc-cb4163fb1874\") " pod="openshift-ingress-canary/ingress-canary-4rd2p" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.298749 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-s8cm2"] Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.299622 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4a989c69-cca2-452c-987a-ecf2aba3b26d-certs\") pod \"machine-config-server-qm5wz\" (UID: \"4a989c69-cca2-452c-987a-ecf2aba3b26d\") " pod="openshift-machine-config-operator/machine-config-server-qm5wz" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.300135 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f44h7"] Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.301008 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cc597c79-70a7-489d-b636-5876aa5a0e45-metrics-tls\") pod \"dns-default-5clct\" (UID: \"cc597c79-70a7-489d-b636-5876aa5a0e45\") " pod="openshift-dns/dns-default-5clct" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.301255 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m77vv"] Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.309576 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hm7jb\" (UniqueName: \"kubernetes.io/projected/d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa-kube-api-access-hm7jb\") pod \"collect-profiles-29404350-52bhz\" (UID: \"d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29404350-52bhz" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.309804 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/478fecf8-ff15-468e-a5f3-4b49e3e28654-etcd-client\") pod \"etcd-operator-b45778765-xnp9p\" (UID: \"478fecf8-ff15-468e-a5f3-4b49e3e28654\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xnp9p" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.312599 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd-bound-sa-token\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.313143 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4a989c69-cca2-452c-987a-ecf2aba3b26d-node-bootstrap-token\") pod \"machine-config-server-qm5wz\" (UID: \"4a989c69-cca2-452c-987a-ecf2aba3b26d\") " pod="openshift-machine-config-operator/machine-config-server-qm5wz" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.322436 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68w5b\" (UniqueName: \"kubernetes.io/projected/24f553f1-7b7b-4d3e-addf-2b5d1039f176-kube-api-access-68w5b\") pod \"controller-manager-879f6c89f-8wlxw\" (UID: \"24f553f1-7b7b-4d3e-addf-2b5d1039f176\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8wlxw" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.353635 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8svq7"] Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.354824 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl4vr\" (UniqueName: \"kubernetes.io/projected/478fecf8-ff15-468e-a5f3-4b49e3e28654-kube-api-access-zl4vr\") pod \"etcd-operator-b45778765-xnp9p\" (UID: \"478fecf8-ff15-468e-a5f3-4b49e3e28654\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xnp9p" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.360636 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qmbx\" (UniqueName: \"kubernetes.io/projected/4442fdf4-4257-4f63-a247-5b2926cc5924-kube-api-access-9qmbx\") pod \"catalog-operator-68c6474976-59sgd\" (UID: \"4442fdf4-4257-4f63-a247-5b2926cc5924\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-59sgd" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.370710 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-8wlxw" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.384266 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-59sgd" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.401739 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fh26g" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.402564 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:31 crc kubenswrapper[4954]: E1127 16:40:31.402939 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:31.902923831 +0000 UTC m=+143.920364131 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.411743 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-cc84q" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.460287 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404350-52bhz" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.464728 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-xnp9p" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.504456 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:40:31 crc kubenswrapper[4954]: E1127 16:40:31.504733 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:32.004706153 +0000 UTC m=+144.022146443 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.505017 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:31 crc kubenswrapper[4954]: E1127 16:40:31.505432 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:32.00541358 +0000 UTC m=+144.022853880 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.568936 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42zw8\" (UniqueName: \"kubernetes.io/projected/98dff839-676a-4268-b3cc-cb4163fb1874-kube-api-access-42zw8\") pod \"ingress-canary-4rd2p\" (UID: \"98dff839-676a-4268-b3cc-cb4163fb1874\") " pod="openshift-ingress-canary/ingress-canary-4rd2p" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.568955 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvp78\" (UniqueName: \"kubernetes.io/projected/9c31e969-aba0-4496-8891-283b9f639973-kube-api-access-mvp78\") pod \"csi-hostpathplugin-xl5dh\" (UID: \"9c31e969-aba0-4496-8891-283b9f639973\") " pod="hostpath-provisioner/csi-hostpathplugin-xl5dh" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.569558 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx25j\" (UniqueName: \"kubernetes.io/projected/4f13cd59-b0f9-4562-a20b-d3d8f4bca5bb-kube-api-access-gx25j\") pod \"control-plane-machine-set-operator-78cbb6b69f-zmv7j\" (UID: \"4f13cd59-b0f9-4562-a20b-d3d8f4bca5bb\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zmv7j" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.571407 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v22df\" (UniqueName: \"kubernetes.io/projected/cc597c79-70a7-489d-b636-5876aa5a0e45-kube-api-access-v22df\") pod \"dns-default-5clct\" (UID: \"cc597c79-70a7-489d-b636-5876aa5a0e45\") " pod="openshift-dns/dns-default-5clct" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.573164 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-9gfl4" 
event={"ID":"3ddfcd6f-4387-40b4-9933-4e169797f6da","Type":"ContainerStarted","Data":"5cef5d94677ed11ce78fc2d52b08eb266dc7a004df60f485693499591efde640"} Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.573212 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-9gfl4" event={"ID":"3ddfcd6f-4387-40b4-9933-4e169797f6da","Type":"ContainerStarted","Data":"e2bc50f55698ecbea50935d7e9fce220a90ee6fba9bd71c2ed5948bd76e52769"} Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.573500 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-9gfl4" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.575379 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jqnq\" (UniqueName: \"kubernetes.io/projected/4a989c69-cca2-452c-987a-ecf2aba3b26d-kube-api-access-9jqnq\") pod \"machine-config-server-qm5wz\" (UID: \"4a989c69-cca2-452c-987a-ecf2aba3b26d\") " pod="openshift-machine-config-operator/machine-config-server-qm5wz" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.578098 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kpmsg" event={"ID":"072ec696-c152-40bb-8783-72920846a193","Type":"ContainerStarted","Data":"ae00e9b0ec2df87e2e5db39d496c6f7471a91978f37976d2d035a7c3a6ddec09"} Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.579673 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f6f2h" event={"ID":"6e521fb1-0565-4f66-a6f0-1b78942e408e","Type":"ContainerStarted","Data":"2fc534f8705309f2620166ee7fc3142efa6f4e04a29d482671b009aec4225021"} Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.584217 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k9v6k" event={"ID":"6f943860-2a4f-44af-9695-4497a2a8fdd8","Type":"ContainerStarted","Data":"07697d15e39e341c9f5178281c46ed668a1765982d856b8d434e780d22accbeb"} Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.584278 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k9v6k" event={"ID":"6f943860-2a4f-44af-9695-4497a2a8fdd8","Type":"ContainerStarted","Data":"d3b03f51866b787d8e2399e10b7a75f62a68e28cc4774e1a1798c260402eaf2e"} Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.594232 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mjllc" event={"ID":"ac17f7ac-8454-45e6-af33-a29113eb0d66","Type":"ContainerStarted","Data":"0be82764f4232639284059e3be7284e3dc9a0b20a91868e2adc317d8bcb64d3d"} Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.597381 4954 generic.go:334] "Generic (PLEG): container finished" podID="0da11e6f-c84f-4d72-83cc-9bb32480b3d2" containerID="f6d30132e3e120e03e055c9af7318c3ed19b8aea431bf333a9585a1e9816580d" exitCode=0 Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.597426 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ltms" event={"ID":"0da11e6f-c84f-4d72-83cc-9bb32480b3d2","Type":"ContainerDied","Data":"f6d30132e3e120e03e055c9af7318c3ed19b8aea431bf333a9585a1e9816580d"} Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.602704 4954 patch_prober.go:28] interesting 
pod/console-operator-58897d9998-9gfl4 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.602761 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-9gfl4" podUID="3ddfcd6f-4387-40b4-9933-4e169797f6da" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.606229 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hqqc6" event={"ID":"54811547-c0f2-4b3e-8e07-6b6c878d72ee","Type":"ContainerStarted","Data":"f48ad8ad4dcb2b4dbc5e523498ceb28eb4a2adf28f7738b69b3954af4bf9d60f"} Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.606463 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:40:31 crc kubenswrapper[4954]: E1127 16:40:31.606993 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:32.106974026 +0000 UTC m=+144.124414326 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.624178 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-h48pg" event={"ID":"daf9759f-1f7d-4613-b734-a39f4552222e","Type":"ContainerStarted","Data":"77e73a24085ea4aa4c7ffee07f0109774a16944bd7d73ce1fd0a24dab23d122f"} Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.624223 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-h48pg" event={"ID":"daf9759f-1f7d-4613-b734-a39f4552222e","Type":"ContainerStarted","Data":"58c3c6859e9f5bd49f8d5e133cd1861990011216694a41ad68a670de73932b9d"} Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.625470 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-2jvzc" event={"ID":"80debf9d-f71d-491f-b914-82597c9d3162","Type":"ContainerStarted","Data":"89aa60fae3644b941e3ac28027628dacd701751bd821fa6bd23268d9c516487d"} Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.627600 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xqdst" event={"ID":"9bd46eeb-25a4-4e67-97ad-96c21224fbcd","Type":"ContainerStarted","Data":"784ec0f5611e151d5e90358a0badeb9f20c96ff504456af91aee6519f2aad23d"} Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.645462 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-prlg4" event={"ID":"4bfdabf8-f787-45ba-916e-a40db8dd9561","Type":"ContainerStarted","Data":"b9c5c4d4713429351197b1b05ead45630b6c28f68d8e2973647da218dd068545"} Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.646860 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mn7g4" event={"ID":"56bfbfaa-4d26-4361-87fc-dab870bdff96","Type":"ContainerStarted","Data":"5cb57900a30e92ead73b25d24319b73f4731a4ae7acd166b5e1af26a5e55f93b"} Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.700974 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lsxk" event={"ID":"5d68d684-47a3-490a-bafb-9c8f04f0d3fb","Type":"ContainerStarted","Data":"954c5be805f7eba3d525bc89a1c1812e209193bfb61d9e6d694269b19dc4d250"} Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.703611 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-htccg" event={"ID":"3e7aebe6-3e4f-498f-a696-5e23f9fe313d","Type":"ContainerStarted","Data":"a96a5a532909543c14b61978a73590410918b7573e697e011a582ddd5a739ed7"} Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.707662 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-62hrj" event={"ID":"c9c39245-291c-4f46-88ef-80e78b1c7bae","Type":"ContainerStarted","Data":"73c759de26c5dcc5e29a5b3f8741cdff48adea04d15254f5b325d27a87fe9dc2"} Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 
16:40:31.709137 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:31 crc kubenswrapper[4954]: E1127 16:40:31.711222 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:32.211205428 +0000 UTC m=+144.228645728 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.712454 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" event={"ID":"916d4ddd-2cd9-4595-a1e1-88f0b3908c95","Type":"ContainerStarted","Data":"5867e39f98c0847ec931dbc4254e3a456302ff93919cf1a90906d758b3fd757b"} Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.713257 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.728060 4954 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-6m2df container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" start-of-body= Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.728107 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" podUID="916d4ddd-2cd9-4595-a1e1-88f0b3908c95" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.792763 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zmv7j" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.794952 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-xl5dh" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.801344 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-qm5wz" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.808920 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4rd2p" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.811510 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:40:31 crc kubenswrapper[4954]: E1127 16:40:31.811722 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:32.311694807 +0000 UTC m=+144.329135107 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.812188 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:31 crc kubenswrapper[4954]: E1127 16:40:31.812494 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:32.312485748 +0000 UTC m=+144.329926048 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.816344 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5clct" Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.916712 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:40:31 crc kubenswrapper[4954]: E1127 16:40:31.916889 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-27 16:40:32.416852203 +0000 UTC m=+144.434292503 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.917433 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:31 crc kubenswrapper[4954]: E1127 16:40:31.918059 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:32.418042163 +0000 UTC m=+144.435482463 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.926478 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qmz7n"] Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.952835 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bp7nq"] Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.979343 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6h576"] Nov 27 16:40:31 crc kubenswrapper[4954]: I1127 16:40:31.990704 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-m78xr"] Nov 27 16:40:32 crc kubenswrapper[4954]: I1127 16:40:32.019596 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:40:32 crc kubenswrapper[4954]: E1127 16:40:32.020736 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:32.520708047 +0000 UTC m=+144.538148357 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:32 crc kubenswrapper[4954]: I1127 16:40:32.033347 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-nngrv"] Nov 27 16:40:32 crc kubenswrapper[4954]: I1127 16:40:32.035096 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lfnws"] Nov 27 16:40:32 crc kubenswrapper[4954]: I1127 16:40:32.122537 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:32 crc kubenswrapper[4954]: E1127 16:40:32.123005 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:32.62298187 +0000 UTC m=+144.640422340 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:32 crc kubenswrapper[4954]: I1127 16:40:32.210231 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-f9qbf"] Nov 27 16:40:32 crc kubenswrapper[4954]: I1127 16:40:32.216120 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-t8hmz"] Nov 27 16:40:32 crc kubenswrapper[4954]: I1127 16:40:32.225370 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:40:32 crc kubenswrapper[4954]: E1127 16:40:32.225571 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:32.725545041 +0000 UTC m=+144.742985341 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:32 crc kubenswrapper[4954]: I1127 16:40:32.225712 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:32 crc kubenswrapper[4954]: E1127 16:40:32.226075 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:32.726067904 +0000 UTC m=+144.743508204 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:32 crc kubenswrapper[4954]: I1127 16:40:32.327182 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:40:32 crc kubenswrapper[4954]: E1127 16:40:32.327675 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:32.827659551 +0000 UTC m=+144.845099851 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:32 crc kubenswrapper[4954]: I1127 16:40:32.430762 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:32 crc kubenswrapper[4954]: E1127 16:40:32.431171 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:32.931157995 +0000 UTC m=+144.948598295 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:32 crc kubenswrapper[4954]: I1127 16:40:32.431167 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-62hrj" podStartSLOduration=124.431151495 podStartE2EDuration="2m4.431151495s" podCreationTimestamp="2025-11-27 16:38:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:40:32.397550732 +0000 UTC m=+144.414991052" watchObservedRunningTime="2025-11-27 16:40:32.431151495 +0000 UTC m=+144.448591795" Nov 27 16:40:32 crc kubenswrapper[4954]: W1127 16:40:32.477557 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a989c69_cca2_452c_987a_ecf2aba3b26d.slice/crio-c066aac41794406ff02ac8e4401d8fd682955d435178a7578ed46fc6a0345a55 WatchSource:0}: Error finding container c066aac41794406ff02ac8e4401d8fd682955d435178a7578ed46fc6a0345a55: Status 404 returned error can't find the container with id c066aac41794406ff02ac8e4401d8fd682955d435178a7578ed46fc6a0345a55 Nov 27 16:40:32 crc kubenswrapper[4954]: I1127 16:40:32.490757 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-cc84q"] Nov 27 16:40:32 crc kubenswrapper[4954]: I1127 16:40:32.493852 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cv9bx"] Nov 27 16:40:32 crc kubenswrapper[4954]: I1127 16:40:32.499160 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fh26g"] Nov 27 16:40:32 crc kubenswrapper[4954]: I1127 16:40:32.536991 4954 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:40:32 crc kubenswrapper[4954]: E1127 16:40:32.537568 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:33.037536901 +0000 UTC m=+145.054977201 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:32 crc kubenswrapper[4954]: I1127 16:40:32.640041 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:32 crc kubenswrapper[4954]: E1127 16:40:32.641281 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:33.141263921 +0000 UTC m=+145.158704211 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:32 crc kubenswrapper[4954]: I1127 16:40:32.656063 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" podStartSLOduration=124.656025261 podStartE2EDuration="2m4.656025261s" podCreationTimestamp="2025-11-27 16:38:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:40:32.652499053 +0000 UTC m=+144.669939353" watchObservedRunningTime="2025-11-27 16:40:32.656025261 +0000 UTC m=+144.673465561" Nov 27 16:40:32 crc kubenswrapper[4954]: I1127 16:40:32.742694 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:40:32 crc kubenswrapper[4954]: E1127 16:40:32.743136 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:33.243078683 +0000 UTC m=+145.260518983 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:32 crc kubenswrapper[4954]: I1127 16:40:32.746462 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:32 crc kubenswrapper[4954]: E1127 16:40:32.761379 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:33.261351452 +0000 UTC m=+145.278791752 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:32 crc kubenswrapper[4954]: I1127 16:40:32.766348 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zmv7j"] Nov 27 16:40:32 crc kubenswrapper[4954]: I1127 16:40:32.795870 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fh26g" event={"ID":"9a282fc0-c31c-440e-ae60-555e7e9aea66","Type":"ContainerStarted","Data":"ddcc464f86c617902c7a8be549aa46148aa7264aef1ef90275b2086e0e19767c"} Nov 27 16:40:32 crc kubenswrapper[4954]: I1127 16:40:32.807546 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8wlxw"] Nov 27 16:40:32 crc kubenswrapper[4954]: I1127 16:40:32.819982 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8svq7" event={"ID":"369df47c-55e0-41da-bb67-b99bb189b870","Type":"ContainerStarted","Data":"584425cb9f98c6be58ef57297cb35adae4c7f43606fb99695b273f91ab5847d5"} Nov 27 16:40:32 crc kubenswrapper[4954]: I1127 16:40:32.823416 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xnp9p"] Nov 27 16:40:32 crc kubenswrapper[4954]: I1127 16:40:32.836064 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404350-52bhz"] Nov 27 16:40:32 crc kubenswrapper[4954]: W1127 16:40:32.839840 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f13cd59_b0f9_4562_a20b_d3d8f4bca5bb.slice/crio-328862df51f66ee5bd9800a18c24864f2e84c523a37a0a6f3e84a688f2feb6b8 WatchSource:0}: Error finding container 328862df51f66ee5bd9800a18c24864f2e84c523a37a0a6f3e84a688f2feb6b8: Status 404 returned error can't find the container with id 328862df51f66ee5bd9800a18c24864f2e84c523a37a0a6f3e84a688f2feb6b8 Nov 27 16:40:32 crc kubenswrapper[4954]: I1127 16:40:32.840754 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" event={"ID":"916d4ddd-2cd9-4595-a1e1-88f0b3908c95","Type":"ContainerStarted","Data":"d820bf3856dbc7fc0be89e1ddbaa00e3acacb889302bb242dc1720cad3f5dc34"} Nov 27 16:40:32 crc kubenswrapper[4954]: I1127 16:40:32.840803 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tj896"] Nov 27 16:40:32 crc kubenswrapper[4954]: I1127 16:40:32.877907 4954 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-6m2df container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" start-of-body= Nov 27 16:40:32 crc kubenswrapper[4954]: I1127 16:40:32.877969 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" 
podUID="916d4ddd-2cd9-4595-a1e1-88f0b3908c95" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" Nov 27 16:40:32 crc kubenswrapper[4954]: I1127 16:40:32.885883 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:40:32 crc kubenswrapper[4954]: E1127 16:40:32.888012 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:33.387977135 +0000 UTC m=+145.405417435 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:32 crc kubenswrapper[4954]: I1127 16:40:32.907954 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lfnws" event={"ID":"b335decc-f67f-47e1-bee9-8d3033151b92","Type":"ContainerStarted","Data":"8b49e6a422558a4088dfab0b7c498ad7f0739267d2888f124cd9a7a26aac0d16"} Nov 27 16:40:32 crc kubenswrapper[4954]: I1127 16:40:32.912335 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kpmsg" event={"ID":"072ec696-c152-40bb-8783-72920846a193","Type":"ContainerStarted","Data":"1c5b9ee2e53ab126bae8dc392cfed7c82623f76ae03d8179dd3626fe5ee1037c"} Nov 27 16:40:32 crc kubenswrapper[4954]: I1127 16:40:32.915595 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nngrv" event={"ID":"ba3f0ebc-a2f7-42bc-9ad9-89a12081fdcf","Type":"ContainerStarted","Data":"3bb8d1d9adabc405406c70f435f94091fcb0b8ad434daf88f2933bb2111b3f47"} Nov 27 16:40:32 crc kubenswrapper[4954]: I1127 16:40:32.927122 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-59sgd"] Nov 27 16:40:32 crc kubenswrapper[4954]: I1127 16:40:32.933217 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-s8cm2" event={"ID":"7a3c2a78-4ced-43d5-a3b7-25637f36d2fc","Type":"ContainerStarted","Data":"058d075931949c57176ad3d1cd47955163a6467499a7b25bf5dd0cbcbb7e056f"} Nov 27 16:40:32 crc kubenswrapper[4954]: I1127 16:40:32.938205 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-cc84q" event={"ID":"34886065-3f55-42b2-820f-13b4d921fb85","Type":"ContainerStarted","Data":"d108059255634ca92ef4da15b1eed46ec9f094e5e84a6d3f6a03a0c638e289e9"} Nov 27 16:40:32 crc kubenswrapper[4954]: W1127 16:40:32.953631 4954 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc597c79_70a7_489d_b636_5876aa5a0e45.slice/crio-4f955d52668a65934144967007a1a97b4e8b3f6167c4406a5a67fc9103f02a08 WatchSource:0}: Error finding container 4f955d52668a65934144967007a1a97b4e8b3f6167c4406a5a67fc9103f02a08: Status 404 returned error can't find the container with id 4f955d52668a65934144967007a1a97b4e8b3f6167c4406a5a67fc9103f02a08 Nov 27 16:40:32 crc kubenswrapper[4954]: I1127 16:40:32.959632 4954 generic.go:334] "Generic (PLEG): container finished" podID="5d68d684-47a3-490a-bafb-9c8f04f0d3fb" containerID="8f4a90d18aa470d0d32e2275f6d702c675a5428a774d86dd0a5f3c95dd7f4120" exitCode=0 Nov 27 16:40:32 crc kubenswrapper[4954]: I1127 16:40:32.959864 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lsxk" event={"ID":"5d68d684-47a3-490a-bafb-9c8f04f0d3fb","Type":"ContainerDied","Data":"8f4a90d18aa470d0d32e2275f6d702c675a5428a774d86dd0a5f3c95dd7f4120"} Nov 27 16:40:32 crc kubenswrapper[4954]: I1127 16:40:32.961956 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5clct"] Nov 27 16:40:32 crc kubenswrapper[4954]: I1127 16:40:32.976046 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4rd2p"] Nov 27 16:40:32 crc kubenswrapper[4954]: I1127 16:40:32.984627 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k9v6k" podStartSLOduration=123.984604397 podStartE2EDuration="2m3.984604397s" podCreationTimestamp="2025-11-27 16:38:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:40:32.953344664 +0000 UTC m=+144.970784984" watchObservedRunningTime="2025-11-27 16:40:32.984604397 +0000 UTC m=+145.002044697" Nov 27 16:40:32 crc kubenswrapper[4954]: I1127 16:40:32.986913 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mjllc" event={"ID":"ac17f7ac-8454-45e6-af33-a29113eb0d66","Type":"ContainerStarted","Data":"e51e8ae39efc1ac71afa658ab1c9d0cd88cf6a8372a2921eba7b40bd35d87dc8"} Nov 27 16:40:32 crc kubenswrapper[4954]: I1127 16:40:32.988127 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:32 crc kubenswrapper[4954]: E1127 16:40:32.990901 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:33.490880124 +0000 UTC m=+145.508320424 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:32 crc kubenswrapper[4954]: I1127 16:40:32.991212 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-prlg4" event={"ID":"4bfdabf8-f787-45ba-916e-a40db8dd9561","Type":"ContainerStarted","Data":"bf914977e28639b80d26c945eac9ec07100bafe10c4c9b150888940d686faedd"} Nov 27 16:40:32 crc kubenswrapper[4954]: I1127 16:40:32.997240 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6h576" event={"ID":"58db67ba-0f90-4190-8beb-02489a6e6a1a","Type":"ContainerStarted","Data":"b524c2d2f1ca6c2859e6722382f9778dfa7c7d115c3f502ed1760820bf8cbbd4"} Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.003196 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xqdst" event={"ID":"9bd46eeb-25a4-4e67-97ad-96c21224fbcd","Type":"ContainerStarted","Data":"326f9fab5c31c5d697f5eff1ae09b31e5748e5968005754fe954be4e16de1c6d"} Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.006782 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-f9qbf" event={"ID":"7ac0eaee-22bb-4194-bb62-0622b65c778b","Type":"ContainerStarted","Data":"14c7a424d86fb8575d70f6ac1d2ef70d25259e69ef63734f2cb4e7ff70f79c4f"} Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.013371 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cv9bx" event={"ID":"ce757d32-af85-4142-9beb-95ac115d61d7","Type":"ContainerStarted","Data":"89e6756a6f531fc8eab74ab36e9191fff730009da274d77f480c5c9bba77d216"} Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.017171 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qmz7n" event={"ID":"a08ef380-6670-415c-9861-71c9161f1a4c","Type":"ContainerStarted","Data":"ed1a0de66b1b47772a8c20e0f6bf4d953b3f42ac4f7572ce4541c9394e166e2a"} Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.018311 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m77vv" event={"ID":"2fe78cc8-8ce0-4cdf-9dcf-a15624194cab","Type":"ContainerStarted","Data":"6b676017ddb5cd3907ec220282f85128e2f356623dd03d377743e8336302c711"} Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.019378 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-2jvzc" event={"ID":"80debf9d-f71d-491f-b914-82597c9d3162","Type":"ContainerStarted","Data":"ec8c7314643a2782b4b8eeb62825cde641f4a4bb7861ba7ff5e8ee7f6a46f359"} Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.021272 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bp7nq" 
event={"ID":"58eb1d01-1f82-43fa-8ace-86368d05ec71","Type":"ContainerStarted","Data":"da83689aa49fad4c9e9364dda412a05f765bf9f30121e1e2653e8e5afb770d67"} Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.022496 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hqqc6" event={"ID":"54811547-c0f2-4b3e-8e07-6b6c878d72ee","Type":"ContainerStarted","Data":"e8caeb7dc5c6705e6b23be79333f96c90840fd6abd20fb09b3e06f78a53cf255"} Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.035790 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-xl5dh"] Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.042333 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-9gfl4" podStartSLOduration=125.042315573 podStartE2EDuration="2m5.042315573s" podCreationTimestamp="2025-11-27 16:38:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:40:33.03818634 +0000 UTC m=+145.055626640" watchObservedRunningTime="2025-11-27 16:40:33.042315573 +0000 UTC m=+145.059755873" Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.054022 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-62hrj" event={"ID":"c9c39245-291c-4f46-88ef-80e78b1c7bae","Type":"ContainerStarted","Data":"d73ae080c708749d439612801bc4df2f39f62a59e577745ca9170a44fa619a1e"} Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.061108 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f44h7" event={"ID":"a6bd5d5a-d026-46f4-8467-993d9a1a3a59","Type":"ContainerStarted","Data":"015810f75f6dfc768eda2663883015df77c08d3b2e2aec21496c10f19e683df7"} Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.061175 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f44h7" event={"ID":"a6bd5d5a-d026-46f4-8467-993d9a1a3a59","Type":"ContainerStarted","Data":"850fa303a0e48e4d675fbe2ee7b1faac1cd899cbd29002fafa3f1d4f54a4ac79"} Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.062279 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f44h7" Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.065696 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mn7g4" event={"ID":"56bfbfaa-4d26-4361-87fc-dab870bdff96","Type":"ContainerStarted","Data":"550fc7acf212783805134d348088956e145dcb4fe8547f342a6cfd8a90c39170"} Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.065803 4954 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-f44h7 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.065846 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f44h7" podUID="a6bd5d5a-d026-46f4-8467-993d9a1a3a59" containerName="olm-operator" probeResult="failure" output="Get 
\"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.067636 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-h48pg" event={"ID":"daf9759f-1f7d-4613-b734-a39f4552222e","Type":"ContainerStarted","Data":"0149382210da2f67768dfc426c133cc7e3c168b16b67863f269b583212b95a85"} Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.069768 4954 generic.go:334] "Generic (PLEG): container finished" podID="3e7aebe6-3e4f-498f-a696-5e23f9fe313d" containerID="ed768899b59d99d44755e40e6b60e14f18b333cd9c3f04f309ff952704b80f3f" exitCode=0 Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.069822 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-htccg" event={"ID":"3e7aebe6-3e4f-498f-a696-5e23f9fe313d","Type":"ContainerDied","Data":"ed768899b59d99d44755e40e6b60e14f18b333cd9c3f04f309ff952704b80f3f"} Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.075923 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-t8hmz" event={"ID":"2e80b2c6-58fd-466e-a5fe-d16bf1f4c7bd","Type":"ContainerStarted","Data":"3d9e89bcc9628fc7708754474d77fab547d3d6f94ff106333a8f16f361953acc"} Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.079029 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-2jvzc" Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.085478 4954 patch_prober.go:28] interesting pod/router-default-5444994796-2jvzc container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.086002 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2jvzc" podUID="80debf9d-f71d-491f-b914-82597c9d3162" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Nov 27 16:40:33 crc kubenswrapper[4954]: E1127 16:40:33.089073 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:33.589044585 +0000 UTC m=+145.606484885 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.089108 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.089451 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:33 crc kubenswrapper[4954]: E1127 16:40:33.090006 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:33.589994208 +0000 UTC m=+145.607434508 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.093247 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ltms" event={"ID":"0da11e6f-c84f-4d72-83cc-9bb32480b3d2","Type":"ContainerStarted","Data":"a83b3cdb5557de4523c666d5018a494491d8c9b6adc8adc7d4d62013e921a79b"} Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.094134 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ltms" Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.097091 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f6f2h" event={"ID":"6e521fb1-0565-4f66-a6f0-1b78942e408e","Type":"ContainerStarted","Data":"f467d62914eade0f151113915f0669ca492deef458ab407c5bef188eaf9a166c"} Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.097955 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f6f2h" Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.113090 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-qm5wz" 
event={"ID":"4a989c69-cca2-452c-987a-ecf2aba3b26d","Type":"ContainerStarted","Data":"c066aac41794406ff02ac8e4401d8fd682955d435178a7578ed46fc6a0345a55"} Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.116031 4954 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-f6f2h container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.116102 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f6f2h" podUID="6e521fb1-0565-4f66-a6f0-1b78942e408e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.141769 4954 patch_prober.go:28] interesting pod/console-operator-58897d9998-9gfl4 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.141831 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-9gfl4" podUID="3ddfcd6f-4387-40b4-9933-4e169797f6da" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused" Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.140848 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-m78xr" event={"ID":"6606df87-becb-460d-8579-22c5eb23e71a","Type":"ContainerStarted","Data":"1300023fea17821b6b72b4467f9a2c226abdc670657105b7612f4ad7f10f53b5"} Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.197363 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:40:33 crc kubenswrapper[4954]: E1127 16:40:33.198850 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:33.698833157 +0000 UTC m=+145.716273457 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.236289 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xqdst" podStartSLOduration=125.236267125 podStartE2EDuration="2m5.236267125s" podCreationTimestamp="2025-11-27 16:38:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:40:33.230349586 +0000 UTC m=+145.247789886" watchObservedRunningTime="2025-11-27 16:40:33.236267125 +0000 UTC m=+145.253707425" Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.299601 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:33 crc kubenswrapper[4954]: E1127 16:40:33.299957 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:33.799943711 +0000 UTC m=+145.817384021 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.382019 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-s8cm2" podStartSLOduration=124.382001578 podStartE2EDuration="2m4.382001578s" podCreationTimestamp="2025-11-27 16:38:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:40:33.379522736 +0000 UTC m=+145.396963036" watchObservedRunningTime="2025-11-27 16:40:33.382001578 +0000 UTC m=+145.399441878" Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.400648 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:40:33 crc kubenswrapper[4954]: E1127 16:40:33.400794 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:33.900776138 +0000 UTC m=+145.918216428 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.401162 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:33 crc kubenswrapper[4954]: E1127 16:40:33.401401 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:33.901394354 +0000 UTC m=+145.918834654 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.484275 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hqqc6" podStartSLOduration=124.484258321 podStartE2EDuration="2m4.484258321s" podCreationTimestamp="2025-11-27 16:38:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:40:33.483232385 +0000 UTC m=+145.500672685" watchObservedRunningTime="2025-11-27 16:40:33.484258321 +0000 UTC m=+145.501698621" Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.544659 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:40:33 crc kubenswrapper[4954]: E1127 16:40:33.545163 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:34.045139497 +0000 UTC m=+146.062579797 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.646380 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:33 crc kubenswrapper[4954]: E1127 16:40:33.646868 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:34.146850307 +0000 UTC m=+146.164290607 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.674330 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-2jvzc" podStartSLOduration=124.674307155 podStartE2EDuration="2m4.674307155s" podCreationTimestamp="2025-11-27 16:38:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:40:33.62227789 +0000 UTC m=+145.639718210" watchObservedRunningTime="2025-11-27 16:40:33.674307155 +0000 UTC m=+145.691747455" Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.752745 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:40:33 crc kubenswrapper[4954]: E1127 16:40:33.753214 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:34.253199882 +0000 UTC m=+146.270640182 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.864942 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:33 crc kubenswrapper[4954]: E1127 16:40:33.865643 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:34.36562561 +0000 UTC m=+146.383065910 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.892864 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f6f2h" podStartSLOduration=124.892559965 podStartE2EDuration="2m4.892559965s" podCreationTimestamp="2025-11-27 16:38:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:40:33.861365523 +0000 UTC m=+145.878805823" watchObservedRunningTime="2025-11-27 16:40:33.892559965 +0000 UTC m=+145.910000265" Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.897755 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-h48pg" podStartSLOduration=124.897743585 podStartE2EDuration="2m4.897743585s" podCreationTimestamp="2025-11-27 16:38:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:40:33.890800061 +0000 UTC m=+145.908240361" watchObservedRunningTime="2025-11-27 16:40:33.897743585 +0000 UTC m=+145.915183885" Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.943254 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ltms" podStartSLOduration=125.943227046 podStartE2EDuration="2m5.943227046s" podCreationTimestamp="2025-11-27 16:38:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:40:33.942091097 +0000 UTC m=+145.959531397" watchObservedRunningTime="2025-11-27 16:40:33.943227046 +0000 UTC m=+145.960667346" Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.976990 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:40:33 crc kubenswrapper[4954]: E1127 16:40:33.977396 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:34.477358001 +0000 UTC m=+146.494798311 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.977716 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:33 crc kubenswrapper[4954]: E1127 16:40:33.978719 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:34.478705465 +0000 UTC m=+146.496145765 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:33 crc kubenswrapper[4954]: I1127 16:40:33.987904 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f44h7" podStartSLOduration=124.987873995 podStartE2EDuration="2m4.987873995s" podCreationTimestamp="2025-11-27 16:38:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:40:33.975072053 +0000 UTC m=+145.992512353" watchObservedRunningTime="2025-11-27 16:40:33.987873995 +0000 UTC m=+146.005314305" Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.081099 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:40:34 crc kubenswrapper[4954]: E1127 16:40:34.081974 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:34.581951373 +0000 UTC m=+146.599391673 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.109824 4954 patch_prober.go:28] interesting pod/router-default-5444994796-2jvzc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 27 16:40:34 crc kubenswrapper[4954]: [-]has-synced failed: reason withheld Nov 27 16:40:34 crc kubenswrapper[4954]: [+]process-running ok Nov 27 16:40:34 crc kubenswrapper[4954]: healthz check failed Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.109930 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2jvzc" podUID="80debf9d-f71d-491f-b914-82597c9d3162" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.182790 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:34 crc kubenswrapper[4954]: E1127 16:40:34.183320 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:34.683292983 +0000 UTC m=+146.700733463 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.270999 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-m78xr" event={"ID":"6606df87-becb-460d-8579-22c5eb23e71a","Type":"ContainerStarted","Data":"aa9d257b7fc730af0da859de119eefad637d5761418f0395ade306fa159f7c75"} Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.271444 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-m78xr" Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.273226 4954 patch_prober.go:28] interesting pod/downloads-7954f5f757-m78xr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.273275 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-m78xr" podUID="6606df87-becb-460d-8579-22c5eb23e71a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.278668 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-xnp9p" event={"ID":"478fecf8-ff15-468e-a5f3-4b49e3e28654","Type":"ContainerStarted","Data":"121589e6d0a18f0d376f36ba57342f551f08462b3388611305d96a54ff5a8888"} Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.290038 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:40:34 crc kubenswrapper[4954]: E1127 16:40:34.290570 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:34.790550002 +0000 UTC m=+146.807990302 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.293644 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bp7nq" event={"ID":"58eb1d01-1f82-43fa-8ace-86368d05ec71","Type":"ContainerStarted","Data":"80865b796b7d3faffdac2955cc27b14623ceb8f0d1fa4bb287ae233a617260b2"} Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.295189 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5clct" event={"ID":"cc597c79-70a7-489d-b636-5876aa5a0e45","Type":"ContainerStarted","Data":"4f955d52668a65934144967007a1a97b4e8b3f6167c4406a5a67fc9103f02a08"} Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.346378 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bp7nq" podStartSLOduration=125.34635262 podStartE2EDuration="2m5.34635262s" podCreationTimestamp="2025-11-27 16:38:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:40:34.344890883 +0000 UTC m=+146.362331193" watchObservedRunningTime="2025-11-27 16:40:34.34635262 +0000 UTC m=+146.363792930" Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.359797 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-m78xr" podStartSLOduration=125.359766616 podStartE2EDuration="2m5.359766616s" podCreationTimestamp="2025-11-27 16:38:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:40:34.305204338 +0000 UTC m=+146.322644638" watchObservedRunningTime="2025-11-27 16:40:34.359766616 +0000 UTC m=+146.377206906" Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.396732 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-59sgd" event={"ID":"4442fdf4-4257-4f63-a247-5b2926cc5924","Type":"ContainerStarted","Data":"40c058a1ab60c61afe8729a2f92b4bd498043c01db0e1b23a2407928f564bc40"} Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.396818 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-59sgd" event={"ID":"4442fdf4-4257-4f63-a247-5b2926cc5924","Type":"ContainerStarted","Data":"098c9a45a3dfcd0db931504892355dc2d8a034d0660e2ac86f895ec29b5d8bc1"} Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.397446 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-59sgd" Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.400513 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:34 crc kubenswrapper[4954]: E1127 16:40:34.408206 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:34.90818171 +0000 UTC m=+146.925622010 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.422956 4954 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-59sgd container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.423026 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-59sgd" podUID="4442fdf4-4257-4f63-a247-5b2926cc5924" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.505015 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mn7g4" event={"ID":"56bfbfaa-4d26-4361-87fc-dab870bdff96","Type":"ContainerStarted","Data":"60712fb480a79001125dd073a412363d0b06a76a57a38771d7bef6af4c2c6cca"} Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.510969 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:40:34 crc kubenswrapper[4954]: E1127 16:40:34.512643 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:35.012599237 +0000 UTC m=+147.030039537 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.549092 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-59sgd" podStartSLOduration=125.549075421 podStartE2EDuration="2m5.549075421s" podCreationTimestamp="2025-11-27 16:38:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:40:34.480358249 +0000 UTC m=+146.497798549" watchObservedRunningTime="2025-11-27 16:40:34.549075421 +0000 UTC m=+146.566515721" Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.550678 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mn7g4" podStartSLOduration=126.550672171 podStartE2EDuration="2m6.550672171s" podCreationTimestamp="2025-11-27 16:38:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:40:34.549850881 +0000 UTC m=+146.567291191" watchObservedRunningTime="2025-11-27 16:40:34.550672171 +0000 UTC m=+146.568112471" Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.583501 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404350-52bhz" event={"ID":"d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa","Type":"ContainerStarted","Data":"2cd9c14dc50069203d92f84d7626f7c9a0ac759f9713903594d5e1224b4d9f0b"} Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.599642 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nngrv" event={"ID":"ba3f0ebc-a2f7-42bc-9ad9-89a12081fdcf","Type":"ContainerStarted","Data":"5e3cb32ac1ff41af0977067e836fdbca77bee8dd5454245310fa42cda251cbd1"} Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.613243 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:34 crc kubenswrapper[4954]: E1127 16:40:34.619551 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:35.119531948 +0000 UTC m=+147.136972248 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.636752 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-prlg4" event={"ID":"4bfdabf8-f787-45ba-916e-a40db8dd9561","Type":"ContainerStarted","Data":"67e83d01435d8de31aa75eb7de534bbf534312a1def3dda399a8cad1326a28e0"} Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.716575 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:40:34 crc kubenswrapper[4954]: E1127 16:40:34.720424 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:35.220388996 +0000 UTC m=+147.237829296 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.745417 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fh26g" event={"ID":"9a282fc0-c31c-440e-ae60-555e7e9aea66","Type":"ContainerStarted","Data":"18810e525aa3158a3401c13ab37cfb2a95440bd10d3474942a7120a7b18b7e93"} Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.752419 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29404350-52bhz" podStartSLOduration=126.752399518 podStartE2EDuration="2m6.752399518s" podCreationTimestamp="2025-11-27 16:38:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:40:34.647013996 +0000 UTC m=+146.664454286" watchObservedRunningTime="2025-11-27 16:40:34.752399518 +0000 UTC m=+146.769839818" Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.754558 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-8wlxw" event={"ID":"24f553f1-7b7b-4d3e-addf-2b5d1039f176","Type":"ContainerStarted","Data":"10568d07ed2af5fafeeec0a95d7590fcc65eebb5b22bb8f0a64a67abe1e7fb30"} Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.755769 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-controller-manager/controller-manager-879f6c89f-8wlxw" Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.779821 4954 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-8wlxw container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.779890 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-8wlxw" podUID="24f553f1-7b7b-4d3e-addf-2b5d1039f176" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.784977 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-f9qbf" event={"ID":"7ac0eaee-22bb-4194-bb62-0622b65c778b","Type":"ContainerStarted","Data":"4d0b4fd2d3181d9bf26c047a5ccc826b5dc7f72f9b53df451b97c42128eab703"} Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.823452 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:34 crc kubenswrapper[4954]: E1127 16:40:34.826296 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:35.326282799 +0000 UTC m=+147.343723099 (durationBeforeRetry 500ms). 
Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.871221 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fh26g" podStartSLOduration=125.871201545 podStartE2EDuration="2m5.871201545s" podCreationTimestamp="2025-11-27 16:38:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:40:34.869991616 +0000 UTC m=+146.887431916" watchObservedRunningTime="2025-11-27 16:40:34.871201545 +0000 UTC m=+146.888641845"
Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.871429 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-prlg4" podStartSLOduration=125.871424592 podStartE2EDuration="2m5.871424592s" podCreationTimestamp="2025-11-27 16:38:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:40:34.754030039 +0000 UTC m=+146.771470339" watchObservedRunningTime="2025-11-27 16:40:34.871424592 +0000 UTC m=+146.888864892"
Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.880559 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8svq7" event={"ID":"369df47c-55e0-41da-bb67-b99bb189b870","Type":"ContainerStarted","Data":"54f565ca7daa6f515a7826dce5a13722dfe1df23b2c562897aa2846277ab91c5"}
Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.924838 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m77vv" event={"ID":"2fe78cc8-8ce0-4cdf-9dcf-a15624194cab","Type":"ContainerStarted","Data":"b0d8f5ee463fba4aca98101756ef4d13515fd9a5a1c125830a0dbb4f9594c38e"}
Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.924888 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m77vv" event={"ID":"2fe78cc8-8ce0-4cdf-9dcf-a15624194cab","Type":"ContainerStarted","Data":"8e4b26aa4ce677efd4b4100681ec9814eb07d5dd3e0d7485daba9e676637853a"}
Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.925594 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m77vv"
Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.926782 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 27 16:40:34 crc kubenswrapper[4954]: E1127 16:40:34.928062 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:35.42804418 +0000 UTC m=+147.445484480 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.949158 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tj896" event={"ID":"25fce237-9917-431e-b345-ef24a715fd12","Type":"ContainerStarted","Data":"bcaf759e68cddbe1b10560143f22c6aa3da8a0d4540973b41136d63cc7c62665"}
Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.949206 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tj896" event={"ID":"25fce237-9917-431e-b345-ef24a715fd12","Type":"ContainerStarted","Data":"58bba95dda510bd5a88a062f0179aea2c5e5a790cb9d19d53b62610f10b258b0"}
Nov 27 16:40:34 crc kubenswrapper[4954]: I1127 16:40:34.989732 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-f9qbf" podStartSLOduration=125.989712466 podStartE2EDuration="2m5.989712466s" podCreationTimestamp="2025-11-27 16:38:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:40:34.948799191 +0000 UTC m=+146.966239491" watchObservedRunningTime="2025-11-27 16:40:34.989712466 +0000 UTC m=+147.007152776"
Nov 27 16:40:35 crc kubenswrapper[4954]: I1127 16:40:35.028611 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm"
Nov 27 16:40:35 crc kubenswrapper[4954]: E1127 16:40:35.029911 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:35.529893584 +0000 UTC m=+147.547333884 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 27 16:40:35 crc kubenswrapper[4954]: I1127 16:40:35.046329 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kpmsg" event={"ID":"072ec696-c152-40bb-8783-72920846a193","Type":"ContainerStarted","Data":"c7bb09f498777120be254872d983b925497fbd530169b913244d1c6092400ebc"}
Nov 27 16:40:35 crc kubenswrapper[4954]: I1127 16:40:35.050658 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-t8hmz" event={"ID":"2e80b2c6-58fd-466e-a5fe-d16bf1f4c7bd","Type":"ContainerStarted","Data":"1bccec2b27b1b1de6b944e3084ba030fe8facf7779b505ddd12661ae38a4b1d7"}
Nov 27 16:40:35 crc kubenswrapper[4954]: I1127 16:40:35.062008 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-8wlxw" podStartSLOduration=126.061991868 podStartE2EDuration="2m6.061991868s" podCreationTimestamp="2025-11-27 16:38:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:40:34.990105646 +0000 UTC m=+147.007545946" watchObservedRunningTime="2025-11-27 16:40:35.061991868 +0000 UTC m=+147.079432168"
Nov 27 16:40:35 crc kubenswrapper[4954]: I1127 16:40:35.063616 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tj896" podStartSLOduration=127.063609109 podStartE2EDuration="2m7.063609109s" podCreationTimestamp="2025-11-27 16:38:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:40:35.061162687 +0000 UTC m=+147.078602987" watchObservedRunningTime="2025-11-27 16:40:35.063609109 +0000 UTC m=+147.081049409"
Nov 27 16:40:35 crc kubenswrapper[4954]: I1127 16:40:35.083924 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cv9bx" event={"ID":"ce757d32-af85-4142-9beb-95ac115d61d7","Type":"ContainerStarted","Data":"724599edb92812c73b23d08f07d15bbb1462cc51918b17073860b3f8568b2715"}
Nov 27 16:40:35 crc kubenswrapper[4954]: I1127 16:40:35.085210 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cv9bx"
Nov 27 16:40:35 crc kubenswrapper[4954]: I1127 16:40:35.101644 4954 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-cv9bx container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:5443/healthz\": dial tcp 10.217.0.32:5443: connect: connection refused" start-of-body=
Nov 27 16:40:35 crc kubenswrapper[4954]: I1127 16:40:35.101717 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cv9bx" podUID="ce757d32-af85-4142-9beb-95ac115d61d7" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.32:5443/healthz\": dial tcp 10.217.0.32:5443: connect: connection refused"
Nov 27 16:40:35 crc kubenswrapper[4954]: I1127 16:40:35.105058 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8svq7" podStartSLOduration=126.105048257 podStartE2EDuration="2m6.105048257s" podCreationTimestamp="2025-11-27 16:38:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:40:35.102733099 +0000 UTC m=+147.120173399" watchObservedRunningTime="2025-11-27 16:40:35.105048257 +0000 UTC m=+147.122488557"
Nov 27 16:40:35 crc kubenswrapper[4954]: I1127 16:40:35.110962 4954 patch_prober.go:28] interesting pod/router-default-5444994796-2jvzc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Nov 27 16:40:35 crc kubenswrapper[4954]: [-]has-synced failed: reason withheld
Nov 27 16:40:35 crc kubenswrapper[4954]: [+]process-running ok
Nov 27 16:40:35 crc kubenswrapper[4954]: healthz check failed
Nov 27 16:40:35 crc kubenswrapper[4954]: I1127 16:40:35.111054 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2jvzc" podUID="80debf9d-f71d-491f-b914-82597c9d3162" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 27 16:40:35 crc kubenswrapper[4954]: I1127 16:40:35.141694 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zmv7j" event={"ID":"4f13cd59-b0f9-4562-a20b-d3d8f4bca5bb","Type":"ContainerStarted","Data":"328862df51f66ee5bd9800a18c24864f2e84c523a37a0a6f3e84a688f2feb6b8"}
Nov 27 16:40:35 crc kubenswrapper[4954]: I1127 16:40:35.142921 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 27 16:40:35 crc kubenswrapper[4954]: E1127 16:40:35.144455 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:35.644433945 +0000 UTC m=+147.661874245 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 27 16:40:35 crc kubenswrapper[4954]: I1127 16:40:35.157873 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lfnws" event={"ID":"b335decc-f67f-47e1-bee9-8d3033151b92","Type":"ContainerStarted","Data":"10e43175b974aaadb44b132dea3c8d2ce0e1ea94188351c35bf49c6457ef519a"}
Nov 27 16:40:35 crc kubenswrapper[4954]: I1127 16:40:35.163673 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m77vv" podStartSLOduration=126.163645896 podStartE2EDuration="2m6.163645896s" podCreationTimestamp="2025-11-27 16:38:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:40:35.157520642 +0000 UTC m=+147.174960952" watchObservedRunningTime="2025-11-27 16:40:35.163645896 +0000 UTC m=+147.181086186"
Nov 27 16:40:35 crc kubenswrapper[4954]: I1127 16:40:35.194963 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qmz7n" event={"ID":"a08ef380-6670-415c-9861-71c9161f1a4c","Type":"ContainerStarted","Data":"759a4732493d7e795dce581798cff3449b618dae5b18e27e0bf25d64cdccbadb"}
Nov 27 16:40:35 crc kubenswrapper[4954]: I1127 16:40:35.195484 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-qmz7n"
Nov 27 16:40:35 crc kubenswrapper[4954]: I1127 16:40:35.196681 4954 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-qmz7n container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body=
Nov 27 16:40:35 crc kubenswrapper[4954]: I1127 16:40:35.196718 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-qmz7n" podUID="a08ef380-6670-415c-9861-71c9161f1a4c" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused"
Nov 27 16:40:35 crc kubenswrapper[4954]: I1127 16:40:35.207261 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xl5dh" event={"ID":"9c31e969-aba0-4496-8891-283b9f639973","Type":"ContainerStarted","Data":"505cfd691dfb3441a9860a2c96afae200cbccf183fdaab4435cdd8b5ce4949c4"}
Nov 27 16:40:35 crc kubenswrapper[4954]: I1127 16:40:35.214024 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lfnws" podStartSLOduration=126.214009128 podStartE2EDuration="2m6.214009128s" podCreationTimestamp="2025-11-27 16:38:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:40:35.181795911 +0000 UTC m=+147.199236211" watchObservedRunningTime="2025-11-27 16:40:35.214009128 +0000 UTC m=+147.231449428"
Nov 27 16:40:35 crc kubenswrapper[4954]: I1127 16:40:35.215814 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-t8hmz" podStartSLOduration=126.215803744 podStartE2EDuration="2m6.215803744s" podCreationTimestamp="2025-11-27 16:38:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:40:35.2129145 +0000 UTC m=+147.230354800" watchObservedRunningTime="2025-11-27 16:40:35.215803744 +0000 UTC m=+147.233244044"
Nov 27 16:40:35 crc kubenswrapper[4954]: I1127 16:40:35.222059 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-s8cm2" event={"ID":"7a3c2a78-4ced-43d5-a3b7-25637f36d2fc","Type":"ContainerStarted","Data":"08231d0ba782fcfb8bc3cf9c5180fb9b6e01d8ca06b7c651482abbafc4f67f27"}
Nov 27 16:40:35 crc kubenswrapper[4954]: I1127 16:40:35.265724 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm"
Nov 27 16:40:35 crc kubenswrapper[4954]: E1127 16:40:35.266142 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:35.766124844 +0000 UTC m=+147.783565144 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 27 16:40:35 crc kubenswrapper[4954]: I1127 16:40:35.280651 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6h576" event={"ID":"58db67ba-0f90-4190-8beb-02489a6e6a1a","Type":"ContainerStarted","Data":"2ba858124177153d4d12d2437e1c52572d8890e6e22557bb41a9d3c7b124dd94"}
Nov 27 16:40:35 crc kubenswrapper[4954]: I1127 16:40:35.292084 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-qm5wz" event={"ID":"4a989c69-cca2-452c-987a-ecf2aba3b26d","Type":"ContainerStarted","Data":"28d73f390af9b71cd2ed5c3aaa999645340cbda21f3f7254a1667dbb4ad8c017"}
Nov 27 16:40:35 crc kubenswrapper[4954]: I1127 16:40:35.317510 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zmv7j" podStartSLOduration=126.317484492 podStartE2EDuration="2m6.317484492s" podCreationTimestamp="2025-11-27 16:38:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:40:35.291997483 +0000 UTC m=+147.309437783" watchObservedRunningTime="2025-11-27 16:40:35.317484492 +0000 UTC m=+147.334924792"
Nov 27 16:40:35 crc kubenswrapper[4954]: I1127 16:40:35.372131 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 27 16:40:35 crc kubenswrapper[4954]: I1127 16:40:35.372800 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cv9bx" podStartSLOduration=126.372766808 podStartE2EDuration="2m6.372766808s" podCreationTimestamp="2025-11-27 16:38:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:40:35.343121324 +0000 UTC m=+147.360561624" watchObservedRunningTime="2025-11-27 16:40:35.372766808 +0000 UTC m=+147.390207098"
Nov 27 16:40:35 crc kubenswrapper[4954]: E1127 16:40:35.374126 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:35.874100521 +0000 UTC m=+147.891540971 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 27 16:40:35 crc kubenswrapper[4954]: I1127 16:40:35.377094 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kpmsg" podStartSLOduration=126.377083466 podStartE2EDuration="2m6.377083466s" podCreationTimestamp="2025-11-27 16:38:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:40:35.372405158 +0000 UTC m=+147.389845448" watchObservedRunningTime="2025-11-27 16:40:35.377083466 +0000 UTC m=+147.394523776"
Nov 27 16:40:35 crc kubenswrapper[4954]: I1127 16:40:35.393695 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4rd2p" event={"ID":"98dff839-676a-4268-b3cc-cb4163fb1874","Type":"ContainerStarted","Data":"d68ab249fa1d52e34a9909a292b732dbb202188da05c3f1c0b377b8db3b159f1"}
Nov 27 16:40:35 crc kubenswrapper[4954]: I1127 16:40:35.404088 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-6m2df"
Nov 27 16:40:35 crc kubenswrapper[4954]: I1127 16:40:35.414895 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f6f2h"
Nov 27 16:40:35 crc kubenswrapper[4954]: I1127 16:40:35.431510 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-qm5wz" podStartSLOduration=7.43148251 podStartE2EDuration="7.43148251s" podCreationTimestamp="2025-11-27 16:40:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:40:35.43073346 +0000 UTC m=+147.448173760" watchObservedRunningTime="2025-11-27 16:40:35.43148251 +0000 UTC m=+147.448922810"
Nov 27 16:40:35 crc kubenswrapper[4954]: I1127 16:40:35.443441 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f44h7"
Nov 27 16:40:35 crc kubenswrapper[4954]: I1127 16:40:35.460070 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-qmz7n" podStartSLOduration=126.460053145 podStartE2EDuration="2m6.460053145s" podCreationTimestamp="2025-11-27 16:38:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:40:35.457341577 +0000 UTC m=+147.474781877" watchObservedRunningTime="2025-11-27 16:40:35.460053145 +0000 UTC m=+147.477493445"
Nov 27 16:40:35 crc kubenswrapper[4954]: I1127 16:40:35.475510 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm"
Nov 27 16:40:35 crc kubenswrapper[4954]: E1127 16:40:35.483966 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:35.983945314 +0000 UTC m=+148.001385614 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 27 16:40:35 crc kubenswrapper[4954]: I1127 16:40:35.495606 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6h576" podStartSLOduration=126.495571615 podStartE2EDuration="2m6.495571615s" podCreationTimestamp="2025-11-27 16:38:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:40:35.492841257 +0000 UTC m=+147.510281557" watchObservedRunningTime="2025-11-27 16:40:35.495571615 +0000 UTC m=+147.513011915"
Nov 27 16:40:35 crc kubenswrapper[4954]: I1127 16:40:35.581199 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 27 16:40:35 crc kubenswrapper[4954]: E1127 16:40:35.581876 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:36.081853728 +0000 UTC m=+148.099294028 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 27 16:40:35 crc kubenswrapper[4954]: I1127 16:40:35.652444 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-4rd2p" podStartSLOduration=7.652421127 podStartE2EDuration="7.652421127s" podCreationTimestamp="2025-11-27 16:40:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:40:35.603620693 +0000 UTC m=+147.621061003" watchObservedRunningTime="2025-11-27 16:40:35.652421127 +0000 UTC m=+147.669861427"
Nov 27 16:40:35 crc kubenswrapper[4954]: I1127 16:40:35.683413 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm"
Nov 27 16:40:35 crc kubenswrapper[4954]: E1127 16:40:35.683946 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:36.183929587 +0000 UTC m=+148.201369887 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 27 16:40:35 crc kubenswrapper[4954]: I1127 16:40:35.784893 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 27 16:40:35 crc kubenswrapper[4954]: E1127 16:40:35.785276 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:36.285260317 +0000 UTC m=+148.302700617 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 27 16:40:35 crc kubenswrapper[4954]: I1127 16:40:35.886818 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm"
Nov 27 16:40:35 crc kubenswrapper[4954]: E1127 16:40:35.887241 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:36.387224922 +0000 UTC m=+148.404665222 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 27 16:40:35 crc kubenswrapper[4954]: I1127 16:40:35.988674 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 27 16:40:35 crc kubenswrapper[4954]: I1127 16:40:35.989035 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 27 16:40:35 crc kubenswrapper[4954]: E1127 16:40:35.990247 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:36.490218714 +0000 UTC m=+148.507659014 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:35.995548 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.086874 4954 patch_prober.go:28] interesting pod/router-default-5444994796-2jvzc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Nov 27 16:40:36 crc kubenswrapper[4954]: [-]has-synced failed: reason withheld
Nov 27 16:40:36 crc kubenswrapper[4954]: [+]process-running ok
Nov 27 16:40:36 crc kubenswrapper[4954]: healthz check failed
Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.086959 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2jvzc" podUID="80debf9d-f71d-491f-b914-82597c9d3162" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.090911 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.091023 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.091067 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.091101 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm"
Nov 27 16:40:36 crc kubenswrapper[4954]: E1127 16:40:36.091512 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:36.591495113 +0000 UTC m=+148.608935413 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.097665 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.100026 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.100099 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.105775 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ltms"
Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.183889 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.193996 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.198101 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 27 16:40:36 crc kubenswrapper[4954]: E1127 16:40:36.198509 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:36.698493274 +0000 UTC m=+148.715933564 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.221259 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.302621 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm"
Nov 27 16:40:36 crc kubenswrapper[4954]: E1127 16:40:36.303461 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:36.803429455 +0000 UTC m=+148.820869895 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.409389 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 27 16:40:36 crc kubenswrapper[4954]: E1127 16:40:36.409890 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:36.909867563 +0000 UTC m=+148.927307853 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
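The router's startup-probe body above uses the aggregated healthz convention: one "[-]name failed: reason withheld" or "[+]name ok" line per sub-check, then an overall verdict served with HTTP 500 when any check fails, which is what trips the probe. A small sketch that reproduces the shape of that output; the check names are the ones in the log, the pass/fail values are illustrative.

```go
// Sketch of the aggregated healthz output format seen in the router
// startup-probe body: per-check status lines followed by a verdict.
package main

import "fmt"

func main() {
	checks := []struct {
		name string
		ok   bool
	}{
		{"backend-http", false},
		{"has-synced", false},
		{"process-running", true},
	}
	failed := false
	for _, c := range checks {
		if c.ok {
			fmt.Printf("[+]%s ok\n", c.name)
		} else {
			fmt.Printf("[-]%s failed: reason withheld\n", c.name)
			failed = true
		}
	}
	if failed {
		fmt.Println("healthz check failed") // served as HTTP 500, hence "statuscode: 500"
	}
}
```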
Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.432589 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nngrv" event={"ID":"ba3f0ebc-a2f7-42bc-9ad9-89a12081fdcf","Type":"ContainerStarted","Data":"f8c68e4f09aa8dbb3a7e43b913eee47be00735353328f8857c7a372cd9ab6e5d"}
Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.472439 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nngrv" podStartSLOduration=127.47240082 podStartE2EDuration="2m7.47240082s" podCreationTimestamp="2025-11-27 16:38:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:40:36.465223301 +0000 UTC m=+148.482663601" watchObservedRunningTime="2025-11-27 16:40:36.47240082 +0000 UTC m=+148.489841120"
Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.482257 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6h576" event={"ID":"58db67ba-0f90-4190-8beb-02489a6e6a1a","Type":"ContainerStarted","Data":"c6c7e504ec4917ae0a4e4e173a4884e86c284595bb7f470175e06d915bd5d86d"}
Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.511767 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm"
Nov 27 16:40:36 crc kubenswrapper[4954]: E1127 16:40:36.512473 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:37.012450065 +0000 UTC m=+149.029890365 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.528438 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4rd2p" event={"ID":"98dff839-676a-4268-b3cc-cb4163fb1874","Type":"ContainerStarted","Data":"563dd40f40f0d9fe8af912ea20709b356fbb071ac4d60b76ad4a9bb2e7d1ea23"}
Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.546155 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5clct" event={"ID":"cc597c79-70a7-489d-b636-5876aa5a0e45","Type":"ContainerStarted","Data":"7ca8f71417ce82c0a13ab455d499290501e676dbb1607c727a5df93803e569b3"}
Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.546224 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5clct" event={"ID":"cc597c79-70a7-489d-b636-5876aa5a0e45","Type":"ContainerStarted","Data":"6217e3cdb0c32ddd601615c78fe14515ffe792c4b543306d097c02f4d424a6a5"}
Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.546246 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-5clct"
Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.563680 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mjllc" event={"ID":"ac17f7ac-8454-45e6-af33-a29113eb0d66","Type":"ContainerStarted","Data":"b9ab09602cfc7f0dae3c29c413aeb0567e02614e0e73d7c7b6bc01fdd5f8ddaa"}
Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.579032 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lsxk" event={"ID":"5d68d684-47a3-490a-bafb-9c8f04f0d3fb","Type":"ContainerStarted","Data":"c5ec4feb7302ca000e7c7c93fc00115f95b45128752d7f1eaa4f4ed44b479153"}
Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.605346 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-5clct" podStartSLOduration=8.605317342 podStartE2EDuration="8.605317342s" podCreationTimestamp="2025-11-27 16:40:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:40:36.597217009 +0000 UTC m=+148.614657329" watchObservedRunningTime="2025-11-27 16:40:36.605317342 +0000 UTC m=+148.622757642"
Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.609525 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-htccg" event={"ID":"3e7aebe6-3e4f-498f-a696-5e23f9fe313d","Type":"ContainerStarted","Data":"c56bec4ca4c848846543e75ea3f21893f0d32738a0211da0205f5f426c439c09"}
Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.609600 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-htccg" event={"ID":"3e7aebe6-3e4f-498f-a696-5e23f9fe313d","Type":"ContainerStarted","Data":"eff551c70c47bfbac1ddde4c0ad485e14b4c1ce41a01a70606676f616ed48fbe"}
Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.612429 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 27 16:40:36 crc kubenswrapper[4954]: E1127 16:40:36.615341 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:37.115319923 +0000 UTC m=+149.132760223 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.637788 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xl5dh" event={"ID":"9c31e969-aba0-4496-8891-283b9f639973","Type":"ContainerStarted","Data":"002b6185e5c104a8498f14643a0f86942cb1e70c5d5f0b7074e237fe45cf2119"}
Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.691784 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-xnp9p" event={"ID":"478fecf8-ff15-468e-a5f3-4b49e3e28654","Type":"ContainerStarted","Data":"13d04c24891284fecc591c7c0bc7febe8f55c4c4e3f69c672f60e87e2ef884c7"}
Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.715670 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm"
Nov 27 16:40:36 crc kubenswrapper[4954]: E1127 16:40:36.719978 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:37.219960236 +0000 UTC m=+149.237400526 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.720748 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zmv7j" event={"ID":"4f13cd59-b0f9-4562-a20b-d3d8f4bca5bb","Type":"ContainerStarted","Data":"33cd71f9ca8365a647b73f1420691c201f86e1535e2d76841b239150d0daf96a"}
Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.759525 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-cc84q" event={"ID":"34886065-3f55-42b2-820f-13b4d921fb85","Type":"ContainerStarted","Data":"72a4478cb3f9bc0336f8e7e7ddfedb5bada9cc0f3b3de73cf6c64e0ad02e7eaa"}
Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.759599 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-cc84q" event={"ID":"34886065-3f55-42b2-820f-13b4d921fb85","Type":"ContainerStarted","Data":"f9d534a8d23f5eff8ad23b2bc1d1b816e915cf0c071454340be7893f411d89e7"}
Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.773456 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-mjllc" podStartSLOduration=127.773432196 podStartE2EDuration="2m7.773432196s" podCreationTimestamp="2025-11-27 16:38:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:40:36.684921928 +0000 UTC m=+148.702362228" watchObservedRunningTime="2025-11-27 16:40:36.773432196 +0000 UTC m=+148.790872496"
Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.787812 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404350-52bhz" event={"ID":"d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa","Type":"ContainerStarted","Data":"9d834dbdd90a2ed8601aa0cf2877e09ac939740a736753205e09592479fa4681"}
Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.807369 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-8wlxw" event={"ID":"24f553f1-7b7b-4d3e-addf-2b5d1039f176","Type":"ContainerStarted","Data":"ce7951a9306b662396c84e314cad126080d4ed8fb027a5c3883f10c25c66cea7"}
Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.808971 4954 patch_prober.go:28] interesting pod/downloads-7954f5f757-m78xr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body=
Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.809039 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-m78xr" podUID="6606df87-becb-460d-8579-22c5eb23e71a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused"
Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.818713 4954 reconciler_common.go:159]
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.821511 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-59sgd" Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.824280 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-qmz7n" Nov 27 16:40:36 crc kubenswrapper[4954]: E1127 16:40:36.821629 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:37.321610063 +0000 UTC m=+149.339050363 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.824402 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:36 crc kubenswrapper[4954]: E1127 16:40:36.826146 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:37.326132447 +0000 UTC m=+149.343572747 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.841139 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lsxk" podStartSLOduration=127.841113412 podStartE2EDuration="2m7.841113412s" podCreationTimestamp="2025-11-27 16:38:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:40:36.82108792 +0000 UTC m=+148.838528220" watchObservedRunningTime="2025-11-27 16:40:36.841113412 +0000 UTC m=+148.858553712" Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.850140 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-8wlxw" Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.866828 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cv9bx" Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.926212 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:40:36 crc kubenswrapper[4954]: E1127 16:40:36.927569 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:37.427545789 +0000 UTC m=+149.444986089 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:36 crc kubenswrapper[4954]: I1127 16:40:36.973175 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-htccg" podStartSLOduration=128.973148912 podStartE2EDuration="2m8.973148912s" podCreationTimestamp="2025-11-27 16:38:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:40:36.963860259 +0000 UTC m=+148.981300559" watchObservedRunningTime="2025-11-27 16:40:36.973148912 +0000 UTC m=+148.990589212" Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.029734 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:37 crc kubenswrapper[4954]: E1127 16:40:37.030171 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:37.53015165 +0000 UTC m=+149.547591950 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.092294 4954 patch_prober.go:28] interesting pod/router-default-5444994796-2jvzc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 27 16:40:37 crc kubenswrapper[4954]: [-]has-synced failed: reason withheld Nov 27 16:40:37 crc kubenswrapper[4954]: [+]process-running ok Nov 27 16:40:37 crc kubenswrapper[4954]: healthz check failed Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.092405 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2jvzc" podUID="80debf9d-f71d-491f-b914-82597c9d3162" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.131058 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:40:37 crc kubenswrapper[4954]: E1127 16:40:37.131542 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:37.631519142 +0000 UTC m=+149.648959442 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.186511 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-cc84q" podStartSLOduration=128.186490459 podStartE2EDuration="2m8.186490459s" podCreationTimestamp="2025-11-27 16:38:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:40:37.116834753 +0000 UTC m=+149.134275043" watchObservedRunningTime="2025-11-27 16:40:37.186490459 +0000 UTC m=+149.203930759" Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.234337 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:37 crc kubenswrapper[4954]: E1127 16:40:37.234748 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:37.734736178 +0000 UTC m=+149.752176478 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.235059 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-xnp9p" podStartSLOduration=128.235030946 podStartE2EDuration="2m8.235030946s" podCreationTimestamp="2025-11-27 16:38:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:40:37.233510888 +0000 UTC m=+149.250951208" watchObservedRunningTime="2025-11-27 16:40:37.235030946 +0000 UTC m=+149.252471246" Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.248088 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wdwtv"] Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.249215 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wdwtv" Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.265095 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.285524 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wdwtv"] Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.335101 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.335456 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl6wz\" (UniqueName: \"kubernetes.io/projected/51999cf2-62d7-4ee2-ae9f-b1ac606facb5-kube-api-access-pl6wz\") pod \"certified-operators-wdwtv\" (UID: \"51999cf2-62d7-4ee2-ae9f-b1ac606facb5\") " pod="openshift-marketplace/certified-operators-wdwtv" Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.335484 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51999cf2-62d7-4ee2-ae9f-b1ac606facb5-utilities\") pod \"certified-operators-wdwtv\" (UID: \"51999cf2-62d7-4ee2-ae9f-b1ac606facb5\") " pod="openshift-marketplace/certified-operators-wdwtv" Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.335505 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51999cf2-62d7-4ee2-ae9f-b1ac606facb5-catalog-content\") pod \"certified-operators-wdwtv\" (UID: \"51999cf2-62d7-4ee2-ae9f-b1ac606facb5\") " pod="openshift-marketplace/certified-operators-wdwtv" Nov 27 16:40:37 crc kubenswrapper[4954]: E1127 16:40:37.335648 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:37.835629308 +0000 UTC m=+149.853069608 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.397364 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4jtcn"] Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.400662 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4jtcn" Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.407800 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.413018 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4jtcn"] Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.438222 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl6wz\" (UniqueName: \"kubernetes.io/projected/51999cf2-62d7-4ee2-ae9f-b1ac606facb5-kube-api-access-pl6wz\") pod \"certified-operators-wdwtv\" (UID: \"51999cf2-62d7-4ee2-ae9f-b1ac606facb5\") " pod="openshift-marketplace/certified-operators-wdwtv" Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.438270 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51999cf2-62d7-4ee2-ae9f-b1ac606facb5-utilities\") pod \"certified-operators-wdwtv\" (UID: \"51999cf2-62d7-4ee2-ae9f-b1ac606facb5\") " pod="openshift-marketplace/certified-operators-wdwtv" Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.438295 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51999cf2-62d7-4ee2-ae9f-b1ac606facb5-catalog-content\") pod \"certified-operators-wdwtv\" (UID: \"51999cf2-62d7-4ee2-ae9f-b1ac606facb5\") " pod="openshift-marketplace/certified-operators-wdwtv" Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.438338 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:37 crc kubenswrapper[4954]: E1127 16:40:37.438680 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:37.93866703 +0000 UTC m=+149.956107330 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.439426 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51999cf2-62d7-4ee2-ae9f-b1ac606facb5-utilities\") pod \"certified-operators-wdwtv\" (UID: \"51999cf2-62d7-4ee2-ae9f-b1ac606facb5\") " pod="openshift-marketplace/certified-operators-wdwtv" Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.439724 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51999cf2-62d7-4ee2-ae9f-b1ac606facb5-catalog-content\") pod \"certified-operators-wdwtv\" (UID: \"51999cf2-62d7-4ee2-ae9f-b1ac606facb5\") " pod="openshift-marketplace/certified-operators-wdwtv" Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.499731 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl6wz\" (UniqueName: \"kubernetes.io/projected/51999cf2-62d7-4ee2-ae9f-b1ac606facb5-kube-api-access-pl6wz\") pod \"certified-operators-wdwtv\" (UID: \"51999cf2-62d7-4ee2-ae9f-b1ac606facb5\") " pod="openshift-marketplace/certified-operators-wdwtv" Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.542378 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.542756 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8v4w\" (UniqueName: \"kubernetes.io/projected/09166f72-95b5-44d5-b265-705e11740e0c-kube-api-access-n8v4w\") pod \"community-operators-4jtcn\" (UID: \"09166f72-95b5-44d5-b265-705e11740e0c\") " pod="openshift-marketplace/community-operators-4jtcn" Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.542798 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09166f72-95b5-44d5-b265-705e11740e0c-catalog-content\") pod \"community-operators-4jtcn\" (UID: \"09166f72-95b5-44d5-b265-705e11740e0c\") " pod="openshift-marketplace/community-operators-4jtcn" Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.542829 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09166f72-95b5-44d5-b265-705e11740e0c-utilities\") pod \"community-operators-4jtcn\" (UID: \"09166f72-95b5-44d5-b265-705e11740e0c\") " pod="openshift-marketplace/community-operators-4jtcn" Nov 27 16:40:37 crc kubenswrapper[4954]: E1127 16:40:37.542973 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-27 16:40:38.042952015 +0000 UTC m=+150.060392315 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.576898 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wdwtv" Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.598491 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gflq5"] Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.599478 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gflq5" Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.602919 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gflq5"] Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.645314 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.645386 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8v4w\" (UniqueName: \"kubernetes.io/projected/09166f72-95b5-44d5-b265-705e11740e0c-kube-api-access-n8v4w\") pod \"community-operators-4jtcn\" (UID: \"09166f72-95b5-44d5-b265-705e11740e0c\") " pod="openshift-marketplace/community-operators-4jtcn" Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.645419 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09166f72-95b5-44d5-b265-705e11740e0c-catalog-content\") pod \"community-operators-4jtcn\" (UID: \"09166f72-95b5-44d5-b265-705e11740e0c\") " pod="openshift-marketplace/community-operators-4jtcn" Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.645452 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09166f72-95b5-44d5-b265-705e11740e0c-utilities\") pod \"community-operators-4jtcn\" (UID: \"09166f72-95b5-44d5-b265-705e11740e0c\") " pod="openshift-marketplace/community-operators-4jtcn" Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.645886 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09166f72-95b5-44d5-b265-705e11740e0c-utilities\") pod \"community-operators-4jtcn\" (UID: \"09166f72-95b5-44d5-b265-705e11740e0c\") " pod="openshift-marketplace/community-operators-4jtcn" Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.646105 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/09166f72-95b5-44d5-b265-705e11740e0c-catalog-content\") pod \"community-operators-4jtcn\" (UID: \"09166f72-95b5-44d5-b265-705e11740e0c\") " pod="openshift-marketplace/community-operators-4jtcn" Nov 27 16:40:37 crc kubenswrapper[4954]: E1127 16:40:37.646357 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:38.146343745 +0000 UTC m=+150.163784045 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.696414 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8v4w\" (UniqueName: \"kubernetes.io/projected/09166f72-95b5-44d5-b265-705e11740e0c-kube-api-access-n8v4w\") pod \"community-operators-4jtcn\" (UID: \"09166f72-95b5-44d5-b265-705e11740e0c\") " pod="openshift-marketplace/community-operators-4jtcn" Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.729070 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4jtcn" Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.745917 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.746180 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83-catalog-content\") pod \"certified-operators-gflq5\" (UID: \"b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83\") " pod="openshift-marketplace/certified-operators-gflq5" Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.746216 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csk6m\" (UniqueName: \"kubernetes.io/projected/b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83-kube-api-access-csk6m\") pod \"certified-operators-gflq5\" (UID: \"b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83\") " pod="openshift-marketplace/certified-operators-gflq5" Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.746248 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83-utilities\") pod \"certified-operators-gflq5\" (UID: \"b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83\") " pod="openshift-marketplace/certified-operators-gflq5" Nov 27 16:40:37 crc kubenswrapper[4954]: E1127 16:40:37.746385 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2025-11-27 16:40:38.246369703 +0000 UTC m=+150.263810003 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.776379 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5sxql"] Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.777948 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5sxql" Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.786224 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5sxql"] Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.850345 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.850392 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a256eb1-104a-4da8-b3e7-90eb5c475460-utilities\") pod \"community-operators-5sxql\" (UID: \"6a256eb1-104a-4da8-b3e7-90eb5c475460\") " pod="openshift-marketplace/community-operators-5sxql" Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.850413 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwsbm\" (UniqueName: \"kubernetes.io/projected/6a256eb1-104a-4da8-b3e7-90eb5c475460-kube-api-access-dwsbm\") pod \"community-operators-5sxql\" (UID: \"6a256eb1-104a-4da8-b3e7-90eb5c475460\") " pod="openshift-marketplace/community-operators-5sxql" Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.850474 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a256eb1-104a-4da8-b3e7-90eb5c475460-catalog-content\") pod \"community-operators-5sxql\" (UID: \"6a256eb1-104a-4da8-b3e7-90eb5c475460\") " pod="openshift-marketplace/community-operators-5sxql" Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.850492 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83-catalog-content\") pod \"certified-operators-gflq5\" (UID: \"b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83\") " pod="openshift-marketplace/certified-operators-gflq5" Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.850515 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csk6m\" (UniqueName: \"kubernetes.io/projected/b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83-kube-api-access-csk6m\") pod \"certified-operators-gflq5\" (UID: 
\"b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83\") " pod="openshift-marketplace/certified-operators-gflq5" Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.850543 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83-utilities\") pod \"certified-operators-gflq5\" (UID: \"b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83\") " pod="openshift-marketplace/certified-operators-gflq5" Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.850964 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83-utilities\") pod \"certified-operators-gflq5\" (UID: \"b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83\") " pod="openshift-marketplace/certified-operators-gflq5" Nov 27 16:40:37 crc kubenswrapper[4954]: E1127 16:40:37.851225 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:38.351213631 +0000 UTC m=+150.368653931 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.851567 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83-catalog-content\") pod \"certified-operators-gflq5\" (UID: \"b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83\") " pod="openshift-marketplace/certified-operators-gflq5" Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.871282 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"2d035ea709383e170a1ad2fb5c901005e588ef632239c7b39c1c2822d8bf2386"} Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.871349 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"f1c4f689794a693e8e5f5f89a78351ec1b32cd6bab3bc6967694cef80584315d"} Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.872472 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.898592 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xl5dh" event={"ID":"9c31e969-aba0-4496-8891-283b9f639973","Type":"ContainerStarted","Data":"9405bd04a0d73dc44a7e9f8ab8c990ec0f754efb23f9705898cab5e0da6e3792"} Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.899662 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csk6m\" (UniqueName: \"kubernetes.io/projected/b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83-kube-api-access-csk6m\") pod \"certified-operators-gflq5\" (UID: 
\"b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83\") " pod="openshift-marketplace/certified-operators-gflq5" Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.923238 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"bb8c99775335b2d8d6e96f493323bb4049246394ab23881edc2b956937f846c5"} Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.923285 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"dbf216946224ec4c8f9b1dea7d6e575100cb9cca6a8a798a80d0c9dd3aca9046"} Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.987619 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.988131 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gflq5" Nov 27 16:40:37 crc kubenswrapper[4954]: E1127 16:40:37.988263 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:38.488234076 +0000 UTC m=+150.505674376 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.988335 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a256eb1-104a-4da8-b3e7-90eb5c475460-catalog-content\") pod \"community-operators-5sxql\" (UID: \"6a256eb1-104a-4da8-b3e7-90eb5c475460\") " pod="openshift-marketplace/community-operators-5sxql" Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.988453 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.988480 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a256eb1-104a-4da8-b3e7-90eb5c475460-utilities\") pod \"community-operators-5sxql\" (UID: \"6a256eb1-104a-4da8-b3e7-90eb5c475460\") " pod="openshift-marketplace/community-operators-5sxql" Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.988527 4954 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dwsbm\" (UniqueName: \"kubernetes.io/projected/6a256eb1-104a-4da8-b3e7-90eb5c475460-kube-api-access-dwsbm\") pod \"community-operators-5sxql\" (UID: \"6a256eb1-104a-4da8-b3e7-90eb5c475460\") " pod="openshift-marketplace/community-operators-5sxql" Nov 27 16:40:37 crc kubenswrapper[4954]: E1127 16:40:37.988864 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:38.488847201 +0000 UTC m=+150.506287501 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.989269 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a256eb1-104a-4da8-b3e7-90eb5c475460-utilities\") pod \"community-operators-5sxql\" (UID: \"6a256eb1-104a-4da8-b3e7-90eb5c475460\") " pod="openshift-marketplace/community-operators-5sxql" Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.998255 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"a3576983abf8818501da49efd85f8c56a2aba52ad0ee0b98f97c9c22c29deffd"} Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.998299 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"c45a33ee56b3795b2195a0e488cc2eddff0871943c7c46fb311a118c8507c098"} Nov 27 16:40:37 crc kubenswrapper[4954]: I1127 16:40:37.999306 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a256eb1-104a-4da8-b3e7-90eb5c475460-catalog-content\") pod \"community-operators-5sxql\" (UID: \"6a256eb1-104a-4da8-b3e7-90eb5c475460\") " pod="openshift-marketplace/community-operators-5sxql" Nov 27 16:40:38 crc kubenswrapper[4954]: I1127 16:40:38.011734 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wdwtv"] Nov 27 16:40:38 crc kubenswrapper[4954]: I1127 16:40:38.087479 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwsbm\" (UniqueName: \"kubernetes.io/projected/6a256eb1-104a-4da8-b3e7-90eb5c475460-kube-api-access-dwsbm\") pod \"community-operators-5sxql\" (UID: \"6a256eb1-104a-4da8-b3e7-90eb5c475460\") " pod="openshift-marketplace/community-operators-5sxql" Nov 27 16:40:38 crc kubenswrapper[4954]: I1127 16:40:38.091813 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") 
" Nov 27 16:40:38 crc kubenswrapper[4954]: E1127 16:40:38.098433 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:38.598395916 +0000 UTC m=+150.615836216 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:38 crc kubenswrapper[4954]: W1127 16:40:38.102881 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51999cf2_62d7_4ee2_ae9f_b1ac606facb5.slice/crio-dbee31fffe8ec5d1b5663b123013bdaea6f0076a7e4ff708c3b8de0da6f49674 WatchSource:0}: Error finding container dbee31fffe8ec5d1b5663b123013bdaea6f0076a7e4ff708c3b8de0da6f49674: Status 404 returned error can't find the container with id dbee31fffe8ec5d1b5663b123013bdaea6f0076a7e4ff708c3b8de0da6f49674 Nov 27 16:40:38 crc kubenswrapper[4954]: I1127 16:40:38.119500 4954 patch_prober.go:28] interesting pod/router-default-5444994796-2jvzc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 27 16:40:38 crc kubenswrapper[4954]: [-]has-synced failed: reason withheld Nov 27 16:40:38 crc kubenswrapper[4954]: [+]process-running ok Nov 27 16:40:38 crc kubenswrapper[4954]: healthz check failed Nov 27 16:40:38 crc kubenswrapper[4954]: I1127 16:40:38.119556 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2jvzc" podUID="80debf9d-f71d-491f-b914-82597c9d3162" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 27 16:40:38 crc kubenswrapper[4954]: I1127 16:40:38.136954 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5sxql" Nov 27 16:40:38 crc kubenswrapper[4954]: I1127 16:40:38.193407 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:38 crc kubenswrapper[4954]: E1127 16:40:38.207735 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:38.696562657 +0000 UTC m=+150.714002957 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:40:38 crc kubenswrapper[4954]: I1127 16:40:38.253453 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 27 16:40:38 crc kubenswrapper[4954]: I1127 16:40:38.266708 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 27 16:40:38 crc kubenswrapper[4954]: I1127 16:40:38.270136 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 27 16:40:38 crc kubenswrapper[4954]: I1127 16:40:38.272896 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Nov 27 16:40:38 crc kubenswrapper[4954]: I1127 16:40:38.273166 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Nov 27 16:40:38 crc kubenswrapper[4954]: I1127 16:40:38.295963 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:40:38 crc kubenswrapper[4954]: E1127 16:40:38.296331 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:38.796314558 +0000 UTC m=+150.813754858 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 27 16:40:38 crc kubenswrapper[4954]: I1127 16:40:38.397990 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0ae2cf23-cdc6-4db3-85b3-1854eff90557-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0ae2cf23-cdc6-4db3-85b3-1854eff90557\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Nov 27 16:40:38 crc kubenswrapper[4954]: I1127 16:40:38.398309 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm"
Nov 27 16:40:38 crc kubenswrapper[4954]: I1127 16:40:38.398386 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0ae2cf23-cdc6-4db3-85b3-1854eff90557-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0ae2cf23-cdc6-4db3-85b3-1854eff90557\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Nov 27 16:40:38 crc kubenswrapper[4954]: E1127 16:40:38.398764 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:38.898751515 +0000 UTC m=+150.916191815 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 27 16:40:38 crc kubenswrapper[4954]: I1127 16:40:38.416171 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4jtcn"]
Nov 27 16:40:38 crc kubenswrapper[4954]: I1127 16:40:38.499047 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 27 16:40:38 crc kubenswrapper[4954]: I1127 16:40:38.499367 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0ae2cf23-cdc6-4db3-85b3-1854eff90557-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0ae2cf23-cdc6-4db3-85b3-1854eff90557\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Nov 27 16:40:38 crc kubenswrapper[4954]: I1127 16:40:38.499417 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0ae2cf23-cdc6-4db3-85b3-1854eff90557-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0ae2cf23-cdc6-4db3-85b3-1854eff90557\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Nov 27 16:40:38 crc kubenswrapper[4954]: E1127 16:40:38.504940 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:39.004890896 +0000 UTC m=+151.022331196 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 27 16:40:38 crc kubenswrapper[4954]: I1127 16:40:38.505003 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0ae2cf23-cdc6-4db3-85b3-1854eff90557-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0ae2cf23-cdc6-4db3-85b3-1854eff90557\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Nov 27 16:40:38 crc kubenswrapper[4954]: I1127 16:40:38.560637 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0ae2cf23-cdc6-4db3-85b3-1854eff90557-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0ae2cf23-cdc6-4db3-85b3-1854eff90557\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Nov 27 16:40:38 crc kubenswrapper[4954]: I1127 16:40:38.600429 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm"
Nov 27 16:40:38 crc kubenswrapper[4954]: E1127 16:40:38.600820 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:39.10080292 +0000 UTC m=+151.118243220 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 27 16:40:38 crc kubenswrapper[4954]: I1127 16:40:38.613833 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gflq5"]
Nov 27 16:40:38 crc kubenswrapper[4954]: I1127 16:40:38.645952 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Nov 27 16:40:38 crc kubenswrapper[4954]: I1127 16:40:38.704350 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 27 16:40:38 crc kubenswrapper[4954]: E1127 16:40:38.704712 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:39.204695524 +0000 UTC m=+151.222135824 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 27 16:40:38 crc kubenswrapper[4954]: I1127 16:40:38.808801 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm"
Nov 27 16:40:38 crc kubenswrapper[4954]: E1127 16:40:38.809185 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:39.309167763 +0000 UTC m=+151.326608063 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 27 16:40:38 crc kubenswrapper[4954]: I1127 16:40:38.838309 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5sxql"]
Nov 27 16:40:38 crc kubenswrapper[4954]: W1127 16:40:38.886149 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a256eb1_104a_4da8_b3e7_90eb5c475460.slice/crio-0d76ba801433c15ab8584bed7d6dc3158cbd0c9f92df7823ad2c22931d85d7b7 WatchSource:0}: Error finding container 0d76ba801433c15ab8584bed7d6dc3158cbd0c9f92df7823ad2c22931d85d7b7: Status 404 returned error can't find the container with id 0d76ba801433c15ab8584bed7d6dc3158cbd0c9f92df7823ad2c22931d85d7b7
Nov 27 16:40:38 crc kubenswrapper[4954]: I1127 16:40:38.918093 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 27 16:40:38 crc kubenswrapper[4954]: E1127 16:40:38.918459 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:39.418420871 +0000 UTC m=+151.435861181 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 27 16:40:38 crc kubenswrapper[4954]: I1127 16:40:38.918880 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm"
Nov 27 16:40:38 crc kubenswrapper[4954]: E1127 16:40:38.919511 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:39.419496058 +0000 UTC m=+151.436936358 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.020735 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 27 16:40:39 crc kubenswrapper[4954]: E1127 16:40:39.020966 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:39.52093733 +0000 UTC m=+151.538377620 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.021116 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm"
Nov 27 16:40:39 crc kubenswrapper[4954]: E1127 16:40:39.021640 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:39.521627748 +0000 UTC m=+151.539068048 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.072952 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4jtcn" event={"ID":"09166f72-95b5-44d5-b265-705e11740e0c","Type":"ContainerStarted","Data":"47af7403f3bade6dbbca0af27c86b8825ad9fa398e78520456525649cf0bb62a"}
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.073013 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4jtcn" event={"ID":"09166f72-95b5-44d5-b265-705e11740e0c","Type":"ContainerStarted","Data":"cc8964ee1e94a28cee87ab7c2e8ebf919281a1a716021a01fa7490231323f5e7"}
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.086796 4954 patch_prober.go:28] interesting pod/router-default-5444994796-2jvzc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Nov 27 16:40:39 crc kubenswrapper[4954]: [-]has-synced failed: reason withheld
Nov 27 16:40:39 crc kubenswrapper[4954]: [+]process-running ok
Nov 27 16:40:39 crc kubenswrapper[4954]: healthz check failed
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.086878 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2jvzc" podUID="80debf9d-f71d-491f-b914-82597c9d3162" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.102015 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xl5dh" event={"ID":"9c31e969-aba0-4496-8891-283b9f639973","Type":"ContainerStarted","Data":"a6a5da5d2441761d179b5237b3f1e2c70aca6b43403fb2603090d2006123c168"}
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.115987 4954 generic.go:334] "Generic (PLEG): container finished" podID="51999cf2-62d7-4ee2-ae9f-b1ac606facb5" containerID="ee04ad1cf7c499ce4b9950f3f4e94c1f5af0befffdc17b93a406b37528a8edc1" exitCode=0
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.116055 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wdwtv" event={"ID":"51999cf2-62d7-4ee2-ae9f-b1ac606facb5","Type":"ContainerDied","Data":"ee04ad1cf7c499ce4b9950f3f4e94c1f5af0befffdc17b93a406b37528a8edc1"}
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.116087 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wdwtv" event={"ID":"51999cf2-62d7-4ee2-ae9f-b1ac606facb5","Type":"ContainerStarted","Data":"dbee31fffe8ec5d1b5663b123013bdaea6f0076a7e4ff708c3b8de0da6f49674"}
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.120984 4954 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.122163 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.122199 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gflq5" event={"ID":"b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83","Type":"ContainerStarted","Data":"88b237adbf82dd3b012178f1bb97a39c1f6fd596710626d04af2560c973d76da"}
Nov 27 16:40:39 crc kubenswrapper[4954]: E1127 16:40:39.122425 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:39.622409875 +0000 UTC m=+151.639850175 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.126307 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.146188 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5sxql" event={"ID":"6a256eb1-104a-4da8-b3e7-90eb5c475460","Type":"ContainerStarted","Data":"0d76ba801433c15ab8584bed7d6dc3158cbd0c9f92df7823ad2c22931d85d7b7"}
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.226653 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm"
Nov 27 16:40:39 crc kubenswrapper[4954]: E1127 16:40:39.227011 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:39.726995565 +0000 UTC m=+151.744435865 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.328026 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 27 16:40:39 crc kubenswrapper[4954]: E1127 16:40:39.328320 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:39.828287025 +0000 UTC m=+151.845727325 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.328370 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm"
Nov 27 16:40:39 crc kubenswrapper[4954]: E1127 16:40:39.328968 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:39.828950591 +0000 UTC m=+151.846390881 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.367231 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6vln6"]
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.368766 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6vln6"
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.372285 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.390488 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6vln6"]
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.438112 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.438393 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65plc\" (UniqueName: \"kubernetes.io/projected/8ff1ec67-d5a8-4612-874b-4324db52c148-kube-api-access-65plc\") pod \"redhat-marketplace-6vln6\" (UID: \"8ff1ec67-d5a8-4612-874b-4324db52c148\") " pod="openshift-marketplace/redhat-marketplace-6vln6"
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.438441 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ff1ec67-d5a8-4612-874b-4324db52c148-utilities\") pod \"redhat-marketplace-6vln6\" (UID: \"8ff1ec67-d5a8-4612-874b-4324db52c148\") " pod="openshift-marketplace/redhat-marketplace-6vln6"
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.438541 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ff1ec67-d5a8-4612-874b-4324db52c148-catalog-content\") pod \"redhat-marketplace-6vln6\" (UID: \"8ff1ec67-d5a8-4612-874b-4324db52c148\") " pod="openshift-marketplace/redhat-marketplace-6vln6"
Nov 27 16:40:39 crc kubenswrapper[4954]: E1127 16:40:39.438699 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:39.938668752 +0000 UTC m=+151.956109052 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.525417 4954 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.539971 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ff1ec67-d5a8-4612-874b-4324db52c148-catalog-content\") pod \"redhat-marketplace-6vln6\" (UID: \"8ff1ec67-d5a8-4612-874b-4324db52c148\") " pod="openshift-marketplace/redhat-marketplace-6vln6"
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.540017 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm"
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.540054 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65plc\" (UniqueName: \"kubernetes.io/projected/8ff1ec67-d5a8-4612-874b-4324db52c148-kube-api-access-65plc\") pod \"redhat-marketplace-6vln6\" (UID: \"8ff1ec67-d5a8-4612-874b-4324db52c148\") " pod="openshift-marketplace/redhat-marketplace-6vln6"
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.540087 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ff1ec67-d5a8-4612-874b-4324db52c148-utilities\") pod \"redhat-marketplace-6vln6\" (UID: \"8ff1ec67-d5a8-4612-874b-4324db52c148\") " pod="openshift-marketplace/redhat-marketplace-6vln6"
Nov 27 16:40:39 crc kubenswrapper[4954]: E1127 16:40:39.540807 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:40.040788381 +0000 UTC m=+152.058228681 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.540865 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ff1ec67-d5a8-4612-874b-4324db52c148-catalog-content\") pod \"redhat-marketplace-6vln6\" (UID: \"8ff1ec67-d5a8-4612-874b-4324db52c148\") " pod="openshift-marketplace/redhat-marketplace-6vln6"
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.540894 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ff1ec67-d5a8-4612-874b-4324db52c148-utilities\") pod \"redhat-marketplace-6vln6\" (UID: \"8ff1ec67-d5a8-4612-874b-4324db52c148\") " pod="openshift-marketplace/redhat-marketplace-6vln6"
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.580516 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65plc\" (UniqueName: \"kubernetes.io/projected/8ff1ec67-d5a8-4612-874b-4324db52c148-kube-api-access-65plc\") pod \"redhat-marketplace-6vln6\" (UID: \"8ff1ec67-d5a8-4612-874b-4324db52c148\") " pod="openshift-marketplace/redhat-marketplace-6vln6"
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.641438 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 27 16:40:39 crc kubenswrapper[4954]: E1127 16:40:39.641736 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:40.141693051 +0000 UTC m=+152.159133351 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.642121 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm"
Nov 27 16:40:39 crc kubenswrapper[4954]: E1127 16:40:39.642647 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:40.142634144 +0000 UTC m=+152.160074614 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.683721 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6vln6"
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.743391 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 27 16:40:39 crc kubenswrapper[4954]: E1127 16:40:39.744054 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:40.244034516 +0000 UTC m=+152.261474816 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.764729 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kf5zg"]
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.766288 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kf5zg"
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.787406 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kf5zg"]
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.845033 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24418386-5057-476b-8a29-ad6cf52869f2-utilities\") pod \"redhat-marketplace-kf5zg\" (UID: \"24418386-5057-476b-8a29-ad6cf52869f2\") " pod="openshift-marketplace/redhat-marketplace-kf5zg"
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.845107 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8swts\" (UniqueName: \"kubernetes.io/projected/24418386-5057-476b-8a29-ad6cf52869f2-kube-api-access-8swts\") pod \"redhat-marketplace-kf5zg\" (UID: \"24418386-5057-476b-8a29-ad6cf52869f2\") " pod="openshift-marketplace/redhat-marketplace-kf5zg"
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.845133 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24418386-5057-476b-8a29-ad6cf52869f2-catalog-content\") pod \"redhat-marketplace-kf5zg\" (UID: \"24418386-5057-476b-8a29-ad6cf52869f2\") " pod="openshift-marketplace/redhat-marketplace-kf5zg"
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.845164 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm"
Nov 27 16:40:39 crc kubenswrapper[4954]: E1127 16:40:39.845462 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:40.345448837 +0000 UTC m=+152.362889137 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.946878 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 27 16:40:39 crc kubenswrapper[4954]: E1127 16:40:39.947174 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:40:40.447139307 +0000 UTC m=+152.464579607 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.947419 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24418386-5057-476b-8a29-ad6cf52869f2-catalog-content\") pod \"redhat-marketplace-kf5zg\" (UID: \"24418386-5057-476b-8a29-ad6cf52869f2\") " pod="openshift-marketplace/redhat-marketplace-kf5zg"
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.947459 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm"
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.947507 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24418386-5057-476b-8a29-ad6cf52869f2-utilities\") pod \"redhat-marketplace-kf5zg\" (UID: \"24418386-5057-476b-8a29-ad6cf52869f2\") " pod="openshift-marketplace/redhat-marketplace-kf5zg"
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.947560 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8swts\" (UniqueName: \"kubernetes.io/projected/24418386-5057-476b-8a29-ad6cf52869f2-kube-api-access-8swts\") pod \"redhat-marketplace-kf5zg\" (UID: \"24418386-5057-476b-8a29-ad6cf52869f2\") " pod="openshift-marketplace/redhat-marketplace-kf5zg"
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.948427 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24418386-5057-476b-8a29-ad6cf52869f2-catalog-content\") pod \"redhat-marketplace-kf5zg\" (UID: \"24418386-5057-476b-8a29-ad6cf52869f2\") " pod="openshift-marketplace/redhat-marketplace-kf5zg"
Nov 27 16:40:39 crc kubenswrapper[4954]: E1127 16:40:39.948729 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:40:40.448716136 +0000 UTC m=+152.466156436 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n2fzm" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.949103 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24418386-5057-476b-8a29-ad6cf52869f2-utilities\") pod \"redhat-marketplace-kf5zg\" (UID: \"24418386-5057-476b-8a29-ad6cf52869f2\") " pod="openshift-marketplace/redhat-marketplace-kf5zg"
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.980289 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8swts\" (UniqueName: \"kubernetes.io/projected/24418386-5057-476b-8a29-ad6cf52869f2-kube-api-access-8swts\") pod \"redhat-marketplace-kf5zg\" (UID: \"24418386-5057-476b-8a29-ad6cf52869f2\") " pod="openshift-marketplace/redhat-marketplace-kf5zg"
Nov 27 16:40:39 crc kubenswrapper[4954]: I1127 16:40:39.991533 4954 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-11-27T16:40:39.525453027Z","Handler":null,"Name":""}
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.002615 4954 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.002648 4954 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.048181 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.084094 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6vln6"]
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.088211 4954 patch_prober.go:28] interesting pod/router-default-5444994796-2jvzc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Nov 27 16:40:40 crc kubenswrapper[4954]: [-]has-synced failed: reason withheld
Nov 27 16:40:40 crc kubenswrapper[4954]: [+]process-running ok
Nov 27 16:40:40 crc kubenswrapper[4954]: healthz check failed
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.088291 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2jvzc" podUID="80debf9d-f71d-491f-b914-82597c9d3162" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.089100 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kf5zg"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.108648 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.119227 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-9gfl4"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.149276 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.172631 4954 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.172701 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.184371 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"0ae2cf23-cdc6-4db3-85b3-1854eff90557","Type":"ContainerStarted","Data":"2cde6330989f9bdd614aeba7dc09ce11cfd9320ab52984a7e31f520876d45425"}
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.184432 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"0ae2cf23-cdc6-4db3-85b3-1854eff90557","Type":"ContainerStarted","Data":"c022ba63e6a1d2b94a2ce4f78dd6fc6cbc0f387710043bc39461144acbd4f050"}
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.198062 4954 generic.go:334] "Generic (PLEG): container finished" podID="09166f72-95b5-44d5-b265-705e11740e0c" containerID="47af7403f3bade6dbbca0af27c86b8825ad9fa398e78520456525649cf0bb62a" exitCode=0
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.198134 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4jtcn" event={"ID":"09166f72-95b5-44d5-b265-705e11740e0c","Type":"ContainerDied","Data":"47af7403f3bade6dbbca0af27c86b8825ad9fa398e78520456525649cf0bb62a"}
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.202131 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.202120468 podStartE2EDuration="2.202120468s" podCreationTimestamp="2025-11-27 16:40:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:40:40.199094612 +0000 UTC m=+152.216534922" watchObservedRunningTime="2025-11-27 16:40:40.202120468 +0000 UTC m=+152.219560768"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.217150 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6vln6" event={"ID":"8ff1ec67-d5a8-4612-874b-4324db52c148","Type":"ContainerStarted","Data":"cb63ff36edceef9b443a5b14a212aabd343a7ced968ed05dbb143f3052b0e326"}
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.244953 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-htccg"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.245439 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-htccg"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.245454 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404350-52bhz" event={"ID":"d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa","Type":"ContainerDied","Data":"9d834dbdd90a2ed8601aa0cf2877e09ac939740a736753205e09592479fa4681"}
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.245230 4954 generic.go:334] "Generic (PLEG): container finished" podID="d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa" containerID="9d834dbdd90a2ed8601aa0cf2877e09ac939740a736753205e09592479fa4681" exitCode=0
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.266387 4954 patch_prober.go:28] interesting pod/apiserver-76f77b778f-htccg container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Nov 27 16:40:40 crc kubenswrapper[4954]: [+]log ok
Nov 27 16:40:40 crc kubenswrapper[4954]: [+]etcd ok
Nov 27 16:40:40 crc kubenswrapper[4954]: [+]poststarthook/start-apiserver-admission-initializer ok
Nov 27 16:40:40 crc kubenswrapper[4954]: [+]poststarthook/generic-apiserver-start-informers ok
Nov 27 16:40:40 crc kubenswrapper[4954]: [+]poststarthook/max-in-flight-filter ok
Nov 27 16:40:40 crc kubenswrapper[4954]: [+]poststarthook/storage-object-count-tracker-hook ok
Nov 27 16:40:40 crc kubenswrapper[4954]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Nov 27 16:40:40 crc kubenswrapper[4954]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Nov 27 16:40:40 crc kubenswrapper[4954]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld
Nov 27 16:40:40 crc kubenswrapper[4954]: [+]poststarthook/project.openshift.io-projectcache ok
Nov 27 16:40:40 crc kubenswrapper[4954]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Nov 27 16:40:40 crc kubenswrapper[4954]: [+]poststarthook/openshift.io-startinformers ok
Nov 27 16:40:40 crc kubenswrapper[4954]: [+]poststarthook/openshift.io-restmapperupdater ok
Nov 27 16:40:40 crc kubenswrapper[4954]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Nov 27 16:40:40 crc kubenswrapper[4954]: livez check failed
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.266479 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-htccg" podUID="3e7aebe6-3e4f-498f-a696-5e23f9fe313d" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.274344 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xl5dh" event={"ID":"9c31e969-aba0-4496-8891-283b9f639973","Type":"ContainerStarted","Data":"6af7d6a9cb9802e0db6f295bf8e65165de41c471489d9164390f91b404c6d689"}
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.280519 4954 generic.go:334] "Generic (PLEG): container finished" podID="b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83" containerID="a01aafaf342f2617b7961f96358df086bc2b40b7ce7c6416a6f931fc4c5d49ec" exitCode=0
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.280671 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gflq5" event={"ID":"b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83","Type":"ContainerDied","Data":"a01aafaf342f2617b7961f96358df086bc2b40b7ce7c6416a6f931fc4c5d49ec"}
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.287728 4954 generic.go:334] "Generic (PLEG): container finished" podID="6a256eb1-104a-4da8-b3e7-90eb5c475460" containerID="ea38f8bb04ed18687b0862aaa0a31496e11dc6f593feec006fd54e8b03ae85ef" exitCode=0
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.287774 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5sxql" event={"ID":"6a256eb1-104a-4da8-b3e7-90eb5c475460","Type":"ContainerDied","Data":"ea38f8bb04ed18687b0862aaa0a31496e11dc6f593feec006fd54e8b03ae85ef"}
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.294945 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n2fzm\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.305249 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-xl5dh" podStartSLOduration=12.305229322 podStartE2EDuration="12.305229322s" podCreationTimestamp="2025-11-27 16:40:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:40:40.302103464 +0000 UTC m=+152.319543764" watchObservedRunningTime="2025-11-27 16:40:40.305229322 +0000 UTC m=+152.322669622"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.362058 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.396281 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vhmbf"]
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.397838 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vhmbf"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.398261 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vhmbf"]
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.401661 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.434804 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.435486 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.438405 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.438862 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.450613 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.467308 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kf5zg"]
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.560296 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7272201d-3aa9-48ae-9627-628c53dcdf3a-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"7272201d-3aa9-48ae-9627-628c53dcdf3a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.560785 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8c116b3-5000-4043-a04f-ee79ff08a37d-utilities\") pod \"redhat-operators-vhmbf\" (UID: \"c8c116b3-5000-4043-a04f-ee79ff08a37d\") " pod="openshift-marketplace/redhat-operators-vhmbf"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.560874 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f5sl\" (UniqueName: \"kubernetes.io/projected/c8c116b3-5000-4043-a04f-ee79ff08a37d-kube-api-access-7f5sl\") pod \"redhat-operators-vhmbf\" (UID: \"c8c116b3-5000-4043-a04f-ee79ff08a37d\") " pod="openshift-marketplace/redhat-operators-vhmbf"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.560898 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8c116b3-5000-4043-a04f-ee79ff08a37d-catalog-content\") pod \"redhat-operators-vhmbf\" (UID: \"c8c116b3-5000-4043-a04f-ee79ff08a37d\") " pod="openshift-marketplace/redhat-operators-vhmbf"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.560941 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7272201d-3aa9-48ae-9627-628c53dcdf3a-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"7272201d-3aa9-48ae-9627-628c53dcdf3a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.586035 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lsxk"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.586105 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lsxk"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.605692 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lsxk"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.619758 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.662239 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7272201d-3aa9-48ae-9627-628c53dcdf3a-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"7272201d-3aa9-48ae-9627-628c53dcdf3a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.662311 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7272201d-3aa9-48ae-9627-628c53dcdf3a-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"7272201d-3aa9-48ae-9627-628c53dcdf3a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.662328 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8c116b3-5000-4043-a04f-ee79ff08a37d-utilities\") pod \"redhat-operators-vhmbf\" (UID: \"c8c116b3-5000-4043-a04f-ee79ff08a37d\") " pod="openshift-marketplace/redhat-operators-vhmbf"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.662442 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f5sl\" (UniqueName: \"kubernetes.io/projected/c8c116b3-5000-4043-a04f-ee79ff08a37d-kube-api-access-7f5sl\") pod \"redhat-operators-vhmbf\" (UID: \"c8c116b3-5000-4043-a04f-ee79ff08a37d\") " pod="openshift-marketplace/redhat-operators-vhmbf"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.662464 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8c116b3-5000-4043-a04f-ee79ff08a37d-catalog-content\") pod \"redhat-operators-vhmbf\" (UID: \"c8c116b3-5000-4043-a04f-ee79ff08a37d\") " pod="openshift-marketplace/redhat-operators-vhmbf"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.663387 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8c116b3-5000-4043-a04f-ee79ff08a37d-catalog-content\") pod \"redhat-operators-vhmbf\" (UID: \"c8c116b3-5000-4043-a04f-ee79ff08a37d\") " pod="openshift-marketplace/redhat-operators-vhmbf"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.663724 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7272201d-3aa9-48ae-9627-628c53dcdf3a-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"7272201d-3aa9-48ae-9627-628c53dcdf3a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.664518 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8c116b3-5000-4043-a04f-ee79ff08a37d-utilities\") pod \"redhat-operators-vhmbf\" (UID: \"c8c116b3-5000-4043-a04f-ee79ff08a37d\") " pod="openshift-marketplace/redhat-operators-vhmbf"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.698719 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f5sl\" (UniqueName: \"kubernetes.io/projected/c8c116b3-5000-4043-a04f-ee79ff08a37d-kube-api-access-7f5sl\") pod \"redhat-operators-vhmbf\" (UID: \"c8c116b3-5000-4043-a04f-ee79ff08a37d\") " pod="openshift-marketplace/redhat-operators-vhmbf"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.699388 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7272201d-3aa9-48ae-9627-628c53dcdf3a-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"7272201d-3aa9-48ae-9627-628c53dcdf3a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.705781 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.762848 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-n2fzm"]
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.769242 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kxdd6"]
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.773051 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kxdd6"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.798186 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vhmbf"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.801996 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kxdd6"]
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.808619 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.867438 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2e52841-3471-4d68-af5b-4c26ac223800-utilities\") pod \"redhat-operators-kxdd6\" (UID: \"c2e52841-3471-4d68-af5b-4c26ac223800\") " pod="openshift-marketplace/redhat-operators-kxdd6"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.867497 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b9p5\" (UniqueName: \"kubernetes.io/projected/c2e52841-3471-4d68-af5b-4c26ac223800-kube-api-access-4b9p5\") pod \"redhat-operators-kxdd6\" (UID: \"c2e52841-3471-4d68-af5b-4c26ac223800\") " pod="openshift-marketplace/redhat-operators-kxdd6"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.867550 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2e52841-3471-4d68-af5b-4c26ac223800-catalog-content\") pod \"redhat-operators-kxdd6\" (UID: \"c2e52841-3471-4d68-af5b-4c26ac223800\") " pod="openshift-marketplace/redhat-operators-kxdd6"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.958525 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-s8cm2"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.958559 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-s8cm2"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.960610 4954 patch_prober.go:28] interesting pod/console-f9d7485db-s8cm2 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body=
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.960682 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-s8cm2" podUID="7a3c2a78-4ced-43d5-a3b7-25637f36d2fc" containerName="console" probeResult="failure" output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.969284 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2e52841-3471-4d68-af5b-4c26ac223800-utilities\") pod \"redhat-operators-kxdd6\" (UID: \"c2e52841-3471-4d68-af5b-4c26ac223800\") " pod="openshift-marketplace/redhat-operators-kxdd6"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.969329 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b9p5\" (UniqueName: \"kubernetes.io/projected/c2e52841-3471-4d68-af5b-4c26ac223800-kube-api-access-4b9p5\") pod \"redhat-operators-kxdd6\" (UID: \"c2e52841-3471-4d68-af5b-4c26ac223800\") " pod="openshift-marketplace/redhat-operators-kxdd6"
Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.969384 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2e52841-3471-4d68-af5b-4c26ac223800-catalog-content\") pod \"redhat-operators-kxdd6\" (UID: \"c2e52841-3471-4d68-af5b-4c26ac223800\") " pod="openshift-marketplace/redhat-operators-kxdd6"
Nov 27 16:40:40 crc
kubenswrapper[4954]: I1127 16:40:40.969900 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2e52841-3471-4d68-af5b-4c26ac223800-catalog-content\") pod \"redhat-operators-kxdd6\" (UID: \"c2e52841-3471-4d68-af5b-4c26ac223800\") " pod="openshift-marketplace/redhat-operators-kxdd6" Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.970315 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2e52841-3471-4d68-af5b-4c26ac223800-utilities\") pod \"redhat-operators-kxdd6\" (UID: \"c2e52841-3471-4d68-af5b-4c26ac223800\") " pod="openshift-marketplace/redhat-operators-kxdd6" Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.976049 4954 patch_prober.go:28] interesting pod/downloads-7954f5f757-m78xr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.976102 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-m78xr" podUID="6606df87-becb-460d-8579-22c5eb23e71a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.976732 4954 patch_prober.go:28] interesting pod/downloads-7954f5f757-m78xr container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.976754 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-m78xr" podUID="6606df87-becb-460d-8579-22c5eb23e71a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" Nov 27 16:40:40 crc kubenswrapper[4954]: I1127 16:40:40.992564 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b9p5\" (UniqueName: \"kubernetes.io/projected/c2e52841-3471-4d68-af5b-4c26ac223800-kube-api-access-4b9p5\") pod \"redhat-operators-kxdd6\" (UID: \"c2e52841-3471-4d68-af5b-4c26ac223800\") " pod="openshift-marketplace/redhat-operators-kxdd6" Nov 27 16:40:41 crc kubenswrapper[4954]: I1127 16:40:41.084658 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-2jvzc" Nov 27 16:40:41 crc kubenswrapper[4954]: I1127 16:40:41.092603 4954 patch_prober.go:28] interesting pod/router-default-5444994796-2jvzc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 27 16:40:41 crc kubenswrapper[4954]: [-]has-synced failed: reason withheld Nov 27 16:40:41 crc kubenswrapper[4954]: [+]process-running ok Nov 27 16:40:41 crc kubenswrapper[4954]: healthz check failed Nov 27 16:40:41 crc kubenswrapper[4954]: I1127 16:40:41.092687 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2jvzc" podUID="80debf9d-f71d-491f-b914-82597c9d3162" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 27 
16:40:41 crc kubenswrapper[4954]: I1127 16:40:41.117979 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kxdd6" Nov 27 16:40:41 crc kubenswrapper[4954]: I1127 16:40:41.128538 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vhmbf"] Nov 27 16:40:41 crc kubenswrapper[4954]: W1127 16:40:41.156271 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8c116b3_5000_4043_a04f_ee79ff08a37d.slice/crio-4574a47d6a648dc95d204965b30bfc53c2729857947c9677b4674d7c1cea0dd6 WatchSource:0}: Error finding container 4574a47d6a648dc95d204965b30bfc53c2729857947c9677b4674d7c1cea0dd6: Status 404 returned error can't find the container with id 4574a47d6a648dc95d204965b30bfc53c2729857947c9677b4674d7c1cea0dd6 Nov 27 16:40:41 crc kubenswrapper[4954]: I1127 16:40:41.169954 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 27 16:40:41 crc kubenswrapper[4954]: W1127 16:40:41.200510 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7272201d_3aa9_48ae_9627_628c53dcdf3a.slice/crio-6fe97e5042f5147d37618b073f52b50dea3c50e2f4ba09b15af76a211127be57 WatchSource:0}: Error finding container 6fe97e5042f5147d37618b073f52b50dea3c50e2f4ba09b15af76a211127be57: Status 404 returned error can't find the container with id 6fe97e5042f5147d37618b073f52b50dea3c50e2f4ba09b15af76a211127be57 Nov 27 16:40:41 crc kubenswrapper[4954]: I1127 16:40:41.313032 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vhmbf" event={"ID":"c8c116b3-5000-4043-a04f-ee79ff08a37d","Type":"ContainerStarted","Data":"4574a47d6a648dc95d204965b30bfc53c2729857947c9677b4674d7c1cea0dd6"} Nov 27 16:40:41 crc kubenswrapper[4954]: I1127 16:40:41.323184 4954 generic.go:334] "Generic (PLEG): container finished" podID="8ff1ec67-d5a8-4612-874b-4324db52c148" containerID="66e63bf400d80f73cd4765d997a4dd245444018f9298173e23641361d5b93c10" exitCode=0 Nov 27 16:40:41 crc kubenswrapper[4954]: I1127 16:40:41.323274 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6vln6" event={"ID":"8ff1ec67-d5a8-4612-874b-4324db52c148","Type":"ContainerDied","Data":"66e63bf400d80f73cd4765d997a4dd245444018f9298173e23641361d5b93c10"} Nov 27 16:40:41 crc kubenswrapper[4954]: I1127 16:40:41.327098 4954 generic.go:334] "Generic (PLEG): container finished" podID="24418386-5057-476b-8a29-ad6cf52869f2" containerID="2b5c69b4f5fe1eb8a1dcf78f8ae540b4002049677b635b6fac45c61152e8b06e" exitCode=0 Nov 27 16:40:41 crc kubenswrapper[4954]: I1127 16:40:41.327154 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kf5zg" event={"ID":"24418386-5057-476b-8a29-ad6cf52869f2","Type":"ContainerDied","Data":"2b5c69b4f5fe1eb8a1dcf78f8ae540b4002049677b635b6fac45c61152e8b06e"} Nov 27 16:40:41 crc kubenswrapper[4954]: I1127 16:40:41.327175 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kf5zg" event={"ID":"24418386-5057-476b-8a29-ad6cf52869f2","Type":"ContainerStarted","Data":"f90e1e16487dc44f93f3a3343ebb9c9ed12e71b6e704114b4326009221f878e0"} Nov 27 16:40:41 crc kubenswrapper[4954]: I1127 16:40:41.335627 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"7272201d-3aa9-48ae-9627-628c53dcdf3a","Type":"ContainerStarted","Data":"6fe97e5042f5147d37618b073f52b50dea3c50e2f4ba09b15af76a211127be57"} Nov 27 16:40:41 crc kubenswrapper[4954]: I1127 16:40:41.349448 4954 generic.go:334] "Generic (PLEG): container finished" podID="0ae2cf23-cdc6-4db3-85b3-1854eff90557" containerID="2cde6330989f9bdd614aeba7dc09ce11cfd9320ab52984a7e31f520876d45425" exitCode=0 Nov 27 16:40:41 crc kubenswrapper[4954]: I1127 16:40:41.349552 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"0ae2cf23-cdc6-4db3-85b3-1854eff90557","Type":"ContainerDied","Data":"2cde6330989f9bdd614aeba7dc09ce11cfd9320ab52984a7e31f520876d45425"} Nov 27 16:40:41 crc kubenswrapper[4954]: I1127 16:40:41.365407 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" event={"ID":"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd","Type":"ContainerStarted","Data":"f3fc4a4331f0d1f3a288fcd7e7dfc47a8a3010c1c4235825e890159ac8b6a8c4"} Nov 27 16:40:41 crc kubenswrapper[4954]: I1127 16:40:41.365480 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" event={"ID":"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd","Type":"ContainerStarted","Data":"3f9baf829b8310f171198e0d01ddb40e075fb95c7e6415706bba933e15254d3e"} Nov 27 16:40:41 crc kubenswrapper[4954]: I1127 16:40:41.368493 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:40:41 crc kubenswrapper[4954]: I1127 16:40:41.376715 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lsxk" Nov 27 16:40:41 crc kubenswrapper[4954]: I1127 16:40:41.444093 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" podStartSLOduration=132.444071388 podStartE2EDuration="2m12.444071388s" podCreationTimestamp="2025-11-27 16:38:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:40:41.413947763 +0000 UTC m=+153.431388083" watchObservedRunningTime="2025-11-27 16:40:41.444071388 +0000 UTC m=+153.461511678" Nov 27 16:40:41 crc kubenswrapper[4954]: I1127 16:40:41.473393 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kxdd6"] Nov 27 16:40:41 crc kubenswrapper[4954]: W1127 16:40:41.491274 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2e52841_3471_4d68_af5b_4c26ac223800.slice/crio-f92a1071f76f577a78bd3f389685c304579ac21eef6307a9411d602e3b6a2530 WatchSource:0}: Error finding container f92a1071f76f577a78bd3f389685c304579ac21eef6307a9411d602e3b6a2530: Status 404 returned error can't find the container with id f92a1071f76f577a78bd3f389685c304579ac21eef6307a9411d602e3b6a2530 Nov 27 16:40:41 crc kubenswrapper[4954]: I1127 16:40:41.725706 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404350-52bhz" Nov 27 16:40:41 crc kubenswrapper[4954]: I1127 16:40:41.783072 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa-config-volume\") pod \"d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa\" (UID: \"d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa\") " Nov 27 16:40:41 crc kubenswrapper[4954]: I1127 16:40:41.783236 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hm7jb\" (UniqueName: \"kubernetes.io/projected/d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa-kube-api-access-hm7jb\") pod \"d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa\" (UID: \"d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa\") " Nov 27 16:40:41 crc kubenswrapper[4954]: I1127 16:40:41.783296 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa-secret-volume\") pod \"d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa\" (UID: \"d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa\") " Nov 27 16:40:41 crc kubenswrapper[4954]: I1127 16:40:41.784292 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa-config-volume" (OuterVolumeSpecName: "config-volume") pod "d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa" (UID: "d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:40:41 crc kubenswrapper[4954]: I1127 16:40:41.795033 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa" (UID: "d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:40:41 crc kubenswrapper[4954]: I1127 16:40:41.804451 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa-kube-api-access-hm7jb" (OuterVolumeSpecName: "kube-api-access-hm7jb") pod "d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa" (UID: "d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa"). InnerVolumeSpecName "kube-api-access-hm7jb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:40:41 crc kubenswrapper[4954]: I1127 16:40:41.884871 4954 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 27 16:40:41 crc kubenswrapper[4954]: I1127 16:40:41.884934 4954 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa-config-volume\") on node \"crc\" DevicePath \"\"" Nov 27 16:40:41 crc kubenswrapper[4954]: I1127 16:40:41.884950 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hm7jb\" (UniqueName: \"kubernetes.io/projected/d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa-kube-api-access-hm7jb\") on node \"crc\" DevicePath \"\"" Nov 27 16:40:42 crc kubenswrapper[4954]: I1127 16:40:42.082827 4954 patch_prober.go:28] interesting pod/router-default-5444994796-2jvzc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 27 16:40:42 crc kubenswrapper[4954]: [-]has-synced failed: reason withheld Nov 27 16:40:42 crc kubenswrapper[4954]: [+]process-running ok Nov 27 16:40:42 crc kubenswrapper[4954]: healthz check failed Nov 27 16:40:42 crc kubenswrapper[4954]: I1127 16:40:42.082910 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2jvzc" podUID="80debf9d-f71d-491f-b914-82597c9d3162" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 27 16:40:42 crc kubenswrapper[4954]: I1127 16:40:42.379374 4954 generic.go:334] "Generic (PLEG): container finished" podID="c8c116b3-5000-4043-a04f-ee79ff08a37d" containerID="d13504b516c3d79521e9ee8d3eb56daa31d8070e45c3daf5ae6f60a4190cab4f" exitCode=0 Nov 27 16:40:42 crc kubenswrapper[4954]: I1127 16:40:42.379462 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vhmbf" event={"ID":"c8c116b3-5000-4043-a04f-ee79ff08a37d","Type":"ContainerDied","Data":"d13504b516c3d79521e9ee8d3eb56daa31d8070e45c3daf5ae6f60a4190cab4f"} Nov 27 16:40:42 crc kubenswrapper[4954]: I1127 16:40:42.384248 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7272201d-3aa9-48ae-9627-628c53dcdf3a","Type":"ContainerStarted","Data":"d77b9731c6ee0f29a7cbbdbfc19781ba6f7cd37ccf9a02fcf622d7c3247db43a"} Nov 27 16:40:42 crc kubenswrapper[4954]: I1127 16:40:42.393364 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404350-52bhz" event={"ID":"d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa","Type":"ContainerDied","Data":"2cd9c14dc50069203d92f84d7626f7c9a0ac759f9713903594d5e1224b4d9f0b"} Nov 27 16:40:42 crc kubenswrapper[4954]: I1127 16:40:42.393416 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cd9c14dc50069203d92f84d7626f7c9a0ac759f9713903594d5e1224b4d9f0b" Nov 27 16:40:42 crc kubenswrapper[4954]: I1127 16:40:42.393490 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404350-52bhz" Nov 27 16:40:42 crc kubenswrapper[4954]: I1127 16:40:42.400117 4954 generic.go:334] "Generic (PLEG): container finished" podID="c2e52841-3471-4d68-af5b-4c26ac223800" containerID="6547e759deff8e30fc1e79e0eedb93c09d151c2ebec7c5b5be06cef23d58ee02" exitCode=0 Nov 27 16:40:42 crc kubenswrapper[4954]: I1127 16:40:42.400307 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxdd6" event={"ID":"c2e52841-3471-4d68-af5b-4c26ac223800","Type":"ContainerDied","Data":"6547e759deff8e30fc1e79e0eedb93c09d151c2ebec7c5b5be06cef23d58ee02"} Nov 27 16:40:42 crc kubenswrapper[4954]: I1127 16:40:42.400383 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxdd6" event={"ID":"c2e52841-3471-4d68-af5b-4c26ac223800","Type":"ContainerStarted","Data":"f92a1071f76f577a78bd3f389685c304579ac21eef6307a9411d602e3b6a2530"} Nov 27 16:40:42 crc kubenswrapper[4954]: I1127 16:40:42.489769 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.489748009 podStartE2EDuration="2.489748009s" podCreationTimestamp="2025-11-27 16:40:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:40:42.485235065 +0000 UTC m=+154.502675365" watchObservedRunningTime="2025-11-27 16:40:42.489748009 +0000 UTC m=+154.507188309" Nov 27 16:40:42 crc kubenswrapper[4954]: I1127 16:40:42.936321 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 27 16:40:43 crc kubenswrapper[4954]: I1127 16:40:43.008184 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0ae2cf23-cdc6-4db3-85b3-1854eff90557-kubelet-dir\") pod \"0ae2cf23-cdc6-4db3-85b3-1854eff90557\" (UID: \"0ae2cf23-cdc6-4db3-85b3-1854eff90557\") " Nov 27 16:40:43 crc kubenswrapper[4954]: I1127 16:40:43.008336 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ae2cf23-cdc6-4db3-85b3-1854eff90557-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0ae2cf23-cdc6-4db3-85b3-1854eff90557" (UID: "0ae2cf23-cdc6-4db3-85b3-1854eff90557"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 16:40:43 crc kubenswrapper[4954]: I1127 16:40:43.008682 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0ae2cf23-cdc6-4db3-85b3-1854eff90557-kube-api-access\") pod \"0ae2cf23-cdc6-4db3-85b3-1854eff90557\" (UID: \"0ae2cf23-cdc6-4db3-85b3-1854eff90557\") " Nov 27 16:40:43 crc kubenswrapper[4954]: I1127 16:40:43.009230 4954 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0ae2cf23-cdc6-4db3-85b3-1854eff90557-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 27 16:40:43 crc kubenswrapper[4954]: I1127 16:40:43.033435 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ae2cf23-cdc6-4db3-85b3-1854eff90557-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0ae2cf23-cdc6-4db3-85b3-1854eff90557" (UID: "0ae2cf23-cdc6-4db3-85b3-1854eff90557"). 
InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:40:43 crc kubenswrapper[4954]: I1127 16:40:43.082692 4954 patch_prober.go:28] interesting pod/router-default-5444994796-2jvzc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 27 16:40:43 crc kubenswrapper[4954]: [-]has-synced failed: reason withheld Nov 27 16:40:43 crc kubenswrapper[4954]: [+]process-running ok Nov 27 16:40:43 crc kubenswrapper[4954]: healthz check failed Nov 27 16:40:43 crc kubenswrapper[4954]: I1127 16:40:43.082763 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2jvzc" podUID="80debf9d-f71d-491f-b914-82597c9d3162" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 27 16:40:43 crc kubenswrapper[4954]: I1127 16:40:43.110540 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0ae2cf23-cdc6-4db3-85b3-1854eff90557-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 27 16:40:43 crc kubenswrapper[4954]: I1127 16:40:43.448633 4954 generic.go:334] "Generic (PLEG): container finished" podID="7272201d-3aa9-48ae-9627-628c53dcdf3a" containerID="d77b9731c6ee0f29a7cbbdbfc19781ba6f7cd37ccf9a02fcf622d7c3247db43a" exitCode=0 Nov 27 16:40:43 crc kubenswrapper[4954]: I1127 16:40:43.448901 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7272201d-3aa9-48ae-9627-628c53dcdf3a","Type":"ContainerDied","Data":"d77b9731c6ee0f29a7cbbdbfc19781ba6f7cd37ccf9a02fcf622d7c3247db43a"} Nov 27 16:40:43 crc kubenswrapper[4954]: I1127 16:40:43.473527 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"0ae2cf23-cdc6-4db3-85b3-1854eff90557","Type":"ContainerDied","Data":"c022ba63e6a1d2b94a2ce4f78dd6fc6cbc0f387710043bc39461144acbd4f050"} Nov 27 16:40:43 crc kubenswrapper[4954]: I1127 16:40:43.473636 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c022ba63e6a1d2b94a2ce4f78dd6fc6cbc0f387710043bc39461144acbd4f050" Nov 27 16:40:43 crc kubenswrapper[4954]: I1127 16:40:43.473709 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 27 16:40:44 crc kubenswrapper[4954]: I1127 16:40:44.081183 4954 patch_prober.go:28] interesting pod/router-default-5444994796-2jvzc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 27 16:40:44 crc kubenswrapper[4954]: [-]has-synced failed: reason withheld Nov 27 16:40:44 crc kubenswrapper[4954]: [+]process-running ok Nov 27 16:40:44 crc kubenswrapper[4954]: healthz check failed Nov 27 16:40:44 crc kubenswrapper[4954]: I1127 16:40:44.081642 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2jvzc" podUID="80debf9d-f71d-491f-b914-82597c9d3162" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 27 16:40:44 crc kubenswrapper[4954]: I1127 16:40:44.912483 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 27 16:40:45 crc kubenswrapper[4954]: I1127 16:40:45.048981 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7272201d-3aa9-48ae-9627-628c53dcdf3a-kube-api-access\") pod \"7272201d-3aa9-48ae-9627-628c53dcdf3a\" (UID: \"7272201d-3aa9-48ae-9627-628c53dcdf3a\") " Nov 27 16:40:45 crc kubenswrapper[4954]: I1127 16:40:45.049040 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7272201d-3aa9-48ae-9627-628c53dcdf3a-kubelet-dir\") pod \"7272201d-3aa9-48ae-9627-628c53dcdf3a\" (UID: \"7272201d-3aa9-48ae-9627-628c53dcdf3a\") " Nov 27 16:40:45 crc kubenswrapper[4954]: I1127 16:40:45.049589 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7272201d-3aa9-48ae-9627-628c53dcdf3a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7272201d-3aa9-48ae-9627-628c53dcdf3a" (UID: "7272201d-3aa9-48ae-9627-628c53dcdf3a"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 16:40:45 crc kubenswrapper[4954]: I1127 16:40:45.073526 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7272201d-3aa9-48ae-9627-628c53dcdf3a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7272201d-3aa9-48ae-9627-628c53dcdf3a" (UID: "7272201d-3aa9-48ae-9627-628c53dcdf3a"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:40:45 crc kubenswrapper[4954]: I1127 16:40:45.082628 4954 patch_prober.go:28] interesting pod/router-default-5444994796-2jvzc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 27 16:40:45 crc kubenswrapper[4954]: [-]has-synced failed: reason withheld Nov 27 16:40:45 crc kubenswrapper[4954]: [+]process-running ok Nov 27 16:40:45 crc kubenswrapper[4954]: healthz check failed Nov 27 16:40:45 crc kubenswrapper[4954]: I1127 16:40:45.082685 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2jvzc" podUID="80debf9d-f71d-491f-b914-82597c9d3162" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 27 16:40:45 crc kubenswrapper[4954]: I1127 16:40:45.150337 4954 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7272201d-3aa9-48ae-9627-628c53dcdf3a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 27 16:40:45 crc kubenswrapper[4954]: I1127 16:40:45.150385 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7272201d-3aa9-48ae-9627-628c53dcdf3a-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 27 16:40:45 crc kubenswrapper[4954]: I1127 16:40:45.251614 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-htccg" Nov 27 16:40:45 crc kubenswrapper[4954]: I1127 16:40:45.256248 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-htccg" Nov 27 16:40:45 crc kubenswrapper[4954]: I1127 16:40:45.537548 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 27 16:40:45 crc kubenswrapper[4954]: I1127 16:40:45.537967 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7272201d-3aa9-48ae-9627-628c53dcdf3a","Type":"ContainerDied","Data":"6fe97e5042f5147d37618b073f52b50dea3c50e2f4ba09b15af76a211127be57"} Nov 27 16:40:45 crc kubenswrapper[4954]: I1127 16:40:45.538037 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fe97e5042f5147d37618b073f52b50dea3c50e2f4ba09b15af76a211127be57" Nov 27 16:40:46 crc kubenswrapper[4954]: I1127 16:40:46.081987 4954 patch_prober.go:28] interesting pod/router-default-5444994796-2jvzc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 27 16:40:46 crc kubenswrapper[4954]: [-]has-synced failed: reason withheld Nov 27 16:40:46 crc kubenswrapper[4954]: [+]process-running ok Nov 27 16:40:46 crc kubenswrapper[4954]: healthz check failed Nov 27 16:40:46 crc kubenswrapper[4954]: I1127 16:40:46.082066 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2jvzc" podUID="80debf9d-f71d-491f-b914-82597c9d3162" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 27 16:40:46 crc kubenswrapper[4954]: I1127 16:40:46.826366 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-5clct" Nov 27 16:40:47 crc kubenswrapper[4954]: I1127 16:40:47.081736 4954 patch_prober.go:28] interesting pod/router-default-5444994796-2jvzc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 27 16:40:47 crc kubenswrapper[4954]: [-]has-synced failed: reason withheld Nov 27 16:40:47 crc kubenswrapper[4954]: [+]process-running ok Nov 27 16:40:47 crc kubenswrapper[4954]: healthz check failed Nov 27 16:40:47 crc kubenswrapper[4954]: I1127 16:40:47.081799 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2jvzc" podUID="80debf9d-f71d-491f-b914-82597c9d3162" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 27 16:40:48 crc kubenswrapper[4954]: I1127 16:40:48.080598 4954 patch_prober.go:28] interesting pod/router-default-5444994796-2jvzc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 27 16:40:48 crc kubenswrapper[4954]: [-]has-synced failed: reason withheld Nov 27 16:40:48 crc kubenswrapper[4954]: [+]process-running ok Nov 27 16:40:48 crc kubenswrapper[4954]: healthz check failed Nov 27 16:40:48 crc kubenswrapper[4954]: I1127 16:40:48.080658 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2jvzc" podUID="80debf9d-f71d-491f-b914-82597c9d3162" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 27 16:40:49 crc kubenswrapper[4954]: I1127 16:40:49.087130 4954 patch_prober.go:28] interesting pod/router-default-5444994796-2jvzc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Nov 27 16:40:49 crc kubenswrapper[4954]: [-]has-synced failed: reason withheld Nov 27 16:40:49 crc kubenswrapper[4954]: [+]process-running ok Nov 27 16:40:49 crc kubenswrapper[4954]: healthz check failed Nov 27 16:40:49 crc kubenswrapper[4954]: I1127 16:40:49.091053 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2jvzc" podUID="80debf9d-f71d-491f-b914-82597c9d3162" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 27 16:40:50 crc kubenswrapper[4954]: I1127 16:40:50.082080 4954 patch_prober.go:28] interesting pod/router-default-5444994796-2jvzc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 27 16:40:50 crc kubenswrapper[4954]: [-]has-synced failed: reason withheld Nov 27 16:40:50 crc kubenswrapper[4954]: [+]process-running ok Nov 27 16:40:50 crc kubenswrapper[4954]: healthz check failed Nov 27 16:40:50 crc kubenswrapper[4954]: I1127 16:40:50.082197 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2jvzc" podUID="80debf9d-f71d-491f-b914-82597c9d3162" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 27 16:40:50 crc kubenswrapper[4954]: I1127 16:40:50.958198 4954 patch_prober.go:28] interesting pod/console-f9d7485db-s8cm2 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Nov 27 16:40:50 crc kubenswrapper[4954]: I1127 16:40:50.958252 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-s8cm2" podUID="7a3c2a78-4ced-43d5-a3b7-25637f36d2fc" containerName="console" probeResult="failure" output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" Nov 27 16:40:50 crc kubenswrapper[4954]: I1127 16:40:50.976705 4954 patch_prober.go:28] interesting pod/downloads-7954f5f757-m78xr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Nov 27 16:40:50 crc kubenswrapper[4954]: I1127 16:40:50.976773 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-m78xr" podUID="6606df87-becb-460d-8579-22c5eb23e71a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" Nov 27 16:40:50 crc kubenswrapper[4954]: I1127 16:40:50.976804 4954 patch_prober.go:28] interesting pod/downloads-7954f5f757-m78xr container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Nov 27 16:40:50 crc kubenswrapper[4954]: I1127 16:40:50.976889 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-m78xr" podUID="6606df87-becb-460d-8579-22c5eb23e71a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" Nov 27 16:40:51 crc kubenswrapper[4954]: I1127 16:40:51.084100 4954 
patch_prober.go:28] interesting pod/router-default-5444994796-2jvzc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 27 16:40:51 crc kubenswrapper[4954]: [-]has-synced failed: reason withheld Nov 27 16:40:51 crc kubenswrapper[4954]: [+]process-running ok Nov 27 16:40:51 crc kubenswrapper[4954]: healthz check failed Nov 27 16:40:51 crc kubenswrapper[4954]: I1127 16:40:51.084170 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2jvzc" podUID="80debf9d-f71d-491f-b914-82597c9d3162" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 27 16:40:52 crc kubenswrapper[4954]: I1127 16:40:52.078342 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/af5183f4-5f46-4d64-8ec4-c7b71530cad6-metrics-certs\") pod \"network-metrics-daemon-hgsvh\" (UID: \"af5183f4-5f46-4d64-8ec4-c7b71530cad6\") " pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:40:52 crc kubenswrapper[4954]: I1127 16:40:52.082520 4954 patch_prober.go:28] interesting pod/router-default-5444994796-2jvzc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 27 16:40:52 crc kubenswrapper[4954]: [-]has-synced failed: reason withheld Nov 27 16:40:52 crc kubenswrapper[4954]: [+]process-running ok Nov 27 16:40:52 crc kubenswrapper[4954]: healthz check failed Nov 27 16:40:52 crc kubenswrapper[4954]: I1127 16:40:52.082612 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2jvzc" podUID="80debf9d-f71d-491f-b914-82597c9d3162" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 27 16:40:52 crc kubenswrapper[4954]: I1127 16:40:52.102037 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/af5183f4-5f46-4d64-8ec4-c7b71530cad6-metrics-certs\") pod \"network-metrics-daemon-hgsvh\" (UID: \"af5183f4-5f46-4d64-8ec4-c7b71530cad6\") " pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:40:52 crc kubenswrapper[4954]: I1127 16:40:52.182148 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgsvh" Nov 27 16:40:53 crc kubenswrapper[4954]: I1127 16:40:53.082487 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-2jvzc" Nov 27 16:40:53 crc kubenswrapper[4954]: I1127 16:40:53.084854 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-2jvzc" Nov 27 16:40:53 crc kubenswrapper[4954]: I1127 16:40:53.688074 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:40:53 crc kubenswrapper[4954]: I1127 16:40:53.688546 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 16:41:00 crc kubenswrapper[4954]: I1127 16:41:00.374304 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:41:00 crc kubenswrapper[4954]: I1127 16:41:00.961970 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-s8cm2" Nov 27 16:41:00 crc kubenswrapper[4954]: I1127 16:41:00.965816 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-s8cm2" Nov 27 16:41:00 crc kubenswrapper[4954]: I1127 16:41:00.990378 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-m78xr" Nov 27 16:41:10 crc kubenswrapper[4954]: I1127 16:41:10.732101 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m77vv" Nov 27 16:41:12 crc kubenswrapper[4954]: E1127 16:41:12.028298 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Nov 27 16:41:12 crc kubenswrapper[4954]: E1127 16:41:12.028529 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-csk6m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-gflq5_openshift-marketplace(b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 27 16:41:12 crc kubenswrapper[4954]: E1127 16:41:12.029716 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-gflq5" podUID="b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83" Nov 27 16:41:16 crc kubenswrapper[4954]: I1127 16:41:16.013837 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Nov 27 16:41:16 crc kubenswrapper[4954]: E1127 16:41:16.014757 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ae2cf23-cdc6-4db3-85b3-1854eff90557" containerName="pruner" Nov 27 16:41:16 crc kubenswrapper[4954]: I1127 16:41:16.014840 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ae2cf23-cdc6-4db3-85b3-1854eff90557" containerName="pruner" Nov 27 16:41:16 crc kubenswrapper[4954]: E1127 16:41:16.014904 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa" containerName="collect-profiles" Nov 27 16:41:16 crc kubenswrapper[4954]: I1127 16:41:16.014972 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa" containerName="collect-profiles" Nov 27 16:41:16 crc kubenswrapper[4954]: E1127 16:41:16.015042 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7272201d-3aa9-48ae-9627-628c53dcdf3a" containerName="pruner" Nov 27 16:41:16 crc kubenswrapper[4954]: I1127 16:41:16.015103 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="7272201d-3aa9-48ae-9627-628c53dcdf3a" containerName="pruner" Nov 27 16:41:16 crc kubenswrapper[4954]: I1127 16:41:16.015268 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa" containerName="collect-profiles" Nov 27 16:41:16 crc kubenswrapper[4954]: I1127 
16:41:16.015342 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ae2cf23-cdc6-4db3-85b3-1854eff90557" containerName="pruner" Nov 27 16:41:16 crc kubenswrapper[4954]: I1127 16:41:16.015411 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="7272201d-3aa9-48ae-9627-628c53dcdf3a" containerName="pruner" Nov 27 16:41:16 crc kubenswrapper[4954]: I1127 16:41:16.015907 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 27 16:41:16 crc kubenswrapper[4954]: I1127 16:41:16.019100 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Nov 27 16:41:16 crc kubenswrapper[4954]: I1127 16:41:16.019276 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Nov 27 16:41:16 crc kubenswrapper[4954]: I1127 16:41:16.038488 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Nov 27 16:41:16 crc kubenswrapper[4954]: I1127 16:41:16.132670 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ec488d33-1438-48e5-9ce0-8ad56cf120f0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ec488d33-1438-48e5-9ce0-8ad56cf120f0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 27 16:41:16 crc kubenswrapper[4954]: I1127 16:41:16.133040 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ec488d33-1438-48e5-9ce0-8ad56cf120f0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ec488d33-1438-48e5-9ce0-8ad56cf120f0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 27 16:41:16 crc kubenswrapper[4954]: I1127 16:41:16.229358 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:41:16 crc kubenswrapper[4954]: I1127 16:41:16.234885 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ec488d33-1438-48e5-9ce0-8ad56cf120f0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ec488d33-1438-48e5-9ce0-8ad56cf120f0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 27 16:41:16 crc kubenswrapper[4954]: I1127 16:41:16.235039 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ec488d33-1438-48e5-9ce0-8ad56cf120f0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ec488d33-1438-48e5-9ce0-8ad56cf120f0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 27 16:41:16 crc kubenswrapper[4954]: I1127 16:41:16.235558 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ec488d33-1438-48e5-9ce0-8ad56cf120f0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ec488d33-1438-48e5-9ce0-8ad56cf120f0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 27 16:41:16 crc kubenswrapper[4954]: I1127 16:41:16.263622 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ec488d33-1438-48e5-9ce0-8ad56cf120f0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: 
\"ec488d33-1438-48e5-9ce0-8ad56cf120f0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 27 16:41:16 crc kubenswrapper[4954]: I1127 16:41:16.385556 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 27 16:41:19 crc kubenswrapper[4954]: E1127 16:41:19.148679 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-gflq5" podUID="b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83" Nov 27 16:41:21 crc kubenswrapper[4954]: I1127 16:41:21.016023 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Nov 27 16:41:21 crc kubenswrapper[4954]: I1127 16:41:21.020555 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 27 16:41:21 crc kubenswrapper[4954]: I1127 16:41:21.028805 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Nov 27 16:41:21 crc kubenswrapper[4954]: I1127 16:41:21.106670 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b8ad314e-8035-408c-b58d-c41c88fc40fc-var-lock\") pod \"installer-9-crc\" (UID: \"b8ad314e-8035-408c-b58d-c41c88fc40fc\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 27 16:41:21 crc kubenswrapper[4954]: I1127 16:41:21.106821 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b8ad314e-8035-408c-b58d-c41c88fc40fc-kubelet-dir\") pod \"installer-9-crc\" (UID: \"b8ad314e-8035-408c-b58d-c41c88fc40fc\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 27 16:41:21 crc kubenswrapper[4954]: I1127 16:41:21.106873 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b8ad314e-8035-408c-b58d-c41c88fc40fc-kube-api-access\") pod \"installer-9-crc\" (UID: \"b8ad314e-8035-408c-b58d-c41c88fc40fc\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 27 16:41:21 crc kubenswrapper[4954]: I1127 16:41:21.208150 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b8ad314e-8035-408c-b58d-c41c88fc40fc-kubelet-dir\") pod \"installer-9-crc\" (UID: \"b8ad314e-8035-408c-b58d-c41c88fc40fc\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 27 16:41:21 crc kubenswrapper[4954]: I1127 16:41:21.208216 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b8ad314e-8035-408c-b58d-c41c88fc40fc-kube-api-access\") pod \"installer-9-crc\" (UID: \"b8ad314e-8035-408c-b58d-c41c88fc40fc\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 27 16:41:21 crc kubenswrapper[4954]: I1127 16:41:21.208252 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b8ad314e-8035-408c-b58d-c41c88fc40fc-var-lock\") pod \"installer-9-crc\" (UID: \"b8ad314e-8035-408c-b58d-c41c88fc40fc\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 27 16:41:21 crc kubenswrapper[4954]: I1127 16:41:21.208279 
4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b8ad314e-8035-408c-b58d-c41c88fc40fc-kubelet-dir\") pod \"installer-9-crc\" (UID: \"b8ad314e-8035-408c-b58d-c41c88fc40fc\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 27 16:41:21 crc kubenswrapper[4954]: I1127 16:41:21.208335 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b8ad314e-8035-408c-b58d-c41c88fc40fc-var-lock\") pod \"installer-9-crc\" (UID: \"b8ad314e-8035-408c-b58d-c41c88fc40fc\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 27 16:41:21 crc kubenswrapper[4954]: I1127 16:41:21.228884 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b8ad314e-8035-408c-b58d-c41c88fc40fc-kube-api-access\") pod \"installer-9-crc\" (UID: \"b8ad314e-8035-408c-b58d-c41c88fc40fc\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 27 16:41:21 crc kubenswrapper[4954]: I1127 16:41:21.360276 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 27 16:41:22 crc kubenswrapper[4954]: E1127 16:41:22.061993 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Nov 27 16:41:22 crc kubenswrapper[4954]: E1127 16:41:22.062192 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4b9p5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-kxdd6_openshift-marketplace(c2e52841-3471-4d68-af5b-4c26ac223800): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 27 16:41:22 crc kubenswrapper[4954]: E1127 16:41:22.065698 4954 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-kxdd6" podUID="c2e52841-3471-4d68-af5b-4c26ac223800" Nov 27 16:41:23 crc kubenswrapper[4954]: I1127 16:41:23.688483 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:41:23 crc kubenswrapper[4954]: I1127 16:41:23.688569 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 16:41:23 crc kubenswrapper[4954]: E1127 16:41:23.886760 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-kxdd6" podUID="c2e52841-3471-4d68-af5b-4c26ac223800" Nov 27 16:41:26 crc kubenswrapper[4954]: E1127 16:41:26.319954 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Nov 27 16:41:26 crc kubenswrapper[4954]: E1127 16:41:26.320173 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-65plc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-6vln6_openshift-marketplace(8ff1ec67-d5a8-4612-874b-4324db52c148): ErrImagePull: rpc error: code = Canceled desc = copying system 
image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 27 16:41:26 crc kubenswrapper[4954]: E1127 16:41:26.321436 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-6vln6" podUID="8ff1ec67-d5a8-4612-874b-4324db52c148" Nov 27 16:41:27 crc kubenswrapper[4954]: E1127 16:41:27.154152 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-6vln6" podUID="8ff1ec67-d5a8-4612-874b-4324db52c148" Nov 27 16:41:27 crc kubenswrapper[4954]: I1127 16:41:27.613310 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-hgsvh"] Nov 27 16:41:27 crc kubenswrapper[4954]: I1127 16:41:27.657088 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Nov 27 16:41:27 crc kubenswrapper[4954]: I1127 16:41:27.660857 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Nov 27 16:41:27 crc kubenswrapper[4954]: I1127 16:41:27.828032 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"ec488d33-1438-48e5-9ce0-8ad56cf120f0","Type":"ContainerStarted","Data":"b7e3653bf7fb5ea969ad009ec03898d265023c833e41e3cdebfc42b12888fbd1"} Nov 27 16:41:27 crc kubenswrapper[4954]: I1127 16:41:27.829551 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hgsvh" event={"ID":"af5183f4-5f46-4d64-8ec4-c7b71530cad6","Type":"ContainerStarted","Data":"f1a717bfd09459f861e80c2e390103d96de13865868d34cfa43c83e335e706ad"} Nov 27 16:41:27 crc kubenswrapper[4954]: I1127 16:41:27.830885 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b8ad314e-8035-408c-b58d-c41c88fc40fc","Type":"ContainerStarted","Data":"e8a99a1194cd0f52c4d5c801db887f3fbaf531a6328e3444de6eaf4c44fb1256"} Nov 27 16:41:28 crc kubenswrapper[4954]: I1127 16:41:28.838074 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hgsvh" event={"ID":"af5183f4-5f46-4d64-8ec4-c7b71530cad6","Type":"ContainerStarted","Data":"b0f9940e712959543013aec192e3e66ffc9d358a731d0305ca953f351d907d8d"} Nov 27 16:41:28 crc kubenswrapper[4954]: E1127 16:41:28.939282 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Nov 27 16:41:28 crc kubenswrapper[4954]: E1127 16:41:28.939448 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n8v4w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-4jtcn_openshift-marketplace(09166f72-95b5-44d5-b265-705e11740e0c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 27 16:41:28 crc kubenswrapper[4954]: E1127 16:41:28.940797 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-4jtcn" podUID="09166f72-95b5-44d5-b265-705e11740e0c" Nov 27 16:41:28 crc kubenswrapper[4954]: E1127 16:41:28.956367 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Nov 27 16:41:28 crc kubenswrapper[4954]: E1127 16:41:28.956498 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8swts,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-kf5zg_openshift-marketplace(24418386-5057-476b-8a29-ad6cf52869f2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 27 16:41:28 crc kubenswrapper[4954]: E1127 16:41:28.957754 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-kf5zg" podUID="24418386-5057-476b-8a29-ad6cf52869f2" Nov 27 16:41:29 crc kubenswrapper[4954]: E1127 16:41:29.058959 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Nov 27 16:41:29 crc kubenswrapper[4954]: E1127 16:41:29.059567 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dwsbm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-5sxql_openshift-marketplace(6a256eb1-104a-4da8-b3e7-90eb5c475460): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 27 16:41:29 crc kubenswrapper[4954]: E1127 16:41:29.061505 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-5sxql" podUID="6a256eb1-104a-4da8-b3e7-90eb5c475460" Nov 27 16:41:29 crc kubenswrapper[4954]: E1127 16:41:29.160367 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Nov 27 16:41:29 crc kubenswrapper[4954]: E1127 16:41:29.160531 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pl6wz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-wdwtv_openshift-marketplace(51999cf2-62d7-4ee2-ae9f-b1ac606facb5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 27 16:41:29 crc kubenswrapper[4954]: E1127 16:41:29.161731 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-wdwtv" podUID="51999cf2-62d7-4ee2-ae9f-b1ac606facb5" Nov 27 16:41:29 crc kubenswrapper[4954]: E1127 16:41:29.267232 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Nov 27 16:41:29 crc kubenswrapper[4954]: E1127 16:41:29.267412 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7f5sl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-vhmbf_openshift-marketplace(c8c116b3-5000-4043-a04f-ee79ff08a37d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 27 16:41:29 crc kubenswrapper[4954]: E1127 16:41:29.268869 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-vhmbf" podUID="c8c116b3-5000-4043-a04f-ee79ff08a37d" Nov 27 16:41:29 crc kubenswrapper[4954]: I1127 16:41:29.846164 4954 generic.go:334] "Generic (PLEG): container finished" podID="ec488d33-1438-48e5-9ce0-8ad56cf120f0" containerID="4f9a294e73984b93ba83d4b1086a14a973b3b70b571ed69ce341dcae722517bc" exitCode=0 Nov 27 16:41:29 crc kubenswrapper[4954]: I1127 16:41:29.846199 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"ec488d33-1438-48e5-9ce0-8ad56cf120f0","Type":"ContainerDied","Data":"4f9a294e73984b93ba83d4b1086a14a973b3b70b571ed69ce341dcae722517bc"} Nov 27 16:41:29 crc kubenswrapper[4954]: I1127 16:41:29.848873 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hgsvh" event={"ID":"af5183f4-5f46-4d64-8ec4-c7b71530cad6","Type":"ContainerStarted","Data":"16e45f5708056ae1ba3041e6508f5b859b99cd6b5f2082929b4e9ff2a770c390"} Nov 27 16:41:29 crc kubenswrapper[4954]: I1127 16:41:29.850964 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b8ad314e-8035-408c-b58d-c41c88fc40fc","Type":"ContainerStarted","Data":"14d36211a64d811641f9456b97b5fe2cc44f53c5e83f915bb5a7d6d3f3b1cc39"} Nov 27 16:41:29 crc kubenswrapper[4954]: E1127 16:41:29.853479 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-5sxql" 
podUID="6a256eb1-104a-4da8-b3e7-90eb5c475460" Nov 27 16:41:29 crc kubenswrapper[4954]: E1127 16:41:29.853697 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-wdwtv" podUID="51999cf2-62d7-4ee2-ae9f-b1ac606facb5" Nov 27 16:41:29 crc kubenswrapper[4954]: E1127 16:41:29.853700 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-4jtcn" podUID="09166f72-95b5-44d5-b265-705e11740e0c" Nov 27 16:41:29 crc kubenswrapper[4954]: E1127 16:41:29.853699 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-vhmbf" podUID="c8c116b3-5000-4043-a04f-ee79ff08a37d" Nov 27 16:41:29 crc kubenswrapper[4954]: E1127 16:41:29.853783 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-kf5zg" podUID="24418386-5057-476b-8a29-ad6cf52869f2" Nov 27 16:41:29 crc kubenswrapper[4954]: I1127 16:41:29.883196 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-hgsvh" podStartSLOduration=181.883176946 podStartE2EDuration="3m1.883176946s" podCreationTimestamp="2025-11-27 16:38:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:41:29.880857437 +0000 UTC m=+201.898297757" watchObservedRunningTime="2025-11-27 16:41:29.883176946 +0000 UTC m=+201.900617246" Nov 27 16:41:29 crc kubenswrapper[4954]: I1127 16:41:29.917991 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=8.917957256 podStartE2EDuration="8.917957256s" podCreationTimestamp="2025-11-27 16:41:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:41:29.916991811 +0000 UTC m=+201.934432111" watchObservedRunningTime="2025-11-27 16:41:29.917957256 +0000 UTC m=+201.935397566" Nov 27 16:41:31 crc kubenswrapper[4954]: I1127 16:41:31.105891 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 27 16:41:31 crc kubenswrapper[4954]: I1127 16:41:31.250434 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ec488d33-1438-48e5-9ce0-8ad56cf120f0-kube-api-access\") pod \"ec488d33-1438-48e5-9ce0-8ad56cf120f0\" (UID: \"ec488d33-1438-48e5-9ce0-8ad56cf120f0\") " Nov 27 16:41:31 crc kubenswrapper[4954]: I1127 16:41:31.250529 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ec488d33-1438-48e5-9ce0-8ad56cf120f0-kubelet-dir\") pod \"ec488d33-1438-48e5-9ce0-8ad56cf120f0\" (UID: \"ec488d33-1438-48e5-9ce0-8ad56cf120f0\") " Nov 27 16:41:31 crc kubenswrapper[4954]: I1127 16:41:31.250652 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec488d33-1438-48e5-9ce0-8ad56cf120f0-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ec488d33-1438-48e5-9ce0-8ad56cf120f0" (UID: "ec488d33-1438-48e5-9ce0-8ad56cf120f0"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 16:41:31 crc kubenswrapper[4954]: I1127 16:41:31.250855 4954 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ec488d33-1438-48e5-9ce0-8ad56cf120f0-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 27 16:41:31 crc kubenswrapper[4954]: I1127 16:41:31.256403 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec488d33-1438-48e5-9ce0-8ad56cf120f0-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ec488d33-1438-48e5-9ce0-8ad56cf120f0" (UID: "ec488d33-1438-48e5-9ce0-8ad56cf120f0"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:41:31 crc kubenswrapper[4954]: I1127 16:41:31.352244 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ec488d33-1438-48e5-9ce0-8ad56cf120f0-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 27 16:41:31 crc kubenswrapper[4954]: I1127 16:41:31.864587 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"ec488d33-1438-48e5-9ce0-8ad56cf120f0","Type":"ContainerDied","Data":"b7e3653bf7fb5ea969ad009ec03898d265023c833e41e3cdebfc42b12888fbd1"} Nov 27 16:41:31 crc kubenswrapper[4954]: I1127 16:41:31.864810 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7e3653bf7fb5ea969ad009ec03898d265023c833e41e3cdebfc42b12888fbd1" Nov 27 16:41:31 crc kubenswrapper[4954]: I1127 16:41:31.864659 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 27 16:41:34 crc kubenswrapper[4954]: I1127 16:41:34.882428 4954 generic.go:334] "Generic (PLEG): container finished" podID="b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83" containerID="063ef05aef7c547a46234c9f0b5be5e6b07d3f0f09702f66f17c78bd16d43489" exitCode=0 Nov 27 16:41:34 crc kubenswrapper[4954]: I1127 16:41:34.882519 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gflq5" event={"ID":"b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83","Type":"ContainerDied","Data":"063ef05aef7c547a46234c9f0b5be5e6b07d3f0f09702f66f17c78bd16d43489"} Nov 27 16:41:35 crc kubenswrapper[4954]: I1127 16:41:35.892452 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gflq5" event={"ID":"b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83","Type":"ContainerStarted","Data":"ddcc1c52456bd3c0e97a9ddc0f8eb2de47111342842512375680fb8c38f213c1"} Nov 27 16:41:36 crc kubenswrapper[4954]: I1127 16:41:36.392962 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gflq5" podStartSLOduration=4.130852257 podStartE2EDuration="59.392941687s" podCreationTimestamp="2025-11-27 16:40:37 +0000 UTC" firstStartedPulling="2025-11-27 16:40:40.284448082 +0000 UTC m=+152.301888382" lastFinishedPulling="2025-11-27 16:41:35.546537512 +0000 UTC m=+207.563977812" observedRunningTime="2025-11-27 16:41:35.917092577 +0000 UTC m=+207.934532877" watchObservedRunningTime="2025-11-27 16:41:36.392941687 +0000 UTC m=+208.410381987" Nov 27 16:41:36 crc kubenswrapper[4954]: I1127 16:41:36.402424 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6m2df"] Nov 27 16:41:37 crc kubenswrapper[4954]: I1127 16:41:37.902956 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxdd6" event={"ID":"c2e52841-3471-4d68-af5b-4c26ac223800","Type":"ContainerStarted","Data":"029f0d1f4d989692e56e0cb2fd1ee651dd41455f80232b45bb38a073df2139e7"} Nov 27 16:41:37 crc kubenswrapper[4954]: I1127 16:41:37.988961 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gflq5" Nov 27 16:41:37 crc kubenswrapper[4954]: I1127 16:41:37.989027 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gflq5" Nov 27 16:41:38 crc kubenswrapper[4954]: I1127 16:41:38.115114 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gflq5" Nov 27 16:41:38 crc kubenswrapper[4954]: I1127 16:41:38.917869 4954 generic.go:334] "Generic (PLEG): container finished" podID="c2e52841-3471-4d68-af5b-4c26ac223800" containerID="029f0d1f4d989692e56e0cb2fd1ee651dd41455f80232b45bb38a073df2139e7" exitCode=0 Nov 27 16:41:38 crc kubenswrapper[4954]: I1127 16:41:38.917984 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxdd6" event={"ID":"c2e52841-3471-4d68-af5b-4c26ac223800","Type":"ContainerDied","Data":"029f0d1f4d989692e56e0cb2fd1ee651dd41455f80232b45bb38a073df2139e7"} Nov 27 16:41:40 crc kubenswrapper[4954]: I1127 16:41:40.934546 4954 generic.go:334] "Generic (PLEG): container finished" podID="8ff1ec67-d5a8-4612-874b-4324db52c148" containerID="ac8f0f9907924436c312b75d28264cab8e64cd3a23aca576771ebc4374a5b52a" exitCode=0 Nov 27 16:41:40 crc kubenswrapper[4954]: 
I1127 16:41:40.934635 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6vln6" event={"ID":"8ff1ec67-d5a8-4612-874b-4324db52c148","Type":"ContainerDied","Data":"ac8f0f9907924436c312b75d28264cab8e64cd3a23aca576771ebc4374a5b52a"} Nov 27 16:41:40 crc kubenswrapper[4954]: I1127 16:41:40.937082 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxdd6" event={"ID":"c2e52841-3471-4d68-af5b-4c26ac223800","Type":"ContainerStarted","Data":"1a7f5493e4c79ca3f27e8085b89da595ae48972a91b2e7b072d1cb44fc153403"} Nov 27 16:41:40 crc kubenswrapper[4954]: I1127 16:41:40.984565 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kxdd6" podStartSLOduration=3.607862437 podStartE2EDuration="1m0.984542315s" podCreationTimestamp="2025-11-27 16:40:40 +0000 UTC" firstStartedPulling="2025-11-27 16:40:42.414274307 +0000 UTC m=+154.431714607" lastFinishedPulling="2025-11-27 16:41:39.790954185 +0000 UTC m=+211.808394485" observedRunningTime="2025-11-27 16:41:40.978639366 +0000 UTC m=+212.996079666" watchObservedRunningTime="2025-11-27 16:41:40.984542315 +0000 UTC m=+213.001982615" Nov 27 16:41:41 crc kubenswrapper[4954]: I1127 16:41:41.118613 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kxdd6" Nov 27 16:41:41 crc kubenswrapper[4954]: I1127 16:41:41.118654 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kxdd6" Nov 27 16:41:41 crc kubenswrapper[4954]: I1127 16:41:41.945642 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6vln6" event={"ID":"8ff1ec67-d5a8-4612-874b-4324db52c148","Type":"ContainerStarted","Data":"d6fcfaae76ae1a07837320c26260bc0b19e294c26ea88960aa8bc7de5069e23e"} Nov 27 16:41:42 crc kubenswrapper[4954]: I1127 16:41:42.164733 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kxdd6" podUID="c2e52841-3471-4d68-af5b-4c26ac223800" containerName="registry-server" probeResult="failure" output=< Nov 27 16:41:42 crc kubenswrapper[4954]: timeout: failed to connect service ":50051" within 1s Nov 27 16:41:42 crc kubenswrapper[4954]: > Nov 27 16:41:42 crc kubenswrapper[4954]: I1127 16:41:42.704828 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6vln6" podStartSLOduration=3.601198784 podStartE2EDuration="1m3.704796322s" podCreationTimestamp="2025-11-27 16:40:39 +0000 UTC" firstStartedPulling="2025-11-27 16:40:41.32765712 +0000 UTC m=+153.345097420" lastFinishedPulling="2025-11-27 16:41:41.431254658 +0000 UTC m=+213.448694958" observedRunningTime="2025-11-27 16:41:41.964135181 +0000 UTC m=+213.981575481" watchObservedRunningTime="2025-11-27 16:41:42.704796322 +0000 UTC m=+214.722236622" Nov 27 16:41:42 crc kubenswrapper[4954]: I1127 16:41:42.950631 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vhmbf" event={"ID":"c8c116b3-5000-4043-a04f-ee79ff08a37d","Type":"ContainerStarted","Data":"f5a21adf079952ca6b460e066f6a04b83ff8f1463c64b79344348fe3907579d1"} Nov 27 16:41:43 crc kubenswrapper[4954]: I1127 16:41:43.957889 4954 generic.go:334] "Generic (PLEG): container finished" podID="24418386-5057-476b-8a29-ad6cf52869f2" containerID="da48343028814e31eb8ea1d3a36a15b47ce149dba8377534a3944f49d68c310e" exitCode=0 Nov 
27 16:41:43 crc kubenswrapper[4954]: I1127 16:41:43.958059 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kf5zg" event={"ID":"24418386-5057-476b-8a29-ad6cf52869f2","Type":"ContainerDied","Data":"da48343028814e31eb8ea1d3a36a15b47ce149dba8377534a3944f49d68c310e"} Nov 27 16:41:43 crc kubenswrapper[4954]: I1127 16:41:43.965785 4954 generic.go:334] "Generic (PLEG): container finished" podID="c8c116b3-5000-4043-a04f-ee79ff08a37d" containerID="f5a21adf079952ca6b460e066f6a04b83ff8f1463c64b79344348fe3907579d1" exitCode=0 Nov 27 16:41:43 crc kubenswrapper[4954]: I1127 16:41:43.965848 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vhmbf" event={"ID":"c8c116b3-5000-4043-a04f-ee79ff08a37d","Type":"ContainerDied","Data":"f5a21adf079952ca6b460e066f6a04b83ff8f1463c64b79344348fe3907579d1"} Nov 27 16:41:44 crc kubenswrapper[4954]: I1127 16:41:44.971788 4954 generic.go:334] "Generic (PLEG): container finished" podID="6a256eb1-104a-4da8-b3e7-90eb5c475460" containerID="56fccd0c80c6627557145bce83b5af665099b5e6efcf3076fde5b5cd3a14e312" exitCode=0 Nov 27 16:41:44 crc kubenswrapper[4954]: I1127 16:41:44.971933 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5sxql" event={"ID":"6a256eb1-104a-4da8-b3e7-90eb5c475460","Type":"ContainerDied","Data":"56fccd0c80c6627557145bce83b5af665099b5e6efcf3076fde5b5cd3a14e312"} Nov 27 16:41:44 crc kubenswrapper[4954]: I1127 16:41:44.975218 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vhmbf" event={"ID":"c8c116b3-5000-4043-a04f-ee79ff08a37d","Type":"ContainerStarted","Data":"5a5e78672daf61ac2e94c996f178c22e8574436d2f1345a48ec7b123734cddbd"} Nov 27 16:41:44 crc kubenswrapper[4954]: I1127 16:41:44.977974 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kf5zg" event={"ID":"24418386-5057-476b-8a29-ad6cf52869f2","Type":"ContainerStarted","Data":"fa012e23c6ac12bc17e4899f460833c71faf6176b15cbd073107e00651566983"} Nov 27 16:41:45 crc kubenswrapper[4954]: I1127 16:41:45.061158 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kf5zg" podStartSLOduration=2.981032741 podStartE2EDuration="1m6.061139223s" podCreationTimestamp="2025-11-27 16:40:39 +0000 UTC" firstStartedPulling="2025-11-27 16:40:41.33124127 +0000 UTC m=+153.348681560" lastFinishedPulling="2025-11-27 16:41:44.411347742 +0000 UTC m=+216.428788042" observedRunningTime="2025-11-27 16:41:45.058066835 +0000 UTC m=+217.075507145" watchObservedRunningTime="2025-11-27 16:41:45.061139223 +0000 UTC m=+217.078579533" Nov 27 16:41:45 crc kubenswrapper[4954]: I1127 16:41:45.088041 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vhmbf" podStartSLOduration=2.726833772 podStartE2EDuration="1m5.088023483s" podCreationTimestamp="2025-11-27 16:40:40 +0000 UTC" firstStartedPulling="2025-11-27 16:40:42.382462219 +0000 UTC m=+154.399902519" lastFinishedPulling="2025-11-27 16:41:44.74365193 +0000 UTC m=+216.761092230" observedRunningTime="2025-11-27 16:41:45.087048668 +0000 UTC m=+217.104488958" watchObservedRunningTime="2025-11-27 16:41:45.088023483 +0000 UTC m=+217.105463783" Nov 27 16:41:46 crc kubenswrapper[4954]: I1127 16:41:46.993763 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5sxql" 
event={"ID":"6a256eb1-104a-4da8-b3e7-90eb5c475460","Type":"ContainerStarted","Data":"72256241ed2f2968a810de1063c5077208e0fe334530f9435777439a84389fee"} Nov 27 16:41:46 crc kubenswrapper[4954]: I1127 16:41:46.996244 4954 generic.go:334] "Generic (PLEG): container finished" podID="09166f72-95b5-44d5-b265-705e11740e0c" containerID="278b2cadd8a23a424bc70457092ec2c6e497ab3682bfe32e194820a210d09431" exitCode=0 Nov 27 16:41:46 crc kubenswrapper[4954]: I1127 16:41:46.996307 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4jtcn" event={"ID":"09166f72-95b5-44d5-b265-705e11740e0c","Type":"ContainerDied","Data":"278b2cadd8a23a424bc70457092ec2c6e497ab3682bfe32e194820a210d09431"} Nov 27 16:41:47 crc kubenswrapper[4954]: I1127 16:41:47.002129 4954 generic.go:334] "Generic (PLEG): container finished" podID="51999cf2-62d7-4ee2-ae9f-b1ac606facb5" containerID="1b6a430a39d2f8e92183723141e67e097e003d1d283c2d5b546733a37de1f6d7" exitCode=0 Nov 27 16:41:47 crc kubenswrapper[4954]: I1127 16:41:47.002168 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wdwtv" event={"ID":"51999cf2-62d7-4ee2-ae9f-b1ac606facb5","Type":"ContainerDied","Data":"1b6a430a39d2f8e92183723141e67e097e003d1d283c2d5b546733a37de1f6d7"} Nov 27 16:41:47 crc kubenswrapper[4954]: I1127 16:41:47.015295 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5sxql" podStartSLOduration=4.3273412350000005 podStartE2EDuration="1m10.015269937s" podCreationTimestamp="2025-11-27 16:40:37 +0000 UTC" firstStartedPulling="2025-11-27 16:40:40.302700509 +0000 UTC m=+152.320140809" lastFinishedPulling="2025-11-27 16:41:45.990629211 +0000 UTC m=+218.008069511" observedRunningTime="2025-11-27 16:41:47.01065575 +0000 UTC m=+219.028096050" watchObservedRunningTime="2025-11-27 16:41:47.015269937 +0000 UTC m=+219.032710247" Nov 27 16:41:48 crc kubenswrapper[4954]: I1127 16:41:48.009453 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wdwtv" event={"ID":"51999cf2-62d7-4ee2-ae9f-b1ac606facb5","Type":"ContainerStarted","Data":"6a893d5f338d04ee1f070a24dc8a1044201013e4b91cbcb55746bdd764669c3b"} Nov 27 16:41:48 crc kubenswrapper[4954]: I1127 16:41:48.012954 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4jtcn" event={"ID":"09166f72-95b5-44d5-b265-705e11740e0c","Type":"ContainerStarted","Data":"799bc3be0805eb8f38ed3bd26d773dd52e0a1c406655de516bc15c847206adcf"} Nov 27 16:41:48 crc kubenswrapper[4954]: I1127 16:41:48.029820 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wdwtv" podStartSLOduration=2.6315909079999997 podStartE2EDuration="1m11.029804577s" podCreationTimestamp="2025-11-27 16:40:37 +0000 UTC" firstStartedPulling="2025-11-27 16:40:39.12064867 +0000 UTC m=+151.138088970" lastFinishedPulling="2025-11-27 16:41:47.518862339 +0000 UTC m=+219.536302639" observedRunningTime="2025-11-27 16:41:48.027133019 +0000 UTC m=+220.044573319" watchObservedRunningTime="2025-11-27 16:41:48.029804577 +0000 UTC m=+220.047244867" Nov 27 16:41:48 crc kubenswrapper[4954]: I1127 16:41:48.047361 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4jtcn" podStartSLOduration=3.7981392659999997 podStartE2EDuration="1m11.047342121s" podCreationTimestamp="2025-11-27 16:40:37 +0000 UTC" 
firstStartedPulling="2025-11-27 16:40:40.216691623 +0000 UTC m=+152.234131923" lastFinishedPulling="2025-11-27 16:41:47.465894478 +0000 UTC m=+219.483334778" observedRunningTime="2025-11-27 16:41:48.046807207 +0000 UTC m=+220.064247507" watchObservedRunningTime="2025-11-27 16:41:48.047342121 +0000 UTC m=+220.064782421" Nov 27 16:41:48 crc kubenswrapper[4954]: I1127 16:41:48.050740 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gflq5" Nov 27 16:41:48 crc kubenswrapper[4954]: I1127 16:41:48.138085 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5sxql" Nov 27 16:41:48 crc kubenswrapper[4954]: I1127 16:41:48.138157 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5sxql" Nov 27 16:41:48 crc kubenswrapper[4954]: I1127 16:41:48.182125 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5sxql" Nov 27 16:41:49 crc kubenswrapper[4954]: I1127 16:41:49.684187 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6vln6" Nov 27 16:41:49 crc kubenswrapper[4954]: I1127 16:41:49.684539 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6vln6" Nov 27 16:41:49 crc kubenswrapper[4954]: I1127 16:41:49.723728 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6vln6" Nov 27 16:41:50 crc kubenswrapper[4954]: I1127 16:41:50.057823 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6vln6" Nov 27 16:41:50 crc kubenswrapper[4954]: I1127 16:41:50.089598 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kf5zg" Nov 27 16:41:50 crc kubenswrapper[4954]: I1127 16:41:50.089653 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kf5zg" Nov 27 16:41:50 crc kubenswrapper[4954]: I1127 16:41:50.167518 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kf5zg" Nov 27 16:41:50 crc kubenswrapper[4954]: I1127 16:41:50.800097 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vhmbf" Nov 27 16:41:50 crc kubenswrapper[4954]: I1127 16:41:50.800161 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vhmbf" Nov 27 16:41:50 crc kubenswrapper[4954]: I1127 16:41:50.856656 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vhmbf" Nov 27 16:41:51 crc kubenswrapper[4954]: I1127 16:41:51.067384 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kf5zg" Nov 27 16:41:51 crc kubenswrapper[4954]: I1127 16:41:51.077286 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vhmbf" Nov 27 16:41:51 crc kubenswrapper[4954]: I1127 16:41:51.179102 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kxdd6" Nov 27 16:41:51 crc 
kubenswrapper[4954]: I1127 16:41:51.223648 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kxdd6" Nov 27 16:41:52 crc kubenswrapper[4954]: I1127 16:41:52.114071 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gflq5"] Nov 27 16:41:52 crc kubenswrapper[4954]: I1127 16:41:52.114455 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gflq5" podUID="b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83" containerName="registry-server" containerID="cri-o://ddcc1c52456bd3c0e97a9ddc0f8eb2de47111342842512375680fb8c38f213c1" gracePeriod=2 Nov 27 16:41:52 crc kubenswrapper[4954]: I1127 16:41:52.571515 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gflq5" Nov 27 16:41:52 crc kubenswrapper[4954]: I1127 16:41:52.689345 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csk6m\" (UniqueName: \"kubernetes.io/projected/b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83-kube-api-access-csk6m\") pod \"b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83\" (UID: \"b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83\") " Nov 27 16:41:52 crc kubenswrapper[4954]: I1127 16:41:52.689429 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83-catalog-content\") pod \"b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83\" (UID: \"b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83\") " Nov 27 16:41:52 crc kubenswrapper[4954]: I1127 16:41:52.689501 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83-utilities\") pod \"b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83\" (UID: \"b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83\") " Nov 27 16:41:52 crc kubenswrapper[4954]: I1127 16:41:52.690350 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83-utilities" (OuterVolumeSpecName: "utilities") pod "b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83" (UID: "b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:41:52 crc kubenswrapper[4954]: I1127 16:41:52.695543 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83-kube-api-access-csk6m" (OuterVolumeSpecName: "kube-api-access-csk6m") pod "b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83" (UID: "b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83"). InnerVolumeSpecName "kube-api-access-csk6m". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:41:52 crc kubenswrapper[4954]: I1127 16:41:52.746897 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83" (UID: "b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:41:52 crc kubenswrapper[4954]: I1127 16:41:52.790770 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csk6m\" (UniqueName: \"kubernetes.io/projected/b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83-kube-api-access-csk6m\") on node \"crc\" DevicePath \"\"" Nov 27 16:41:52 crc kubenswrapper[4954]: I1127 16:41:52.790815 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 16:41:52 crc kubenswrapper[4954]: I1127 16:41:52.790825 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 16:41:53 crc kubenswrapper[4954]: I1127 16:41:53.044098 4954 generic.go:334] "Generic (PLEG): container finished" podID="b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83" containerID="ddcc1c52456bd3c0e97a9ddc0f8eb2de47111342842512375680fb8c38f213c1" exitCode=0 Nov 27 16:41:53 crc kubenswrapper[4954]: I1127 16:41:53.044148 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gflq5" event={"ID":"b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83","Type":"ContainerDied","Data":"ddcc1c52456bd3c0e97a9ddc0f8eb2de47111342842512375680fb8c38f213c1"} Nov 27 16:41:53 crc kubenswrapper[4954]: I1127 16:41:53.044175 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gflq5" event={"ID":"b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83","Type":"ContainerDied","Data":"88b237adbf82dd3b012178f1bb97a39c1f6fd596710626d04af2560c973d76da"} Nov 27 16:41:53 crc kubenswrapper[4954]: I1127 16:41:53.044169 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gflq5" Nov 27 16:41:53 crc kubenswrapper[4954]: I1127 16:41:53.044190 4954 scope.go:117] "RemoveContainer" containerID="ddcc1c52456bd3c0e97a9ddc0f8eb2de47111342842512375680fb8c38f213c1" Nov 27 16:41:53 crc kubenswrapper[4954]: I1127 16:41:53.073361 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gflq5"] Nov 27 16:41:53 crc kubenswrapper[4954]: I1127 16:41:53.076042 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gflq5"] Nov 27 16:41:53 crc kubenswrapper[4954]: I1127 16:41:53.077603 4954 scope.go:117] "RemoveContainer" containerID="063ef05aef7c547a46234c9f0b5be5e6b07d3f0f09702f66f17c78bd16d43489" Nov 27 16:41:53 crc kubenswrapper[4954]: I1127 16:41:53.093665 4954 scope.go:117] "RemoveContainer" containerID="a01aafaf342f2617b7961f96358df086bc2b40b7ce7c6416a6f931fc4c5d49ec" Nov 27 16:41:53 crc kubenswrapper[4954]: I1127 16:41:53.127890 4954 scope.go:117] "RemoveContainer" containerID="ddcc1c52456bd3c0e97a9ddc0f8eb2de47111342842512375680fb8c38f213c1" Nov 27 16:41:53 crc kubenswrapper[4954]: E1127 16:41:53.129204 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddcc1c52456bd3c0e97a9ddc0f8eb2de47111342842512375680fb8c38f213c1\": container with ID starting with ddcc1c52456bd3c0e97a9ddc0f8eb2de47111342842512375680fb8c38f213c1 not found: ID does not exist" containerID="ddcc1c52456bd3c0e97a9ddc0f8eb2de47111342842512375680fb8c38f213c1" Nov 27 16:41:53 crc kubenswrapper[4954]: I1127 16:41:53.129236 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddcc1c52456bd3c0e97a9ddc0f8eb2de47111342842512375680fb8c38f213c1"} err="failed to get container status \"ddcc1c52456bd3c0e97a9ddc0f8eb2de47111342842512375680fb8c38f213c1\": rpc error: code = NotFound desc = could not find container \"ddcc1c52456bd3c0e97a9ddc0f8eb2de47111342842512375680fb8c38f213c1\": container with ID starting with ddcc1c52456bd3c0e97a9ddc0f8eb2de47111342842512375680fb8c38f213c1 not found: ID does not exist" Nov 27 16:41:53 crc kubenswrapper[4954]: I1127 16:41:53.129277 4954 scope.go:117] "RemoveContainer" containerID="063ef05aef7c547a46234c9f0b5be5e6b07d3f0f09702f66f17c78bd16d43489" Nov 27 16:41:53 crc kubenswrapper[4954]: E1127 16:41:53.130831 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"063ef05aef7c547a46234c9f0b5be5e6b07d3f0f09702f66f17c78bd16d43489\": container with ID starting with 063ef05aef7c547a46234c9f0b5be5e6b07d3f0f09702f66f17c78bd16d43489 not found: ID does not exist" containerID="063ef05aef7c547a46234c9f0b5be5e6b07d3f0f09702f66f17c78bd16d43489" Nov 27 16:41:53 crc kubenswrapper[4954]: I1127 16:41:53.130866 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"063ef05aef7c547a46234c9f0b5be5e6b07d3f0f09702f66f17c78bd16d43489"} err="failed to get container status \"063ef05aef7c547a46234c9f0b5be5e6b07d3f0f09702f66f17c78bd16d43489\": rpc error: code = NotFound desc = could not find container \"063ef05aef7c547a46234c9f0b5be5e6b07d3f0f09702f66f17c78bd16d43489\": container with ID starting with 063ef05aef7c547a46234c9f0b5be5e6b07d3f0f09702f66f17c78bd16d43489 not found: ID does not exist" Nov 27 16:41:53 crc kubenswrapper[4954]: I1127 16:41:53.130886 4954 scope.go:117] "RemoveContainer" 
containerID="a01aafaf342f2617b7961f96358df086bc2b40b7ce7c6416a6f931fc4c5d49ec" Nov 27 16:41:53 crc kubenswrapper[4954]: E1127 16:41:53.131881 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a01aafaf342f2617b7961f96358df086bc2b40b7ce7c6416a6f931fc4c5d49ec\": container with ID starting with a01aafaf342f2617b7961f96358df086bc2b40b7ce7c6416a6f931fc4c5d49ec not found: ID does not exist" containerID="a01aafaf342f2617b7961f96358df086bc2b40b7ce7c6416a6f931fc4c5d49ec" Nov 27 16:41:53 crc kubenswrapper[4954]: I1127 16:41:53.131906 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a01aafaf342f2617b7961f96358df086bc2b40b7ce7c6416a6f931fc4c5d49ec"} err="failed to get container status \"a01aafaf342f2617b7961f96358df086bc2b40b7ce7c6416a6f931fc4c5d49ec\": rpc error: code = NotFound desc = could not find container \"a01aafaf342f2617b7961f96358df086bc2b40b7ce7c6416a6f931fc4c5d49ec\": container with ID starting with a01aafaf342f2617b7961f96358df086bc2b40b7ce7c6416a6f931fc4c5d49ec not found: ID does not exist" Nov 27 16:41:53 crc kubenswrapper[4954]: I1127 16:41:53.687620 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:41:53 crc kubenswrapper[4954]: I1127 16:41:53.687678 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 16:41:53 crc kubenswrapper[4954]: I1127 16:41:53.687725 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-699qq" Nov 27 16:41:53 crc kubenswrapper[4954]: I1127 16:41:53.688282 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"abf93a27d369fc02df1a4508748705f9bbad044d52db659f35896e60e7a8bdf9"} pod="openshift-machine-config-operator/machine-config-daemon-699qq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 27 16:41:53 crc kubenswrapper[4954]: I1127 16:41:53.688376 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" containerID="cri-o://abf93a27d369fc02df1a4508748705f9bbad044d52db659f35896e60e7a8bdf9" gracePeriod=600 Nov 27 16:41:54 crc kubenswrapper[4954]: I1127 16:41:54.054305 4954 generic.go:334] "Generic (PLEG): container finished" podID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerID="abf93a27d369fc02df1a4508748705f9bbad044d52db659f35896e60e7a8bdf9" exitCode=0 Nov 27 16:41:54 crc kubenswrapper[4954]: I1127 16:41:54.054427 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" event={"ID":"33a80574-7c60-4f19-985b-3ee313cb7bcd","Type":"ContainerDied","Data":"abf93a27d369fc02df1a4508748705f9bbad044d52db659f35896e60e7a8bdf9"} Nov 27 16:41:54 crc kubenswrapper[4954]: 
I1127 16:41:54.315010 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kf5zg"] Nov 27 16:41:54 crc kubenswrapper[4954]: I1127 16:41:54.315256 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kf5zg" podUID="24418386-5057-476b-8a29-ad6cf52869f2" containerName="registry-server" containerID="cri-o://fa012e23c6ac12bc17e4899f460833c71faf6176b15cbd073107e00651566983" gracePeriod=2 Nov 27 16:41:54 crc kubenswrapper[4954]: I1127 16:41:54.514948 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kxdd6"] Nov 27 16:41:54 crc kubenswrapper[4954]: I1127 16:41:54.520299 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kxdd6" podUID="c2e52841-3471-4d68-af5b-4c26ac223800" containerName="registry-server" containerID="cri-o://1a7f5493e4c79ca3f27e8085b89da595ae48972a91b2e7b072d1cb44fc153403" gracePeriod=2 Nov 27 16:41:54 crc kubenswrapper[4954]: I1127 16:41:54.670959 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83" path="/var/lib/kubelet/pods/b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83/volumes" Nov 27 16:41:55 crc kubenswrapper[4954]: I1127 16:41:55.065754 4954 generic.go:334] "Generic (PLEG): container finished" podID="24418386-5057-476b-8a29-ad6cf52869f2" containerID="fa012e23c6ac12bc17e4899f460833c71faf6176b15cbd073107e00651566983" exitCode=0 Nov 27 16:41:55 crc kubenswrapper[4954]: I1127 16:41:55.065879 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kf5zg" event={"ID":"24418386-5057-476b-8a29-ad6cf52869f2","Type":"ContainerDied","Data":"fa012e23c6ac12bc17e4899f460833c71faf6176b15cbd073107e00651566983"} Nov 27 16:41:55 crc kubenswrapper[4954]: I1127 16:41:55.068219 4954 generic.go:334] "Generic (PLEG): container finished" podID="c2e52841-3471-4d68-af5b-4c26ac223800" containerID="1a7f5493e4c79ca3f27e8085b89da595ae48972a91b2e7b072d1cb44fc153403" exitCode=0 Nov 27 16:41:55 crc kubenswrapper[4954]: I1127 16:41:55.068267 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxdd6" event={"ID":"c2e52841-3471-4d68-af5b-4c26ac223800","Type":"ContainerDied","Data":"1a7f5493e4c79ca3f27e8085b89da595ae48972a91b2e7b072d1cb44fc153403"} Nov 27 16:41:55 crc kubenswrapper[4954]: I1127 16:41:55.856558 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kf5zg" Nov 27 16:41:56 crc kubenswrapper[4954]: I1127 16:41:56.040786 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24418386-5057-476b-8a29-ad6cf52869f2-utilities\") pod \"24418386-5057-476b-8a29-ad6cf52869f2\" (UID: \"24418386-5057-476b-8a29-ad6cf52869f2\") " Nov 27 16:41:56 crc kubenswrapper[4954]: I1127 16:41:56.040932 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24418386-5057-476b-8a29-ad6cf52869f2-catalog-content\") pod \"24418386-5057-476b-8a29-ad6cf52869f2\" (UID: \"24418386-5057-476b-8a29-ad6cf52869f2\") " Nov 27 16:41:56 crc kubenswrapper[4954]: I1127 16:41:56.040978 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8swts\" (UniqueName: \"kubernetes.io/projected/24418386-5057-476b-8a29-ad6cf52869f2-kube-api-access-8swts\") pod \"24418386-5057-476b-8a29-ad6cf52869f2\" (UID: \"24418386-5057-476b-8a29-ad6cf52869f2\") " Nov 27 16:41:56 crc kubenswrapper[4954]: I1127 16:41:56.041969 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24418386-5057-476b-8a29-ad6cf52869f2-utilities" (OuterVolumeSpecName: "utilities") pod "24418386-5057-476b-8a29-ad6cf52869f2" (UID: "24418386-5057-476b-8a29-ad6cf52869f2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:41:56 crc kubenswrapper[4954]: I1127 16:41:56.048797 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24418386-5057-476b-8a29-ad6cf52869f2-kube-api-access-8swts" (OuterVolumeSpecName: "kube-api-access-8swts") pod "24418386-5057-476b-8a29-ad6cf52869f2" (UID: "24418386-5057-476b-8a29-ad6cf52869f2"). InnerVolumeSpecName "kube-api-access-8swts". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:41:56 crc kubenswrapper[4954]: I1127 16:41:56.059093 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24418386-5057-476b-8a29-ad6cf52869f2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "24418386-5057-476b-8a29-ad6cf52869f2" (UID: "24418386-5057-476b-8a29-ad6cf52869f2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:41:56 crc kubenswrapper[4954]: I1127 16:41:56.076347 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kf5zg" event={"ID":"24418386-5057-476b-8a29-ad6cf52869f2","Type":"ContainerDied","Data":"f90e1e16487dc44f93f3a3343ebb9c9ed12e71b6e704114b4326009221f878e0"} Nov 27 16:41:56 crc kubenswrapper[4954]: I1127 16:41:56.076350 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kf5zg" Nov 27 16:41:56 crc kubenswrapper[4954]: I1127 16:41:56.076424 4954 scope.go:117] "RemoveContainer" containerID="fa012e23c6ac12bc17e4899f460833c71faf6176b15cbd073107e00651566983" Nov 27 16:41:56 crc kubenswrapper[4954]: I1127 16:41:56.079072 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxdd6" event={"ID":"c2e52841-3471-4d68-af5b-4c26ac223800","Type":"ContainerDied","Data":"f92a1071f76f577a78bd3f389685c304579ac21eef6307a9411d602e3b6a2530"} Nov 27 16:41:56 crc kubenswrapper[4954]: I1127 16:41:56.079114 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f92a1071f76f577a78bd3f389685c304579ac21eef6307a9411d602e3b6a2530" Nov 27 16:41:56 crc kubenswrapper[4954]: I1127 16:41:56.083654 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" event={"ID":"33a80574-7c60-4f19-985b-3ee313cb7bcd","Type":"ContainerStarted","Data":"84093c2be17ea05c39ef1d4a336c22ac7f26980534017a56732b693c785209f4"} Nov 27 16:41:56 crc kubenswrapper[4954]: I1127 16:41:56.085168 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kxdd6" Nov 27 16:41:56 crc kubenswrapper[4954]: I1127 16:41:56.093401 4954 scope.go:117] "RemoveContainer" containerID="da48343028814e31eb8ea1d3a36a15b47ce149dba8377534a3944f49d68c310e" Nov 27 16:41:56 crc kubenswrapper[4954]: I1127 16:41:56.112411 4954 scope.go:117] "RemoveContainer" containerID="2b5c69b4f5fe1eb8a1dcf78f8ae540b4002049677b635b6fac45c61152e8b06e" Nov 27 16:41:56 crc kubenswrapper[4954]: I1127 16:41:56.129993 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kf5zg"] Nov 27 16:41:56 crc kubenswrapper[4954]: I1127 16:41:56.133082 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kf5zg"] Nov 27 16:41:56 crc kubenswrapper[4954]: I1127 16:41:56.142376 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4b9p5\" (UniqueName: \"kubernetes.io/projected/c2e52841-3471-4d68-af5b-4c26ac223800-kube-api-access-4b9p5\") pod \"c2e52841-3471-4d68-af5b-4c26ac223800\" (UID: \"c2e52841-3471-4d68-af5b-4c26ac223800\") " Nov 27 16:41:56 crc kubenswrapper[4954]: I1127 16:41:56.142481 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2e52841-3471-4d68-af5b-4c26ac223800-catalog-content\") pod \"c2e52841-3471-4d68-af5b-4c26ac223800\" (UID: \"c2e52841-3471-4d68-af5b-4c26ac223800\") " Nov 27 16:41:56 crc kubenswrapper[4954]: I1127 16:41:56.142554 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2e52841-3471-4d68-af5b-4c26ac223800-utilities\") pod \"c2e52841-3471-4d68-af5b-4c26ac223800\" (UID: \"c2e52841-3471-4d68-af5b-4c26ac223800\") " Nov 27 16:41:56 crc kubenswrapper[4954]: I1127 16:41:56.142815 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24418386-5057-476b-8a29-ad6cf52869f2-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 16:41:56 crc kubenswrapper[4954]: I1127 16:41:56.142840 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8swts\" (UniqueName: 
\"kubernetes.io/projected/24418386-5057-476b-8a29-ad6cf52869f2-kube-api-access-8swts\") on node \"crc\" DevicePath \"\"" Nov 27 16:41:56 crc kubenswrapper[4954]: I1127 16:41:56.142855 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24418386-5057-476b-8a29-ad6cf52869f2-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 16:41:56 crc kubenswrapper[4954]: I1127 16:41:56.143378 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2e52841-3471-4d68-af5b-4c26ac223800-utilities" (OuterVolumeSpecName: "utilities") pod "c2e52841-3471-4d68-af5b-4c26ac223800" (UID: "c2e52841-3471-4d68-af5b-4c26ac223800"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:41:56 crc kubenswrapper[4954]: I1127 16:41:56.146761 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2e52841-3471-4d68-af5b-4c26ac223800-kube-api-access-4b9p5" (OuterVolumeSpecName: "kube-api-access-4b9p5") pod "c2e52841-3471-4d68-af5b-4c26ac223800" (UID: "c2e52841-3471-4d68-af5b-4c26ac223800"). InnerVolumeSpecName "kube-api-access-4b9p5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:41:56 crc kubenswrapper[4954]: I1127 16:41:56.242341 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2e52841-3471-4d68-af5b-4c26ac223800-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c2e52841-3471-4d68-af5b-4c26ac223800" (UID: "c2e52841-3471-4d68-af5b-4c26ac223800"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:41:56 crc kubenswrapper[4954]: I1127 16:41:56.243598 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2e52841-3471-4d68-af5b-4c26ac223800-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 16:41:56 crc kubenswrapper[4954]: I1127 16:41:56.243704 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4b9p5\" (UniqueName: \"kubernetes.io/projected/c2e52841-3471-4d68-af5b-4c26ac223800-kube-api-access-4b9p5\") on node \"crc\" DevicePath \"\"" Nov 27 16:41:56 crc kubenswrapper[4954]: I1127 16:41:56.243783 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2e52841-3471-4d68-af5b-4c26ac223800-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 16:41:56 crc kubenswrapper[4954]: I1127 16:41:56.669468 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24418386-5057-476b-8a29-ad6cf52869f2" path="/var/lib/kubelet/pods/24418386-5057-476b-8a29-ad6cf52869f2/volumes" Nov 27 16:41:57 crc kubenswrapper[4954]: I1127 16:41:57.092326 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kxdd6" Nov 27 16:41:57 crc kubenswrapper[4954]: I1127 16:41:57.113878 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kxdd6"] Nov 27 16:41:57 crc kubenswrapper[4954]: I1127 16:41:57.116417 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kxdd6"] Nov 27 16:41:57 crc kubenswrapper[4954]: I1127 16:41:57.578359 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wdwtv" Nov 27 16:41:57 crc kubenswrapper[4954]: I1127 16:41:57.578453 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wdwtv" Nov 27 16:41:57 crc kubenswrapper[4954]: I1127 16:41:57.625672 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wdwtv" Nov 27 16:41:57 crc kubenswrapper[4954]: I1127 16:41:57.729982 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4jtcn" Nov 27 16:41:57 crc kubenswrapper[4954]: I1127 16:41:57.730430 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4jtcn" Nov 27 16:41:57 crc kubenswrapper[4954]: I1127 16:41:57.771797 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4jtcn" Nov 27 16:41:58 crc kubenswrapper[4954]: I1127 16:41:58.160416 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wdwtv" Nov 27 16:41:58 crc kubenswrapper[4954]: I1127 16:41:58.167097 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4jtcn" Nov 27 16:41:58 crc kubenswrapper[4954]: I1127 16:41:58.187008 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5sxql" Nov 27 16:41:58 crc kubenswrapper[4954]: I1127 16:41:58.670613 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2e52841-3471-4d68-af5b-4c26ac223800" path="/var/lib/kubelet/pods/c2e52841-3471-4d68-af5b-4c26ac223800/volumes" Nov 27 16:42:00 crc kubenswrapper[4954]: I1127 16:42:00.922047 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5sxql"] Nov 27 16:42:00 crc kubenswrapper[4954]: I1127 16:42:00.922668 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5sxql" podUID="6a256eb1-104a-4da8-b3e7-90eb5c475460" containerName="registry-server" containerID="cri-o://72256241ed2f2968a810de1063c5077208e0fe334530f9435777439a84389fee" gracePeriod=2 Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.117193 4954 generic.go:334] "Generic (PLEG): container finished" podID="6a256eb1-104a-4da8-b3e7-90eb5c475460" containerID="72256241ed2f2968a810de1063c5077208e0fe334530f9435777439a84389fee" exitCode=0 Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.117257 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5sxql" event={"ID":"6a256eb1-104a-4da8-b3e7-90eb5c475460","Type":"ContainerDied","Data":"72256241ed2f2968a810de1063c5077208e0fe334530f9435777439a84389fee"} Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 
16:42:01.266794 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5sxql" Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.423116 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwsbm\" (UniqueName: \"kubernetes.io/projected/6a256eb1-104a-4da8-b3e7-90eb5c475460-kube-api-access-dwsbm\") pod \"6a256eb1-104a-4da8-b3e7-90eb5c475460\" (UID: \"6a256eb1-104a-4da8-b3e7-90eb5c475460\") " Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.423174 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a256eb1-104a-4da8-b3e7-90eb5c475460-catalog-content\") pod \"6a256eb1-104a-4da8-b3e7-90eb5c475460\" (UID: \"6a256eb1-104a-4da8-b3e7-90eb5c475460\") " Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.423205 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a256eb1-104a-4da8-b3e7-90eb5c475460-utilities\") pod \"6a256eb1-104a-4da8-b3e7-90eb5c475460\" (UID: \"6a256eb1-104a-4da8-b3e7-90eb5c475460\") " Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.424159 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a256eb1-104a-4da8-b3e7-90eb5c475460-utilities" (OuterVolumeSpecName: "utilities") pod "6a256eb1-104a-4da8-b3e7-90eb5c475460" (UID: "6a256eb1-104a-4da8-b3e7-90eb5c475460"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.430064 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a256eb1-104a-4da8-b3e7-90eb5c475460-kube-api-access-dwsbm" (OuterVolumeSpecName: "kube-api-access-dwsbm") pod "6a256eb1-104a-4da8-b3e7-90eb5c475460" (UID: "6a256eb1-104a-4da8-b3e7-90eb5c475460"). InnerVolumeSpecName "kube-api-access-dwsbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.445920 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" podUID="916d4ddd-2cd9-4595-a1e1-88f0b3908c95" containerName="oauth-openshift" containerID="cri-o://d820bf3856dbc7fc0be89e1ddbaa00e3acacb889302bb242dc1720cad3f5dc34" gracePeriod=15 Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.497487 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a256eb1-104a-4da8-b3e7-90eb5c475460-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6a256eb1-104a-4da8-b3e7-90eb5c475460" (UID: "6a256eb1-104a-4da8-b3e7-90eb5c475460"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.524231 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwsbm\" (UniqueName: \"kubernetes.io/projected/6a256eb1-104a-4da8-b3e7-90eb5c475460-kube-api-access-dwsbm\") on node \"crc\" DevicePath \"\"" Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.524284 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a256eb1-104a-4da8-b3e7-90eb5c475460-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.524299 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a256eb1-104a-4da8-b3e7-90eb5c475460-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.800822 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.829902 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-system-serving-cert\") pod \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.829964 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-user-idp-0-file-data\") pod \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.829998 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-audit-policies\") pod \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.830030 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-user-template-login\") pod \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.830057 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-system-cliconfig\") pod \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.830092 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-user-template-provider-selection\") pod \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.830141 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-system-trusted-ca-bundle\") pod \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.830197 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-audit-dir\") pod \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.830264 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-system-ocp-branding-template\") pod \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.830319 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-system-session\") pod \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.830346 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "916d4ddd-2cd9-4595-a1e1-88f0b3908c95" (UID: "916d4ddd-2cd9-4595-a1e1-88f0b3908c95"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.830395 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-system-service-ca\") pod \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.830425 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-system-router-certs\") pod \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.830455 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-user-template-error\") pod \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.830507 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbqgs\" (UniqueName: \"kubernetes.io/projected/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-kube-api-access-gbqgs\") pod \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\" (UID: \"916d4ddd-2cd9-4595-a1e1-88f0b3908c95\") " Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.830790 4954 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-audit-dir\") on 
node \"crc\" DevicePath \"\"" Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.831101 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "916d4ddd-2cd9-4595-a1e1-88f0b3908c95" (UID: "916d4ddd-2cd9-4595-a1e1-88f0b3908c95"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.831125 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "916d4ddd-2cd9-4595-a1e1-88f0b3908c95" (UID: "916d4ddd-2cd9-4595-a1e1-88f0b3908c95"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.831568 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "916d4ddd-2cd9-4595-a1e1-88f0b3908c95" (UID: "916d4ddd-2cd9-4595-a1e1-88f0b3908c95"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.831631 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "916d4ddd-2cd9-4595-a1e1-88f0b3908c95" (UID: "916d4ddd-2cd9-4595-a1e1-88f0b3908c95"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.836845 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-kube-api-access-gbqgs" (OuterVolumeSpecName: "kube-api-access-gbqgs") pod "916d4ddd-2cd9-4595-a1e1-88f0b3908c95" (UID: "916d4ddd-2cd9-4595-a1e1-88f0b3908c95"). InnerVolumeSpecName "kube-api-access-gbqgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.836899 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "916d4ddd-2cd9-4595-a1e1-88f0b3908c95" (UID: "916d4ddd-2cd9-4595-a1e1-88f0b3908c95"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.837954 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "916d4ddd-2cd9-4595-a1e1-88f0b3908c95" (UID: "916d4ddd-2cd9-4595-a1e1-88f0b3908c95"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.839956 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "916d4ddd-2cd9-4595-a1e1-88f0b3908c95" (UID: "916d4ddd-2cd9-4595-a1e1-88f0b3908c95"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.841526 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "916d4ddd-2cd9-4595-a1e1-88f0b3908c95" (UID: "916d4ddd-2cd9-4595-a1e1-88f0b3908c95"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.841911 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "916d4ddd-2cd9-4595-a1e1-88f0b3908c95" (UID: "916d4ddd-2cd9-4595-a1e1-88f0b3908c95"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.845322 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "916d4ddd-2cd9-4595-a1e1-88f0b3908c95" (UID: "916d4ddd-2cd9-4595-a1e1-88f0b3908c95"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.845571 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "916d4ddd-2cd9-4595-a1e1-88f0b3908c95" (UID: "916d4ddd-2cd9-4595-a1e1-88f0b3908c95"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.848455 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "916d4ddd-2cd9-4595-a1e1-88f0b3908c95" (UID: "916d4ddd-2cd9-4595-a1e1-88f0b3908c95"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.931122 4954 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.931181 4954 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.931192 4954 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.931203 4954 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.931214 4954 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.931223 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbqgs\" (UniqueName: \"kubernetes.io/projected/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-kube-api-access-gbqgs\") on node \"crc\" DevicePath \"\"" Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.931232 4954 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.931241 4954 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.931250 4954 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.931260 4954 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.931269 4954 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.931279 4954 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 27 16:42:01 crc kubenswrapper[4954]: I1127 16:42:01.931291 4954 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/916d4ddd-2cd9-4595-a1e1-88f0b3908c95-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:42:02 crc kubenswrapper[4954]: I1127 16:42:02.127439 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5sxql" event={"ID":"6a256eb1-104a-4da8-b3e7-90eb5c475460","Type":"ContainerDied","Data":"0d76ba801433c15ab8584bed7d6dc3158cbd0c9f92df7823ad2c22931d85d7b7"} Nov 27 16:42:02 crc kubenswrapper[4954]: I1127 16:42:02.127489 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5sxql" Nov 27 16:42:02 crc kubenswrapper[4954]: I1127 16:42:02.127680 4954 scope.go:117] "RemoveContainer" containerID="72256241ed2f2968a810de1063c5077208e0fe334530f9435777439a84389fee" Nov 27 16:42:02 crc kubenswrapper[4954]: I1127 16:42:02.133346 4954 generic.go:334] "Generic (PLEG): container finished" podID="916d4ddd-2cd9-4595-a1e1-88f0b3908c95" containerID="d820bf3856dbc7fc0be89e1ddbaa00e3acacb889302bb242dc1720cad3f5dc34" exitCode=0 Nov 27 16:42:02 crc kubenswrapper[4954]: I1127 16:42:02.133415 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" event={"ID":"916d4ddd-2cd9-4595-a1e1-88f0b3908c95","Type":"ContainerDied","Data":"d820bf3856dbc7fc0be89e1ddbaa00e3acacb889302bb242dc1720cad3f5dc34"} Nov 27 16:42:02 crc kubenswrapper[4954]: I1127 16:42:02.133476 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" event={"ID":"916d4ddd-2cd9-4595-a1e1-88f0b3908c95","Type":"ContainerDied","Data":"5867e39f98c0847ec931dbc4254e3a456302ff93919cf1a90906d758b3fd757b"} Nov 27 16:42:02 crc kubenswrapper[4954]: I1127 16:42:02.133479 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-6m2df" Nov 27 16:42:02 crc kubenswrapper[4954]: I1127 16:42:02.155037 4954 scope.go:117] "RemoveContainer" containerID="56fccd0c80c6627557145bce83b5af665099b5e6efcf3076fde5b5cd3a14e312" Nov 27 16:42:02 crc kubenswrapper[4954]: I1127 16:42:02.168306 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5sxql"] Nov 27 16:42:02 crc kubenswrapper[4954]: I1127 16:42:02.185703 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5sxql"] Nov 27 16:42:02 crc kubenswrapper[4954]: I1127 16:42:02.189732 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6m2df"] Nov 27 16:42:02 crc kubenswrapper[4954]: I1127 16:42:02.192701 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6m2df"] Nov 27 16:42:02 crc kubenswrapper[4954]: I1127 16:42:02.193461 4954 scope.go:117] "RemoveContainer" containerID="ea38f8bb04ed18687b0862aaa0a31496e11dc6f593feec006fd54e8b03ae85ef" Nov 27 16:42:02 crc kubenswrapper[4954]: I1127 16:42:02.211731 4954 scope.go:117] "RemoveContainer" containerID="d820bf3856dbc7fc0be89e1ddbaa00e3acacb889302bb242dc1720cad3f5dc34" Nov 27 16:42:02 crc kubenswrapper[4954]: I1127 16:42:02.234531 4954 scope.go:117] "RemoveContainer" containerID="d820bf3856dbc7fc0be89e1ddbaa00e3acacb889302bb242dc1720cad3f5dc34" Nov 27 16:42:02 crc kubenswrapper[4954]: E1127 16:42:02.235497 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d820bf3856dbc7fc0be89e1ddbaa00e3acacb889302bb242dc1720cad3f5dc34\": container with ID starting with d820bf3856dbc7fc0be89e1ddbaa00e3acacb889302bb242dc1720cad3f5dc34 not found: ID does not exist" containerID="d820bf3856dbc7fc0be89e1ddbaa00e3acacb889302bb242dc1720cad3f5dc34" Nov 27 16:42:02 crc kubenswrapper[4954]: I1127 16:42:02.235574 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d820bf3856dbc7fc0be89e1ddbaa00e3acacb889302bb242dc1720cad3f5dc34"} err="failed to get container status \"d820bf3856dbc7fc0be89e1ddbaa00e3acacb889302bb242dc1720cad3f5dc34\": rpc error: code = NotFound desc = could not find container \"d820bf3856dbc7fc0be89e1ddbaa00e3acacb889302bb242dc1720cad3f5dc34\": container with ID starting with d820bf3856dbc7fc0be89e1ddbaa00e3acacb889302bb242dc1720cad3f5dc34 not found: ID does not exist" Nov 27 16:42:02 crc kubenswrapper[4954]: I1127 16:42:02.668513 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a256eb1-104a-4da8-b3e7-90eb5c475460" path="/var/lib/kubelet/pods/6a256eb1-104a-4da8-b3e7-90eb5c475460/volumes" Nov 27 16:42:02 crc kubenswrapper[4954]: I1127 16:42:02.669236 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="916d4ddd-2cd9-4595-a1e1-88f0b3908c95" path="/var/lib/kubelet/pods/916d4ddd-2cd9-4595-a1e1-88f0b3908c95/volumes" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.297341 4954 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.298005 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" 
containerID="cri-o://eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690" gracePeriod=15 Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.298054 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a" gracePeriod=15 Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.298067 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583" gracePeriod=15 Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.298161 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac" gracePeriod=15 Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.298919 4954 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.298144 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f" gracePeriod=15 Nov 27 16:42:06 crc kubenswrapper[4954]: E1127 16:42:06.300736 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2e52841-3471-4d68-af5b-4c26ac223800" containerName="registry-server" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.300757 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2e52841-3471-4d68-af5b-4c26ac223800" containerName="registry-server" Nov 27 16:42:06 crc kubenswrapper[4954]: E1127 16:42:06.300766 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24418386-5057-476b-8a29-ad6cf52869f2" containerName="extract-content" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.300772 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="24418386-5057-476b-8a29-ad6cf52869f2" containerName="extract-content" Nov 27 16:42:06 crc kubenswrapper[4954]: E1127 16:42:06.300781 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.300786 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Nov 27 16:42:06 crc kubenswrapper[4954]: E1127 16:42:06.300792 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec488d33-1438-48e5-9ce0-8ad56cf120f0" containerName="pruner" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.300799 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec488d33-1438-48e5-9ce0-8ad56cf120f0" containerName="pruner" Nov 27 16:42:06 crc kubenswrapper[4954]: E1127 16:42:06.300806 4954 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c2e52841-3471-4d68-af5b-4c26ac223800" containerName="extract-content" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.300813 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2e52841-3471-4d68-af5b-4c26ac223800" containerName="extract-content" Nov 27 16:42:06 crc kubenswrapper[4954]: E1127 16:42:06.300822 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83" containerName="registry-server" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.300829 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83" containerName="registry-server" Nov 27 16:42:06 crc kubenswrapper[4954]: E1127 16:42:06.300837 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24418386-5057-476b-8a29-ad6cf52869f2" containerName="registry-server" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.300843 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="24418386-5057-476b-8a29-ad6cf52869f2" containerName="registry-server" Nov 27 16:42:06 crc kubenswrapper[4954]: E1127 16:42:06.300852 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83" containerName="extract-content" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.300857 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83" containerName="extract-content" Nov 27 16:42:06 crc kubenswrapper[4954]: E1127 16:42:06.300865 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a256eb1-104a-4da8-b3e7-90eb5c475460" containerName="registry-server" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.300871 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a256eb1-104a-4da8-b3e7-90eb5c475460" containerName="registry-server" Nov 27 16:42:06 crc kubenswrapper[4954]: E1127 16:42:06.300879 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.300885 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Nov 27 16:42:06 crc kubenswrapper[4954]: E1127 16:42:06.300893 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a256eb1-104a-4da8-b3e7-90eb5c475460" containerName="extract-content" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.300898 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a256eb1-104a-4da8-b3e7-90eb5c475460" containerName="extract-content" Nov 27 16:42:06 crc kubenswrapper[4954]: E1127 16:42:06.300906 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="916d4ddd-2cd9-4595-a1e1-88f0b3908c95" containerName="oauth-openshift" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.300911 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="916d4ddd-2cd9-4595-a1e1-88f0b3908c95" containerName="oauth-openshift" Nov 27 16:42:06 crc kubenswrapper[4954]: E1127 16:42:06.300919 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83" containerName="extract-utilities" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.300925 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83" containerName="extract-utilities" Nov 27 16:42:06 crc kubenswrapper[4954]: E1127 
16:42:06.300933 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2e52841-3471-4d68-af5b-4c26ac223800" containerName="extract-utilities" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.300938 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2e52841-3471-4d68-af5b-4c26ac223800" containerName="extract-utilities" Nov 27 16:42:06 crc kubenswrapper[4954]: E1127 16:42:06.300947 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a256eb1-104a-4da8-b3e7-90eb5c475460" containerName="extract-utilities" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.300954 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a256eb1-104a-4da8-b3e7-90eb5c475460" containerName="extract-utilities" Nov 27 16:42:06 crc kubenswrapper[4954]: E1127 16:42:06.300961 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.300966 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Nov 27 16:42:06 crc kubenswrapper[4954]: E1127 16:42:06.300973 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.300978 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 27 16:42:06 crc kubenswrapper[4954]: E1127 16:42:06.300985 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24418386-5057-476b-8a29-ad6cf52869f2" containerName="extract-utilities" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.300990 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="24418386-5057-476b-8a29-ad6cf52869f2" containerName="extract-utilities" Nov 27 16:42:06 crc kubenswrapper[4954]: E1127 16:42:06.300996 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.301001 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Nov 27 16:42:06 crc kubenswrapper[4954]: E1127 16:42:06.301009 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.301015 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Nov 27 16:42:06 crc kubenswrapper[4954]: E1127 16:42:06.301023 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.301028 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.301113 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.301122 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-syncer" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.301130 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.301136 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="24418386-5057-476b-8a29-ad6cf52869f2" containerName="registry-server" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.301144 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="b361b4c8-d72d-4a5f-bb8d-6dd4414eaf83" containerName="registry-server" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.301152 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec488d33-1438-48e5-9ce0-8ad56cf120f0" containerName="pruner" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.301161 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a256eb1-104a-4da8-b3e7-90eb5c475460" containerName="registry-server" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.301167 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="916d4ddd-2cd9-4595-a1e1-88f0b3908c95" containerName="oauth-openshift" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.301175 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.301184 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.301192 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2e52841-3471-4d68-af5b-4c26ac223800" containerName="registry-server" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.301357 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.302293 4954 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.302769 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.309879 4954 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.406965 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.407029 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.407048 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.407062 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.407082 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.407098 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.407117 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.407150 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.508227 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.508277 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.508319 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.508354 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.508385 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.508402 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.508417 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.508435 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.508493 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.508546 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.508568 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.508607 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.508627 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.508646 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.508664 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 16:42:06 crc kubenswrapper[4954]: I1127 16:42:06.508683 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:42:07 crc kubenswrapper[4954]: I1127 16:42:07.172823 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 27 16:42:07 crc kubenswrapper[4954]: I1127 16:42:07.174249 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 27 16:42:07 crc kubenswrapper[4954]: I1127 16:42:07.175609 4954 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583" exitCode=0 Nov 27 16:42:07 crc kubenswrapper[4954]: I1127 16:42:07.175643 4954 generic.go:334] "Generic (PLEG): container 
finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f" exitCode=0 Nov 27 16:42:07 crc kubenswrapper[4954]: I1127 16:42:07.175655 4954 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a" exitCode=0 Nov 27 16:42:07 crc kubenswrapper[4954]: I1127 16:42:07.175665 4954 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac" exitCode=2 Nov 27 16:42:07 crc kubenswrapper[4954]: I1127 16:42:07.175751 4954 scope.go:117] "RemoveContainer" containerID="1482c61e6954257e6260a0657c7561e067674f24f6febabea67541df86f2221a" Nov 27 16:42:07 crc kubenswrapper[4954]: I1127 16:42:07.178042 4954 generic.go:334] "Generic (PLEG): container finished" podID="b8ad314e-8035-408c-b58d-c41c88fc40fc" containerID="14d36211a64d811641f9456b97b5fe2cc44f53c5e83f915bb5a7d6d3f3b1cc39" exitCode=0 Nov 27 16:42:07 crc kubenswrapper[4954]: I1127 16:42:07.178075 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b8ad314e-8035-408c-b58d-c41c88fc40fc","Type":"ContainerDied","Data":"14d36211a64d811641f9456b97b5fe2cc44f53c5e83f915bb5a7d6d3f3b1cc39"} Nov 27 16:42:07 crc kubenswrapper[4954]: I1127 16:42:07.179494 4954 status_manager.go:851] "Failed to get status for pod" podUID="b8ad314e-8035-408c-b58d-c41c88fc40fc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 16:42:08 crc kubenswrapper[4954]: I1127 16:42:08.190943 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 27 16:42:08 crc kubenswrapper[4954]: I1127 16:42:08.438763 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 27 16:42:08 crc kubenswrapper[4954]: I1127 16:42:08.440688 4954 status_manager.go:851] "Failed to get status for pod" podUID="b8ad314e-8035-408c-b58d-c41c88fc40fc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 16:42:08 crc kubenswrapper[4954]: I1127 16:42:08.634229 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b8ad314e-8035-408c-b58d-c41c88fc40fc-var-lock\") pod \"b8ad314e-8035-408c-b58d-c41c88fc40fc\" (UID: \"b8ad314e-8035-408c-b58d-c41c88fc40fc\") " Nov 27 16:42:08 crc kubenswrapper[4954]: I1127 16:42:08.634276 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b8ad314e-8035-408c-b58d-c41c88fc40fc-kube-api-access\") pod \"b8ad314e-8035-408c-b58d-c41c88fc40fc\" (UID: \"b8ad314e-8035-408c-b58d-c41c88fc40fc\") " Nov 27 16:42:08 crc kubenswrapper[4954]: I1127 16:42:08.634364 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b8ad314e-8035-408c-b58d-c41c88fc40fc-kubelet-dir\") pod \"b8ad314e-8035-408c-b58d-c41c88fc40fc\" (UID: \"b8ad314e-8035-408c-b58d-c41c88fc40fc\") " Nov 27 16:42:08 crc kubenswrapper[4954]: I1127 16:42:08.634418 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ad314e-8035-408c-b58d-c41c88fc40fc-var-lock" (OuterVolumeSpecName: "var-lock") pod "b8ad314e-8035-408c-b58d-c41c88fc40fc" (UID: "b8ad314e-8035-408c-b58d-c41c88fc40fc"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 16:42:08 crc kubenswrapper[4954]: I1127 16:42:08.634476 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ad314e-8035-408c-b58d-c41c88fc40fc-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b8ad314e-8035-408c-b58d-c41c88fc40fc" (UID: "b8ad314e-8035-408c-b58d-c41c88fc40fc"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 16:42:08 crc kubenswrapper[4954]: I1127 16:42:08.634776 4954 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b8ad314e-8035-408c-b58d-c41c88fc40fc-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 27 16:42:08 crc kubenswrapper[4954]: I1127 16:42:08.634807 4954 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b8ad314e-8035-408c-b58d-c41c88fc40fc-var-lock\") on node \"crc\" DevicePath \"\"" Nov 27 16:42:08 crc kubenswrapper[4954]: I1127 16:42:08.639602 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8ad314e-8035-408c-b58d-c41c88fc40fc-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b8ad314e-8035-408c-b58d-c41c88fc40fc" (UID: "b8ad314e-8035-408c-b58d-c41c88fc40fc"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:42:08 crc kubenswrapper[4954]: I1127 16:42:08.658326 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 27 16:42:08 crc kubenswrapper[4954]: I1127 16:42:08.659334 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:42:08 crc kubenswrapper[4954]: I1127 16:42:08.660269 4954 status_manager.go:851] "Failed to get status for pod" podUID="b8ad314e-8035-408c-b58d-c41c88fc40fc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 16:42:08 crc kubenswrapper[4954]: I1127 16:42:08.660657 4954 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 16:42:08 crc kubenswrapper[4954]: I1127 16:42:08.669237 4954 status_manager.go:851] "Failed to get status for pod" podUID="b8ad314e-8035-408c-b58d-c41c88fc40fc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 16:42:08 crc kubenswrapper[4954]: I1127 16:42:08.669643 4954 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 16:42:08 crc kubenswrapper[4954]: I1127 16:42:08.736647 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b8ad314e-8035-408c-b58d-c41c88fc40fc-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 27 16:42:08 crc kubenswrapper[4954]: I1127 16:42:08.837854 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Nov 27 16:42:08 crc kubenswrapper[4954]: I1127 16:42:08.837953 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 16:42:08 crc kubenswrapper[4954]: I1127 16:42:08.837982 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Nov 27 16:42:08 crc kubenswrapper[4954]: I1127 16:42:08.838030 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 16:42:08 crc kubenswrapper[4954]: I1127 16:42:08.838063 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Nov 27 16:42:08 crc kubenswrapper[4954]: I1127 16:42:08.838139 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 16:42:08 crc kubenswrapper[4954]: I1127 16:42:08.838501 4954 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Nov 27 16:42:08 crc kubenswrapper[4954]: I1127 16:42:08.838517 4954 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Nov 27 16:42:08 crc kubenswrapper[4954]: I1127 16:42:08.838529 4954 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Nov 27 16:42:09 crc kubenswrapper[4954]: I1127 16:42:09.202149 4954 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690" exitCode=0 Nov 27 16:42:09 crc kubenswrapper[4954]: I1127 16:42:09.202218 4954 scope.go:117] "RemoveContainer" containerID="2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583" Nov 27 16:42:09 crc kubenswrapper[4954]: I1127 16:42:09.202281 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:42:09 crc kubenswrapper[4954]: I1127 16:42:09.203142 4954 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 16:42:09 crc kubenswrapper[4954]: I1127 16:42:09.203425 4954 status_manager.go:851] "Failed to get status for pod" podUID="b8ad314e-8035-408c-b58d-c41c88fc40fc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 16:42:09 crc kubenswrapper[4954]: I1127 16:42:09.204429 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b8ad314e-8035-408c-b58d-c41c88fc40fc","Type":"ContainerDied","Data":"e8a99a1194cd0f52c4d5c801db887f3fbaf531a6328e3444de6eaf4c44fb1256"} Nov 27 16:42:09 crc kubenswrapper[4954]: I1127 16:42:09.204466 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8a99a1194cd0f52c4d5c801db887f3fbaf531a6328e3444de6eaf4c44fb1256" Nov 27 16:42:09 crc kubenswrapper[4954]: I1127 16:42:09.204511 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 27 16:42:09 crc kubenswrapper[4954]: I1127 16:42:09.208508 4954 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 16:42:09 crc kubenswrapper[4954]: I1127 16:42:09.208892 4954 status_manager.go:851] "Failed to get status for pod" podUID="b8ad314e-8035-408c-b58d-c41c88fc40fc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 16:42:09 crc kubenswrapper[4954]: I1127 16:42:09.217244 4954 scope.go:117] "RemoveContainer" containerID="9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f" Nov 27 16:42:09 crc kubenswrapper[4954]: I1127 16:42:09.219165 4954 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 16:42:09 crc kubenswrapper[4954]: I1127 16:42:09.219791 4954 status_manager.go:851] "Failed to get status for pod" podUID="b8ad314e-8035-408c-b58d-c41c88fc40fc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 16:42:09 crc kubenswrapper[4954]: I1127 16:42:09.232806 4954 scope.go:117] "RemoveContainer" containerID="222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a" Nov 27 16:42:09 crc kubenswrapper[4954]: I1127 16:42:09.245068 4954 scope.go:117] 
"RemoveContainer" containerID="6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac" Nov 27 16:42:09 crc kubenswrapper[4954]: I1127 16:42:09.261451 4954 scope.go:117] "RemoveContainer" containerID="eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690" Nov 27 16:42:09 crc kubenswrapper[4954]: I1127 16:42:09.276983 4954 scope.go:117] "RemoveContainer" containerID="6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71" Nov 27 16:42:09 crc kubenswrapper[4954]: I1127 16:42:09.297276 4954 scope.go:117] "RemoveContainer" containerID="2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583" Nov 27 16:42:09 crc kubenswrapper[4954]: E1127 16:42:09.297829 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\": container with ID starting with 2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583 not found: ID does not exist" containerID="2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583" Nov 27 16:42:09 crc kubenswrapper[4954]: I1127 16:42:09.297869 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583"} err="failed to get container status \"2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\": rpc error: code = NotFound desc = could not find container \"2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583\": container with ID starting with 2e0c27ee0e836a978a842c443087947df93c8e8c84d764f59409411870d2c583 not found: ID does not exist" Nov 27 16:42:09 crc kubenswrapper[4954]: I1127 16:42:09.297895 4954 scope.go:117] "RemoveContainer" containerID="9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f" Nov 27 16:42:09 crc kubenswrapper[4954]: E1127 16:42:09.298279 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\": container with ID starting with 9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f not found: ID does not exist" containerID="9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f" Nov 27 16:42:09 crc kubenswrapper[4954]: I1127 16:42:09.298322 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f"} err="failed to get container status \"9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\": rpc error: code = NotFound desc = could not find container \"9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f\": container with ID starting with 9f3c9604c3580ccb5d962109cbf144a778f0c5b637013a1f75c91c38727cc57f not found: ID does not exist" Nov 27 16:42:09 crc kubenswrapper[4954]: I1127 16:42:09.298349 4954 scope.go:117] "RemoveContainer" containerID="222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a" Nov 27 16:42:09 crc kubenswrapper[4954]: E1127 16:42:09.298748 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\": container with ID starting with 222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a not found: ID does not exist" 
containerID="222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a" Nov 27 16:42:09 crc kubenswrapper[4954]: I1127 16:42:09.298807 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a"} err="failed to get container status \"222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\": rpc error: code = NotFound desc = could not find container \"222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a\": container with ID starting with 222c3ae14a16649a2e1789618f2f426f51a569cc6782f1093c33cb03f1f90f5a not found: ID does not exist" Nov 27 16:42:09 crc kubenswrapper[4954]: I1127 16:42:09.298838 4954 scope.go:117] "RemoveContainer" containerID="6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac" Nov 27 16:42:09 crc kubenswrapper[4954]: E1127 16:42:09.299626 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\": container with ID starting with 6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac not found: ID does not exist" containerID="6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac" Nov 27 16:42:09 crc kubenswrapper[4954]: I1127 16:42:09.299697 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac"} err="failed to get container status \"6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\": rpc error: code = NotFound desc = could not find container \"6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac\": container with ID starting with 6817cb24a774f87b55270427a184cd7f3e98b3458bb104ff1a083c0d679d28ac not found: ID does not exist" Nov 27 16:42:09 crc kubenswrapper[4954]: I1127 16:42:09.299713 4954 scope.go:117] "RemoveContainer" containerID="eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690" Nov 27 16:42:09 crc kubenswrapper[4954]: E1127 16:42:09.300165 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\": container with ID starting with eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690 not found: ID does not exist" containerID="eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690" Nov 27 16:42:09 crc kubenswrapper[4954]: I1127 16:42:09.300205 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690"} err="failed to get container status \"eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\": rpc error: code = NotFound desc = could not find container \"eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690\": container with ID starting with eb5a619ef8dc16aae6d919d3c755e6a47ccbda5f1ed5e734d8dc3da62da66690 not found: ID does not exist" Nov 27 16:42:09 crc kubenswrapper[4954]: I1127 16:42:09.300231 4954 scope.go:117] "RemoveContainer" containerID="6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71" Nov 27 16:42:09 crc kubenswrapper[4954]: E1127 16:42:09.300527 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\": container with ID starting with 6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71 not found: ID does not exist" containerID="6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71" Nov 27 16:42:09 crc kubenswrapper[4954]: I1127 16:42:09.300553 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71"} err="failed to get container status \"6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\": rpc error: code = NotFound desc = could not find container \"6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71\": container with ID starting with 6329d345ea41df050a79a9f5e0319d14926dcd6be0d5aed5a42b862111a00b71 not found: ID does not exist" Nov 27 16:42:09 crc kubenswrapper[4954]: E1127 16:42:09.445022 4954 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 16:42:09 crc kubenswrapper[4954]: E1127 16:42:09.445470 4954 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 16:42:09 crc kubenswrapper[4954]: E1127 16:42:09.445855 4954 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 16:42:09 crc kubenswrapper[4954]: E1127 16:42:09.446607 4954 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 16:42:09 crc kubenswrapper[4954]: E1127 16:42:09.447073 4954 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 16:42:09 crc kubenswrapper[4954]: I1127 16:42:09.447113 4954 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Nov 27 16:42:09 crc kubenswrapper[4954]: E1127 16:42:09.449108 4954 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="200ms" Nov 27 16:42:09 crc kubenswrapper[4954]: E1127 16:42:09.649657 4954 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="400ms" Nov 27 16:42:10 crc kubenswrapper[4954]: E1127 16:42:10.050951 4954 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="800ms" Nov 27 16:42:10 crc 
kubenswrapper[4954]: I1127 16:42:10.667232 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Nov 27 16:42:10 crc kubenswrapper[4954]: E1127 16:42:10.852680 4954 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="1.6s" Nov 27 16:42:11 crc kubenswrapper[4954]: E1127 16:42:11.346332 4954 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.204:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 16:42:11 crc kubenswrapper[4954]: I1127 16:42:11.347126 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 16:42:11 crc kubenswrapper[4954]: E1127 16:42:11.373479 4954 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.204:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187beab27d4d2aa5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-27 16:42:11.372968613 +0000 UTC m=+243.390408913,LastTimestamp:2025-11-27 16:42:11.372968613 +0000 UTC m=+243.390408913,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 27 16:42:12 crc kubenswrapper[4954]: I1127 16:42:12.225126 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"626d31197c9491db97f312abc22403c3ae8c7ca1e0f640057b648681d0a23b6a"} Nov 27 16:42:12 crc kubenswrapper[4954]: I1127 16:42:12.225441 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"fc010685c4f2bdb39495cb50d1c14cae24a512cdf2ffaa5891387cb284a538dd"} Nov 27 16:42:12 crc kubenswrapper[4954]: E1127 16:42:12.226023 4954 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.204:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 16:42:12 crc kubenswrapper[4954]: I1127 16:42:12.226098 4954 status_manager.go:851] "Failed to get status for pod" podUID="b8ad314e-8035-408c-b58d-c41c88fc40fc" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 16:42:12 crc kubenswrapper[4954]: E1127 16:42:12.453773 4954 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="3.2s" Nov 27 16:42:14 crc kubenswrapper[4954]: E1127 16:42:14.965817 4954 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.204:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187beab27d4d2aa5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-27 16:42:11.372968613 +0000 UTC m=+243.390408913,LastTimestamp:2025-11-27 16:42:11.372968613 +0000 UTC m=+243.390408913,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 27 16:42:15 crc kubenswrapper[4954]: E1127 16:42:15.654981 4954 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="6.4s" Nov 27 16:42:18 crc kubenswrapper[4954]: I1127 16:42:18.667797 4954 status_manager.go:851] "Failed to get status for pod" podUID="b8ad314e-8035-408c-b58d-c41c88fc40fc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 16:42:20 crc kubenswrapper[4954]: I1127 16:42:20.224203 4954 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Nov 27 16:42:20 crc kubenswrapper[4954]: I1127 16:42:20.224367 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Nov 27 16:42:20 crc kubenswrapper[4954]: I1127 16:42:20.289764 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Nov 27 16:42:20 crc kubenswrapper[4954]: I1127 16:42:20.289902 4954 generic.go:334] "Generic (PLEG): 
container finished" podID="f614b9022728cf315e60c057852e563e" containerID="6dbb0d73cb9bddb6148625592ed1aac95ead1e2349f92fb8aba36ec714ed618e" exitCode=1 Nov 27 16:42:20 crc kubenswrapper[4954]: I1127 16:42:20.289977 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"6dbb0d73cb9bddb6148625592ed1aac95ead1e2349f92fb8aba36ec714ed618e"} Nov 27 16:42:20 crc kubenswrapper[4954]: I1127 16:42:20.290835 4954 scope.go:117] "RemoveContainer" containerID="6dbb0d73cb9bddb6148625592ed1aac95ead1e2349f92fb8aba36ec714ed618e" Nov 27 16:42:20 crc kubenswrapper[4954]: I1127 16:42:20.291717 4954 status_manager.go:851] "Failed to get status for pod" podUID="b8ad314e-8035-408c-b58d-c41c88fc40fc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 16:42:20 crc kubenswrapper[4954]: I1127 16:42:20.292415 4954 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 16:42:21 crc kubenswrapper[4954]: I1127 16:42:21.301150 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Nov 27 16:42:21 crc kubenswrapper[4954]: I1127 16:42:21.301833 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"41b583b4e519c316397643e7a5bde3da27ebe17860c1272b686b91e9113ec7e4"} Nov 27 16:42:21 crc kubenswrapper[4954]: I1127 16:42:21.303236 4954 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 16:42:21 crc kubenswrapper[4954]: I1127 16:42:21.303754 4954 status_manager.go:851] "Failed to get status for pod" podUID="b8ad314e-8035-408c-b58d-c41c88fc40fc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 16:42:21 crc kubenswrapper[4954]: I1127 16:42:21.661982 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:42:21 crc kubenswrapper[4954]: I1127 16:42:21.662975 4954 status_manager.go:851] "Failed to get status for pod" podUID="b8ad314e-8035-408c-b58d-c41c88fc40fc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 16:42:21 crc kubenswrapper[4954]: I1127 16:42:21.663907 4954 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 16:42:21 crc kubenswrapper[4954]: I1127 16:42:21.678997 4954 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a2b7cd63-bb9a-4c77-b67a-e72adc26393a" Nov 27 16:42:21 crc kubenswrapper[4954]: I1127 16:42:21.679047 4954 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a2b7cd63-bb9a-4c77-b67a-e72adc26393a" Nov 27 16:42:21 crc kubenswrapper[4954]: E1127 16:42:21.679676 4954 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:42:21 crc kubenswrapper[4954]: I1127 16:42:21.680220 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:42:22 crc kubenswrapper[4954]: E1127 16:42:22.056797 4954 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="7s" Nov 27 16:42:22 crc kubenswrapper[4954]: I1127 16:42:22.313101 4954 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="9a027e72c34c86e18273c400453c1d2abd0542411189d22f25bf3180aa6a70db" exitCode=0 Nov 27 16:42:22 crc kubenswrapper[4954]: I1127 16:42:22.313142 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"9a027e72c34c86e18273c400453c1d2abd0542411189d22f25bf3180aa6a70db"} Nov 27 16:42:22 crc kubenswrapper[4954]: I1127 16:42:22.313167 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"fd0d19e52892d6f1221495ca7a4c66b5f15fbc08009201e092a60d88472595df"} Nov 27 16:42:22 crc kubenswrapper[4954]: I1127 16:42:22.313388 4954 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a2b7cd63-bb9a-4c77-b67a-e72adc26393a" Nov 27 16:42:22 crc kubenswrapper[4954]: I1127 16:42:22.313398 4954 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a2b7cd63-bb9a-4c77-b67a-e72adc26393a" Nov 27 16:42:22 crc kubenswrapper[4954]: I1127 16:42:22.313850 4954 status_manager.go:851] "Failed to 
get status for pod" podUID="b8ad314e-8035-408c-b58d-c41c88fc40fc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 16:42:22 crc kubenswrapper[4954]: I1127 16:42:22.314041 4954 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 16:42:22 crc kubenswrapper[4954]: E1127 16:42:22.314249 4954 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:42:23 crc kubenswrapper[4954]: I1127 16:42:23.175226 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 16:42:23 crc kubenswrapper[4954]: I1127 16:42:23.176152 4954 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Nov 27 16:42:23 crc kubenswrapper[4954]: I1127 16:42:23.176195 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Nov 27 16:42:23 crc kubenswrapper[4954]: I1127 16:42:23.338884 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b8742bd3abac21a0d30bf2b4df9daef29b4df798b294ac9270cb0ad3110c5268"} Nov 27 16:42:23 crc kubenswrapper[4954]: I1127 16:42:23.339186 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e4f11d571ce736a9174882bfe5470f30761cd25fe258c07d58a097e67a9a52d0"} Nov 27 16:42:23 crc kubenswrapper[4954]: I1127 16:42:23.339200 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"570d2d6e9b532e6a19b621f3b86d3994f57fc4383247caebf279f52f19b54084"} Nov 27 16:42:23 crc kubenswrapper[4954]: I1127 16:42:23.339209 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a687622aa4a4538945de0744edae1b061ab84e93fcec1cd037ccd7468f68ec90"} Nov 27 16:42:24 crc kubenswrapper[4954]: I1127 16:42:24.345059 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"bdd97e12c71c8594f366dac5e7445427730f33735ba8cd6e7348bee2c92b5717"} Nov 27 16:42:24 crc kubenswrapper[4954]: I1127 16:42:24.345286 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:42:24 crc kubenswrapper[4954]: I1127 16:42:24.345379 4954 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a2b7cd63-bb9a-4c77-b67a-e72adc26393a" Nov 27 16:42:24 crc kubenswrapper[4954]: I1127 16:42:24.345400 4954 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a2b7cd63-bb9a-4c77-b67a-e72adc26393a" Nov 27 16:42:26 crc kubenswrapper[4954]: I1127 16:42:26.681122 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:42:26 crc kubenswrapper[4954]: I1127 16:42:26.681167 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:42:26 crc kubenswrapper[4954]: I1127 16:42:26.690516 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:42:29 crc kubenswrapper[4954]: I1127 16:42:29.356762 4954 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:42:29 crc kubenswrapper[4954]: I1127 16:42:29.403064 4954 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="344c83d1-31d5-42a9-8bb0-b449b0b865e4" Nov 27 16:42:30 crc kubenswrapper[4954]: I1127 16:42:30.223094 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 16:42:30 crc kubenswrapper[4954]: I1127 16:42:30.375846 4954 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a2b7cd63-bb9a-4c77-b67a-e72adc26393a" Nov 27 16:42:30 crc kubenswrapper[4954]: I1127 16:42:30.375911 4954 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a2b7cd63-bb9a-4c77-b67a-e72adc26393a" Nov 27 16:42:30 crc kubenswrapper[4954]: I1127 16:42:30.379145 4954 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="344c83d1-31d5-42a9-8bb0-b449b0b865e4" Nov 27 16:42:33 crc kubenswrapper[4954]: I1127 16:42:33.175482 4954 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Nov 27 16:42:33 crc kubenswrapper[4954]: I1127 16:42:33.175838 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Nov 27 16:42:39 crc kubenswrapper[4954]: I1127 16:42:39.202302 4954 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 27 16:42:39 crc kubenswrapper[4954]: I1127 16:42:39.914018 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Nov 27 16:42:39 crc kubenswrapper[4954]: I1127 16:42:39.960386 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Nov 27 16:42:39 crc kubenswrapper[4954]: I1127 16:42:39.980421 4954 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Nov 27 16:42:39 crc kubenswrapper[4954]: I1127 16:42:39.985179 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 27 16:42:39 crc kubenswrapper[4954]: I1127 16:42:39.985244 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-68b6dd9b65-xznzc"] Nov 27 16:42:39 crc kubenswrapper[4954]: E1127 16:42:39.985457 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ad314e-8035-408c-b58d-c41c88fc40fc" containerName="installer" Nov 27 16:42:39 crc kubenswrapper[4954]: I1127 16:42:39.985471 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ad314e-8035-408c-b58d-c41c88fc40fc" containerName="installer" Nov 27 16:42:39 crc kubenswrapper[4954]: I1127 16:42:39.985611 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ad314e-8035-408c-b58d-c41c88fc40fc" containerName="installer" Nov 27 16:42:39 crc kubenswrapper[4954]: I1127 16:42:39.985873 4954 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a2b7cd63-bb9a-4c77-b67a-e72adc26393a" Nov 27 16:42:39 crc kubenswrapper[4954]: I1127 16:42:39.985923 4954 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a2b7cd63-bb9a-4c77-b67a-e72adc26393a" Nov 27 16:42:39 crc kubenswrapper[4954]: I1127 16:42:39.986035 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" Nov 27 16:42:39 crc kubenswrapper[4954]: I1127 16:42:39.988532 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Nov 27 16:42:39 crc kubenswrapper[4954]: I1127 16:42:39.988784 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Nov 27 16:42:39 crc kubenswrapper[4954]: I1127 16:42:39.989821 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Nov 27 16:42:39 crc kubenswrapper[4954]: I1127 16:42:39.990000 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Nov 27 16:42:39 crc kubenswrapper[4954]: I1127 16:42:39.992642 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Nov 27 16:42:39 crc kubenswrapper[4954]: I1127 16:42:39.993198 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Nov 27 16:42:39 crc kubenswrapper[4954]: I1127 16:42:39.993382 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Nov 27 16:42:39 crc kubenswrapper[4954]: I1127 16:42:39.993662 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:42:39 crc kubenswrapper[4954]: I1127 16:42:39.994404 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Nov 27 16:42:39 crc kubenswrapper[4954]: I1127 16:42:39.994671 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Nov 27 16:42:39 crc kubenswrapper[4954]: I1127 16:42:39.995400 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Nov 27 16:42:39 crc kubenswrapper[4954]: I1127 16:42:39.995429 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Nov 27 16:42:39 crc kubenswrapper[4954]: I1127 16:42:39.998031 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.002423 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.008042 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.010297 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.035790 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=11.035766784 podStartE2EDuration="11.035766784s" podCreationTimestamp="2025-11-27 16:42:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-11-27 16:42:40.031025675 +0000 UTC m=+272.048466005" watchObservedRunningTime="2025-11-27 16:42:40.035766784 +0000 UTC m=+272.053207114" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.161983 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d24xt\" (UniqueName: \"kubernetes.io/projected/6112a2bd-b96f-4eec-96d4-b18561e93902-kube-api-access-d24xt\") pod \"oauth-openshift-68b6dd9b65-xznzc\" (UID: \"6112a2bd-b96f-4eec-96d4-b18561e93902\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.162063 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6112a2bd-b96f-4eec-96d4-b18561e93902-v4-0-config-system-service-ca\") pod \"oauth-openshift-68b6dd9b65-xznzc\" (UID: \"6112a2bd-b96f-4eec-96d4-b18561e93902\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.162103 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6112a2bd-b96f-4eec-96d4-b18561e93902-v4-0-config-system-cliconfig\") pod \"oauth-openshift-68b6dd9b65-xznzc\" (UID: \"6112a2bd-b96f-4eec-96d4-b18561e93902\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.162149 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6112a2bd-b96f-4eec-96d4-b18561e93902-audit-policies\") pod \"oauth-openshift-68b6dd9b65-xznzc\" (UID: \"6112a2bd-b96f-4eec-96d4-b18561e93902\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.162181 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6112a2bd-b96f-4eec-96d4-b18561e93902-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-68b6dd9b65-xznzc\" (UID: \"6112a2bd-b96f-4eec-96d4-b18561e93902\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.162237 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6112a2bd-b96f-4eec-96d4-b18561e93902-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-68b6dd9b65-xznzc\" (UID: \"6112a2bd-b96f-4eec-96d4-b18561e93902\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.162276 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6112a2bd-b96f-4eec-96d4-b18561e93902-v4-0-config-system-router-certs\") pod \"oauth-openshift-68b6dd9b65-xznzc\" (UID: \"6112a2bd-b96f-4eec-96d4-b18561e93902\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.162306 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6112a2bd-b96f-4eec-96d4-b18561e93902-v4-0-config-user-template-error\") pod \"oauth-openshift-68b6dd9b65-xznzc\" (UID: \"6112a2bd-b96f-4eec-96d4-b18561e93902\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.162346 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6112a2bd-b96f-4eec-96d4-b18561e93902-v4-0-config-system-session\") pod \"oauth-openshift-68b6dd9b65-xznzc\" (UID: \"6112a2bd-b96f-4eec-96d4-b18561e93902\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.162401 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6112a2bd-b96f-4eec-96d4-b18561e93902-v4-0-config-system-serving-cert\") pod \"oauth-openshift-68b6dd9b65-xznzc\" (UID: \"6112a2bd-b96f-4eec-96d4-b18561e93902\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.162456 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6112a2bd-b96f-4eec-96d4-b18561e93902-audit-dir\") pod \"oauth-openshift-68b6dd9b65-xznzc\" (UID: \"6112a2bd-b96f-4eec-96d4-b18561e93902\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.162489 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6112a2bd-b96f-4eec-96d4-b18561e93902-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-68b6dd9b65-xznzc\" (UID: \"6112a2bd-b96f-4eec-96d4-b18561e93902\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.162522 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6112a2bd-b96f-4eec-96d4-b18561e93902-v4-0-config-user-template-login\") pod \"oauth-openshift-68b6dd9b65-xznzc\" (UID: \"6112a2bd-b96f-4eec-96d4-b18561e93902\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.162556 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6112a2bd-b96f-4eec-96d4-b18561e93902-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-68b6dd9b65-xznzc\" (UID: \"6112a2bd-b96f-4eec-96d4-b18561e93902\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.264171 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d24xt\" (UniqueName: \"kubernetes.io/projected/6112a2bd-b96f-4eec-96d4-b18561e93902-kube-api-access-d24xt\") pod \"oauth-openshift-68b6dd9b65-xznzc\" (UID: \"6112a2bd-b96f-4eec-96d4-b18561e93902\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.264248 4954 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6112a2bd-b96f-4eec-96d4-b18561e93902-v4-0-config-system-service-ca\") pod \"oauth-openshift-68b6dd9b65-xznzc\" (UID: \"6112a2bd-b96f-4eec-96d4-b18561e93902\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.264284 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6112a2bd-b96f-4eec-96d4-b18561e93902-v4-0-config-system-cliconfig\") pod \"oauth-openshift-68b6dd9b65-xznzc\" (UID: \"6112a2bd-b96f-4eec-96d4-b18561e93902\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.264330 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6112a2bd-b96f-4eec-96d4-b18561e93902-audit-policies\") pod \"oauth-openshift-68b6dd9b65-xznzc\" (UID: \"6112a2bd-b96f-4eec-96d4-b18561e93902\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.264363 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6112a2bd-b96f-4eec-96d4-b18561e93902-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-68b6dd9b65-xznzc\" (UID: \"6112a2bd-b96f-4eec-96d4-b18561e93902\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.264417 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6112a2bd-b96f-4eec-96d4-b18561e93902-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-68b6dd9b65-xznzc\" (UID: \"6112a2bd-b96f-4eec-96d4-b18561e93902\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.264458 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6112a2bd-b96f-4eec-96d4-b18561e93902-v4-0-config-system-router-certs\") pod \"oauth-openshift-68b6dd9b65-xznzc\" (UID: \"6112a2bd-b96f-4eec-96d4-b18561e93902\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.264491 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6112a2bd-b96f-4eec-96d4-b18561e93902-v4-0-config-user-template-error\") pod \"oauth-openshift-68b6dd9b65-xznzc\" (UID: \"6112a2bd-b96f-4eec-96d4-b18561e93902\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.264600 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6112a2bd-b96f-4eec-96d4-b18561e93902-v4-0-config-system-session\") pod \"oauth-openshift-68b6dd9b65-xznzc\" (UID: \"6112a2bd-b96f-4eec-96d4-b18561e93902\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.264659 4954 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6112a2bd-b96f-4eec-96d4-b18561e93902-v4-0-config-system-serving-cert\") pod \"oauth-openshift-68b6dd9b65-xznzc\" (UID: \"6112a2bd-b96f-4eec-96d4-b18561e93902\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.264704 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6112a2bd-b96f-4eec-96d4-b18561e93902-audit-dir\") pod \"oauth-openshift-68b6dd9b65-xznzc\" (UID: \"6112a2bd-b96f-4eec-96d4-b18561e93902\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.264734 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6112a2bd-b96f-4eec-96d4-b18561e93902-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-68b6dd9b65-xznzc\" (UID: \"6112a2bd-b96f-4eec-96d4-b18561e93902\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.264770 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6112a2bd-b96f-4eec-96d4-b18561e93902-v4-0-config-user-template-login\") pod \"oauth-openshift-68b6dd9b65-xznzc\" (UID: \"6112a2bd-b96f-4eec-96d4-b18561e93902\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.264807 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6112a2bd-b96f-4eec-96d4-b18561e93902-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-68b6dd9b65-xznzc\" (UID: \"6112a2bd-b96f-4eec-96d4-b18561e93902\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.267942 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6112a2bd-b96f-4eec-96d4-b18561e93902-audit-dir\") pod \"oauth-openshift-68b6dd9b65-xznzc\" (UID: \"6112a2bd-b96f-4eec-96d4-b18561e93902\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.268274 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6112a2bd-b96f-4eec-96d4-b18561e93902-v4-0-config-system-cliconfig\") pod \"oauth-openshift-68b6dd9b65-xznzc\" (UID: \"6112a2bd-b96f-4eec-96d4-b18561e93902\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.268403 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6112a2bd-b96f-4eec-96d4-b18561e93902-audit-policies\") pod \"oauth-openshift-68b6dd9b65-xznzc\" (UID: \"6112a2bd-b96f-4eec-96d4-b18561e93902\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.270222 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6112a2bd-b96f-4eec-96d4-b18561e93902-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-68b6dd9b65-xznzc\" (UID: \"6112a2bd-b96f-4eec-96d4-b18561e93902\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.270310 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6112a2bd-b96f-4eec-96d4-b18561e93902-v4-0-config-system-service-ca\") pod \"oauth-openshift-68b6dd9b65-xznzc\" (UID: \"6112a2bd-b96f-4eec-96d4-b18561e93902\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.273416 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6112a2bd-b96f-4eec-96d4-b18561e93902-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-68b6dd9b65-xznzc\" (UID: \"6112a2bd-b96f-4eec-96d4-b18561e93902\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.273993 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6112a2bd-b96f-4eec-96d4-b18561e93902-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-68b6dd9b65-xznzc\" (UID: \"6112a2bd-b96f-4eec-96d4-b18561e93902\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.276559 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6112a2bd-b96f-4eec-96d4-b18561e93902-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-68b6dd9b65-xznzc\" (UID: \"6112a2bd-b96f-4eec-96d4-b18561e93902\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.276573 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6112a2bd-b96f-4eec-96d4-b18561e93902-v4-0-config-user-template-login\") pod \"oauth-openshift-68b6dd9b65-xznzc\" (UID: \"6112a2bd-b96f-4eec-96d4-b18561e93902\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.277228 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6112a2bd-b96f-4eec-96d4-b18561e93902-v4-0-config-user-template-error\") pod \"oauth-openshift-68b6dd9b65-xznzc\" (UID: \"6112a2bd-b96f-4eec-96d4-b18561e93902\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.277294 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6112a2bd-b96f-4eec-96d4-b18561e93902-v4-0-config-system-session\") pod \"oauth-openshift-68b6dd9b65-xznzc\" (UID: \"6112a2bd-b96f-4eec-96d4-b18561e93902\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.279949 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6112a2bd-b96f-4eec-96d4-b18561e93902-v4-0-config-system-serving-cert\") pod \"oauth-openshift-68b6dd9b65-xznzc\" (UID: \"6112a2bd-b96f-4eec-96d4-b18561e93902\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.280409 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6112a2bd-b96f-4eec-96d4-b18561e93902-v4-0-config-system-router-certs\") pod \"oauth-openshift-68b6dd9b65-xznzc\" (UID: \"6112a2bd-b96f-4eec-96d4-b18561e93902\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.295182 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d24xt\" (UniqueName: \"kubernetes.io/projected/6112a2bd-b96f-4eec-96d4-b18561e93902-kube-api-access-d24xt\") pod \"oauth-openshift-68b6dd9b65-xznzc\" (UID: \"6112a2bd-b96f-4eec-96d4-b18561e93902\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.307202 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.336247 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.337085 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.455559 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.669763 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.855191 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Nov 27 16:42:40 crc kubenswrapper[4954]: I1127 16:42:40.855216 4954 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Nov 27 16:42:41 crc kubenswrapper[4954]: I1127 16:42:41.026696 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Nov 27 16:42:41 crc kubenswrapper[4954]: I1127 16:42:41.144497 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Nov 27 16:42:41 crc kubenswrapper[4954]: I1127 16:42:41.315422 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Nov 27 16:42:41 crc kubenswrapper[4954]: I1127 16:42:41.438221 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Nov 27 16:42:41 crc kubenswrapper[4954]: I1127 16:42:41.574070 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 27 16:42:41 crc kubenswrapper[4954]: I1127 16:42:41.704759 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Nov 27 16:42:41 crc 
kubenswrapper[4954]: I1127 16:42:41.816923 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Nov 27 16:42:42 crc kubenswrapper[4954]: I1127 16:42:42.102750 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Nov 27 16:42:42 crc kubenswrapper[4954]: I1127 16:42:42.130823 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Nov 27 16:42:42 crc kubenswrapper[4954]: I1127 16:42:42.176951 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Nov 27 16:42:42 crc kubenswrapper[4954]: I1127 16:42:42.186476 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Nov 27 16:42:42 crc kubenswrapper[4954]: I1127 16:42:42.206763 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Nov 27 16:42:42 crc kubenswrapper[4954]: I1127 16:42:42.304631 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Nov 27 16:42:42 crc kubenswrapper[4954]: I1127 16:42:42.344419 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Nov 27 16:42:42 crc kubenswrapper[4954]: I1127 16:42:42.384801 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Nov 27 16:42:42 crc kubenswrapper[4954]: I1127 16:42:42.693499 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Nov 27 16:42:42 crc kubenswrapper[4954]: I1127 16:42:42.707939 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Nov 27 16:42:42 crc kubenswrapper[4954]: I1127 16:42:42.789160 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Nov 27 16:42:42 crc kubenswrapper[4954]: I1127 16:42:42.968266 4954 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Nov 27 16:42:42 crc kubenswrapper[4954]: I1127 16:42:42.977544 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Nov 27 16:42:43 crc kubenswrapper[4954]: I1127 16:42:43.070936 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Nov 27 16:42:43 crc kubenswrapper[4954]: I1127 16:42:43.086243 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Nov 27 16:42:43 crc kubenswrapper[4954]: I1127 16:42:43.116758 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Nov 27 16:42:43 crc kubenswrapper[4954]: I1127 16:42:43.175496 4954 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: 
connection refused" start-of-body= Nov 27 16:42:43 crc kubenswrapper[4954]: I1127 16:42:43.175788 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Nov 27 16:42:43 crc kubenswrapper[4954]: I1127 16:42:43.175853 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 16:42:43 crc kubenswrapper[4954]: I1127 16:42:43.176621 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"41b583b4e519c316397643e7a5bde3da27ebe17860c1272b686b91e9113ec7e4"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Nov 27 16:42:43 crc kubenswrapper[4954]: I1127 16:42:43.176763 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://41b583b4e519c316397643e7a5bde3da27ebe17860c1272b686b91e9113ec7e4" gracePeriod=30 Nov 27 16:42:43 crc kubenswrapper[4954]: I1127 16:42:43.298787 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 27 16:42:43 crc kubenswrapper[4954]: I1127 16:42:43.342308 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Nov 27 16:42:43 crc kubenswrapper[4954]: I1127 16:42:43.599559 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Nov 27 16:42:43 crc kubenswrapper[4954]: I1127 16:42:43.667704 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 27 16:42:43 crc kubenswrapper[4954]: I1127 16:42:43.770022 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Nov 27 16:42:43 crc kubenswrapper[4954]: I1127 16:42:43.771497 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Nov 27 16:42:43 crc kubenswrapper[4954]: I1127 16:42:43.779962 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Nov 27 16:42:43 crc kubenswrapper[4954]: I1127 16:42:43.784733 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Nov 27 16:42:43 crc kubenswrapper[4954]: I1127 16:42:43.798125 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Nov 27 16:42:43 crc kubenswrapper[4954]: I1127 16:42:43.808955 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Nov 27 16:42:43 crc kubenswrapper[4954]: I1127 16:42:43.855101 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Nov 27 
16:42:43 crc kubenswrapper[4954]: I1127 16:42:43.870869 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Nov 27 16:42:43 crc kubenswrapper[4954]: I1127 16:42:43.875884 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Nov 27 16:42:43 crc kubenswrapper[4954]: I1127 16:42:43.986773 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Nov 27 16:42:44 crc kubenswrapper[4954]: I1127 16:42:44.010524 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Nov 27 16:42:44 crc kubenswrapper[4954]: I1127 16:42:44.207613 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 27 16:42:44 crc kubenswrapper[4954]: I1127 16:42:44.224269 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Nov 27 16:42:44 crc kubenswrapper[4954]: I1127 16:42:44.368204 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Nov 27 16:42:44 crc kubenswrapper[4954]: I1127 16:42:44.489902 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Nov 27 16:42:44 crc kubenswrapper[4954]: I1127 16:42:44.537913 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Nov 27 16:42:44 crc kubenswrapper[4954]: I1127 16:42:44.617295 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Nov 27 16:42:45 crc kubenswrapper[4954]: I1127 16:42:45.117760 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Nov 27 16:42:45 crc kubenswrapper[4954]: I1127 16:42:45.166627 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Nov 27 16:42:45 crc kubenswrapper[4954]: I1127 16:42:45.300040 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Nov 27 16:42:45 crc kubenswrapper[4954]: I1127 16:42:45.350934 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Nov 27 16:42:45 crc kubenswrapper[4954]: I1127 16:42:45.407390 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Nov 27 16:42:45 crc kubenswrapper[4954]: I1127 16:42:45.411821 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Nov 27 16:42:45 crc kubenswrapper[4954]: I1127 16:42:45.501696 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Nov 27 16:42:45 crc kubenswrapper[4954]: I1127 16:42:45.531775 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Nov 27 16:42:45 crc kubenswrapper[4954]: I1127 
16:42:45.582504 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Nov 27 16:42:45 crc kubenswrapper[4954]: I1127 16:42:45.644831 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Nov 27 16:42:45 crc kubenswrapper[4954]: I1127 16:42:45.682111 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Nov 27 16:42:45 crc kubenswrapper[4954]: I1127 16:42:45.781097 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Nov 27 16:42:45 crc kubenswrapper[4954]: I1127 16:42:45.876298 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Nov 27 16:42:45 crc kubenswrapper[4954]: I1127 16:42:45.887473 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Nov 27 16:42:45 crc kubenswrapper[4954]: I1127 16:42:45.901076 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 27 16:42:45 crc kubenswrapper[4954]: I1127 16:42:45.960816 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Nov 27 16:42:46 crc kubenswrapper[4954]: I1127 16:42:46.040149 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Nov 27 16:42:46 crc kubenswrapper[4954]: I1127 16:42:46.067310 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Nov 27 16:42:46 crc kubenswrapper[4954]: I1127 16:42:46.072879 4954 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Nov 27 16:42:46 crc kubenswrapper[4954]: I1127 16:42:46.108047 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Nov 27 16:42:46 crc kubenswrapper[4954]: I1127 16:42:46.111404 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Nov 27 16:42:46 crc kubenswrapper[4954]: I1127 16:42:46.196924 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 27 16:42:46 crc kubenswrapper[4954]: I1127 16:42:46.201213 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Nov 27 16:42:46 crc kubenswrapper[4954]: I1127 16:42:46.211566 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Nov 27 16:42:46 crc kubenswrapper[4954]: I1127 16:42:46.289119 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Nov 27 16:42:46 crc kubenswrapper[4954]: I1127 16:42:46.325356 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Nov 27 16:42:46 crc kubenswrapper[4954]: I1127 16:42:46.329738 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Nov 27 16:42:46 crc 
kubenswrapper[4954]: I1127 16:42:46.364168 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Nov 27 16:42:46 crc kubenswrapper[4954]: I1127 16:42:46.478537 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Nov 27 16:42:46 crc kubenswrapper[4954]: I1127 16:42:46.627415 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Nov 27 16:42:46 crc kubenswrapper[4954]: I1127 16:42:46.755659 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 27 16:42:46 crc kubenswrapper[4954]: I1127 16:42:46.854965 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Nov 27 16:42:46 crc kubenswrapper[4954]: I1127 16:42:46.895749 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Nov 27 16:42:46 crc kubenswrapper[4954]: I1127 16:42:46.971681 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Nov 27 16:42:47 crc kubenswrapper[4954]: I1127 16:42:47.006468 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 27 16:42:47 crc kubenswrapper[4954]: I1127 16:42:47.115214 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Nov 27 16:42:47 crc kubenswrapper[4954]: I1127 16:42:47.147106 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Nov 27 16:42:47 crc kubenswrapper[4954]: I1127 16:42:47.168630 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Nov 27 16:42:47 crc kubenswrapper[4954]: I1127 16:42:47.245426 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Nov 27 16:42:47 crc kubenswrapper[4954]: I1127 16:42:47.262846 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Nov 27 16:42:47 crc kubenswrapper[4954]: I1127 16:42:47.270788 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Nov 27 16:42:47 crc kubenswrapper[4954]: I1127 16:42:47.272233 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Nov 27 16:42:47 crc kubenswrapper[4954]: I1127 16:42:47.294501 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Nov 27 16:42:47 crc kubenswrapper[4954]: I1127 16:42:47.397858 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Nov 27 16:42:47 crc kubenswrapper[4954]: I1127 16:42:47.417244 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 27 16:42:47 crc kubenswrapper[4954]: I1127 16:42:47.419322 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Nov 27 16:42:47 crc kubenswrapper[4954]: 
I1127 16:42:47.419872 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Nov 27 16:42:47 crc kubenswrapper[4954]: I1127 16:42:47.462358 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Nov 27 16:42:47 crc kubenswrapper[4954]: I1127 16:42:47.501677 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Nov 27 16:42:47 crc kubenswrapper[4954]: I1127 16:42:47.666633 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Nov 27 16:42:47 crc kubenswrapper[4954]: I1127 16:42:47.675035 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 27 16:42:47 crc kubenswrapper[4954]: I1127 16:42:47.753618 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Nov 27 16:42:47 crc kubenswrapper[4954]: I1127 16:42:47.928693 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Nov 27 16:42:48 crc kubenswrapper[4954]: I1127 16:42:48.271825 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Nov 27 16:42:48 crc kubenswrapper[4954]: I1127 16:42:48.293461 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Nov 27 16:42:48 crc kubenswrapper[4954]: I1127 16:42:48.332873 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Nov 27 16:42:48 crc kubenswrapper[4954]: I1127 16:42:48.485022 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Nov 27 16:42:48 crc kubenswrapper[4954]: I1127 16:42:48.577243 4954 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Nov 27 16:42:48 crc kubenswrapper[4954]: I1127 16:42:48.646276 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Nov 27 16:42:48 crc kubenswrapper[4954]: I1127 16:42:48.683774 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Nov 27 16:42:48 crc kubenswrapper[4954]: I1127 16:42:48.721031 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Nov 27 16:42:48 crc kubenswrapper[4954]: I1127 16:42:48.759619 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Nov 27 16:42:48 crc kubenswrapper[4954]: I1127 16:42:48.760512 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Nov 27 16:42:48 crc kubenswrapper[4954]: I1127 16:42:48.864359 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Nov 27 16:42:48 crc kubenswrapper[4954]: I1127 16:42:48.879243 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Nov 27 16:42:48 crc kubenswrapper[4954]: I1127 16:42:48.885600 4954 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Nov 27 16:42:48 crc kubenswrapper[4954]: I1127 16:42:48.903536 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Nov 27 16:42:48 crc kubenswrapper[4954]: I1127 16:42:48.917603 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Nov 27 16:42:48 crc kubenswrapper[4954]: I1127 16:42:48.949741 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Nov 27 16:42:48 crc kubenswrapper[4954]: I1127 16:42:48.949778 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Nov 27 16:42:48 crc kubenswrapper[4954]: I1127 16:42:48.972213 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Nov 27 16:42:49 crc kubenswrapper[4954]: I1127 16:42:49.074481 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Nov 27 16:42:49 crc kubenswrapper[4954]: I1127 16:42:49.090598 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Nov 27 16:42:49 crc kubenswrapper[4954]: I1127 16:42:49.184182 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Nov 27 16:42:49 crc kubenswrapper[4954]: I1127 16:42:49.226651 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Nov 27 16:42:49 crc kubenswrapper[4954]: I1127 16:42:49.249217 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Nov 27 16:42:49 crc kubenswrapper[4954]: I1127 16:42:49.328370 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Nov 27 16:42:49 crc kubenswrapper[4954]: I1127 16:42:49.329525 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Nov 27 16:42:49 crc kubenswrapper[4954]: I1127 16:42:49.337815 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Nov 27 16:42:49 crc kubenswrapper[4954]: I1127 16:42:49.396785 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Nov 27 16:42:49 crc kubenswrapper[4954]: I1127 16:42:49.410122 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Nov 27 16:42:49 crc kubenswrapper[4954]: I1127 16:42:49.415624 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Nov 27 16:42:49 crc kubenswrapper[4954]: I1127 16:42:49.427093 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Nov 27 16:42:49 crc kubenswrapper[4954]: I1127 16:42:49.462175 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Nov 27 16:42:49 crc kubenswrapper[4954]: I1127 16:42:49.503668 4954 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Nov 27 16:42:49 crc kubenswrapper[4954]: I1127 16:42:49.535224 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Nov 27 16:42:49 crc kubenswrapper[4954]: I1127 16:42:49.543775 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Nov 27 16:42:49 crc kubenswrapper[4954]: I1127 16:42:49.559008 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Nov 27 16:42:49 crc kubenswrapper[4954]: I1127 16:42:49.669729 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Nov 27 16:42:49 crc kubenswrapper[4954]: I1127 16:42:49.702779 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Nov 27 16:42:49 crc kubenswrapper[4954]: I1127 16:42:49.754623 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Nov 27 16:42:49 crc kubenswrapper[4954]: I1127 16:42:49.868623 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Nov 27 16:42:50 crc kubenswrapper[4954]: I1127 16:42:50.182903 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Nov 27 16:42:50 crc kubenswrapper[4954]: I1127 16:42:50.184853 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Nov 27 16:42:50 crc kubenswrapper[4954]: I1127 16:42:50.280449 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Nov 27 16:42:50 crc kubenswrapper[4954]: I1127 16:42:50.325111 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Nov 27 16:42:50 crc kubenswrapper[4954]: I1127 16:42:50.337603 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Nov 27 16:42:50 crc kubenswrapper[4954]: I1127 16:42:50.374432 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Nov 27 16:42:50 crc kubenswrapper[4954]: I1127 16:42:50.385892 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Nov 27 16:42:50 crc kubenswrapper[4954]: I1127 16:42:50.402742 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Nov 27 16:42:50 crc kubenswrapper[4954]: I1127 16:42:50.460478 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Nov 27 16:42:50 crc kubenswrapper[4954]: I1127 16:42:50.565767 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Nov 27 16:42:50 crc kubenswrapper[4954]: I1127 16:42:50.585968 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Nov 27 16:42:50 crc kubenswrapper[4954]: I1127 16:42:50.667029 4954 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 27 16:42:50 crc kubenswrapper[4954]: I1127 16:42:50.752777 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Nov 27 16:42:50 crc kubenswrapper[4954]: I1127 16:42:50.770271 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Nov 27 16:42:50 crc kubenswrapper[4954]: I1127 16:42:50.772111 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Nov 27 16:42:50 crc kubenswrapper[4954]: I1127 16:42:50.808362 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Nov 27 16:42:51 crc kubenswrapper[4954]: I1127 16:42:51.028148 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Nov 27 16:42:51 crc kubenswrapper[4954]: I1127 16:42:51.034738 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Nov 27 16:42:51 crc kubenswrapper[4954]: I1127 16:42:51.076506 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Nov 27 16:42:51 crc kubenswrapper[4954]: I1127 16:42:51.096384 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Nov 27 16:42:51 crc kubenswrapper[4954]: I1127 16:42:51.119242 4954 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Nov 27 16:42:51 crc kubenswrapper[4954]: I1127 16:42:51.145709 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Nov 27 16:42:51 crc kubenswrapper[4954]: I1127 16:42:51.284540 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Nov 27 16:42:51 crc kubenswrapper[4954]: I1127 16:42:51.334219 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Nov 27 16:42:51 crc kubenswrapper[4954]: I1127 16:42:51.357671 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Nov 27 16:42:51 crc kubenswrapper[4954]: I1127 16:42:51.390075 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Nov 27 16:42:51 crc kubenswrapper[4954]: I1127 16:42:51.617259 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Nov 27 16:42:51 crc kubenswrapper[4954]: I1127 16:42:51.617737 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Nov 27 16:42:51 crc kubenswrapper[4954]: I1127 16:42:51.637691 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Nov 27 16:42:51 crc kubenswrapper[4954]: I1127 16:42:51.638603 4954 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 27 16:42:51 crc kubenswrapper[4954]: I1127 16:42:51.638888 4954 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://626d31197c9491db97f312abc22403c3ae8c7ca1e0f640057b648681d0a23b6a" gracePeriod=5 Nov 27 16:42:51 crc kubenswrapper[4954]: I1127 16:42:51.694040 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Nov 27 16:42:51 crc kubenswrapper[4954]: I1127 16:42:51.771801 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Nov 27 16:42:51 crc kubenswrapper[4954]: I1127 16:42:51.776554 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Nov 27 16:42:51 crc kubenswrapper[4954]: I1127 16:42:51.793986 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Nov 27 16:42:51 crc kubenswrapper[4954]: I1127 16:42:51.811734 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Nov 27 16:42:51 crc kubenswrapper[4954]: I1127 16:42:51.868274 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Nov 27 16:42:51 crc kubenswrapper[4954]: I1127 16:42:51.905242 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Nov 27 16:42:51 crc kubenswrapper[4954]: I1127 16:42:51.936135 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Nov 27 16:42:51 crc kubenswrapper[4954]: I1127 16:42:51.938917 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 27 16:42:51 crc kubenswrapper[4954]: I1127 16:42:51.979216 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Nov 27 16:42:52 crc kubenswrapper[4954]: I1127 16:42:52.096182 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 27 16:42:52 crc kubenswrapper[4954]: I1127 16:42:52.164738 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Nov 27 16:42:52 crc kubenswrapper[4954]: I1127 16:42:52.232383 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Nov 27 16:42:52 crc kubenswrapper[4954]: I1127 16:42:52.260897 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Nov 27 16:42:52 crc kubenswrapper[4954]: I1127 16:42:52.313993 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Nov 27 16:42:52 crc kubenswrapper[4954]: I1127 16:42:52.315168 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Nov 27 16:42:52 crc kubenswrapper[4954]: I1127 16:42:52.315781 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Nov 27 16:42:52 crc kubenswrapper[4954]: I1127 16:42:52.335336 4954 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"marketplace-operator-metrics" Nov 27 16:42:52 crc kubenswrapper[4954]: I1127 16:42:52.348688 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Nov 27 16:42:52 crc kubenswrapper[4954]: I1127 16:42:52.378230 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Nov 27 16:42:52 crc kubenswrapper[4954]: I1127 16:42:52.386871 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Nov 27 16:42:52 crc kubenswrapper[4954]: I1127 16:42:52.569190 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Nov 27 16:42:52 crc kubenswrapper[4954]: I1127 16:42:52.629052 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Nov 27 16:42:52 crc kubenswrapper[4954]: I1127 16:42:52.640133 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Nov 27 16:42:52 crc kubenswrapper[4954]: I1127 16:42:52.724558 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Nov 27 16:42:52 crc kubenswrapper[4954]: I1127 16:42:52.766819 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Nov 27 16:42:52 crc kubenswrapper[4954]: I1127 16:42:52.768337 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Nov 27 16:42:52 crc kubenswrapper[4954]: I1127 16:42:52.769479 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Nov 27 16:42:52 crc kubenswrapper[4954]: I1127 16:42:52.833014 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Nov 27 16:42:52 crc kubenswrapper[4954]: I1127 16:42:52.913208 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Nov 27 16:42:52 crc kubenswrapper[4954]: I1127 16:42:52.976836 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Nov 27 16:42:52 crc kubenswrapper[4954]: I1127 16:42:52.989191 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Nov 27 16:42:53 crc kubenswrapper[4954]: I1127 16:42:53.000958 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Nov 27 16:42:53 crc kubenswrapper[4954]: I1127 16:42:53.048344 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Nov 27 16:42:53 crc kubenswrapper[4954]: I1127 16:42:53.053775 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Nov 27 16:42:53 crc kubenswrapper[4954]: I1127 16:42:53.074916 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Nov 27 16:42:53 crc kubenswrapper[4954]: I1127 16:42:53.112694 4954 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns"/"dns-default-metrics-tls" Nov 27 16:42:53 crc kubenswrapper[4954]: I1127 16:42:53.237863 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Nov 27 16:42:53 crc kubenswrapper[4954]: I1127 16:42:53.252604 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Nov 27 16:42:53 crc kubenswrapper[4954]: I1127 16:42:53.353756 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Nov 27 16:42:53 crc kubenswrapper[4954]: I1127 16:42:53.733648 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Nov 27 16:42:53 crc kubenswrapper[4954]: I1127 16:42:53.783326 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Nov 27 16:42:53 crc kubenswrapper[4954]: I1127 16:42:53.968946 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Nov 27 16:42:54 crc kubenswrapper[4954]: I1127 16:42:54.075783 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Nov 27 16:42:54 crc kubenswrapper[4954]: I1127 16:42:54.249247 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 27 16:42:54 crc kubenswrapper[4954]: I1127 16:42:54.436686 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Nov 27 16:42:54 crc kubenswrapper[4954]: I1127 16:42:54.519494 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Nov 27 16:42:54 crc kubenswrapper[4954]: I1127 16:42:54.582440 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Nov 27 16:42:54 crc kubenswrapper[4954]: I1127 16:42:54.764632 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Nov 27 16:42:54 crc kubenswrapper[4954]: I1127 16:42:54.926351 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Nov 27 16:42:54 crc kubenswrapper[4954]: I1127 16:42:54.966187 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Nov 27 16:42:55 crc kubenswrapper[4954]: I1127 16:42:55.073363 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Nov 27 16:42:55 crc kubenswrapper[4954]: I1127 16:42:55.489041 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Nov 27 16:42:57 crc kubenswrapper[4954]: I1127 16:42:57.226256 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Nov 27 16:42:57 crc kubenswrapper[4954]: I1127 16:42:57.226355 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 16:42:57 crc kubenswrapper[4954]: I1127 16:42:57.389874 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 27 16:42:57 crc kubenswrapper[4954]: I1127 16:42:57.389978 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 27 16:42:57 crc kubenswrapper[4954]: I1127 16:42:57.390025 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 27 16:42:57 crc kubenswrapper[4954]: I1127 16:42:57.390092 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 27 16:42:57 crc kubenswrapper[4954]: I1127 16:42:57.390172 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 16:42:57 crc kubenswrapper[4954]: I1127 16:42:57.390231 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 27 16:42:57 crc kubenswrapper[4954]: I1127 16:42:57.390275 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 16:42:57 crc kubenswrapper[4954]: I1127 16:42:57.390335 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 16:42:57 crc kubenswrapper[4954]: I1127 16:42:57.390535 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 16:42:57 crc kubenswrapper[4954]: I1127 16:42:57.390858 4954 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Nov 27 16:42:57 crc kubenswrapper[4954]: I1127 16:42:57.390904 4954 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Nov 27 16:42:57 crc kubenswrapper[4954]: I1127 16:42:57.390923 4954 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Nov 27 16:42:57 crc kubenswrapper[4954]: I1127 16:42:57.390943 4954 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Nov 27 16:42:57 crc kubenswrapper[4954]: I1127 16:42:57.398735 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 16:42:57 crc kubenswrapper[4954]: I1127 16:42:57.492363 4954 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Nov 27 16:42:57 crc kubenswrapper[4954]: I1127 16:42:57.552256 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Nov 27 16:42:57 crc kubenswrapper[4954]: I1127 16:42:57.552351 4954 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="626d31197c9491db97f312abc22403c3ae8c7ca1e0f640057b648681d0a23b6a" exitCode=137 Nov 27 16:42:57 crc kubenswrapper[4954]: I1127 16:42:57.552428 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 16:42:57 crc kubenswrapper[4954]: I1127 16:42:57.552488 4954 scope.go:117] "RemoveContainer" containerID="626d31197c9491db97f312abc22403c3ae8c7ca1e0f640057b648681d0a23b6a" Nov 27 16:42:57 crc kubenswrapper[4954]: I1127 16:42:57.577740 4954 scope.go:117] "RemoveContainer" containerID="626d31197c9491db97f312abc22403c3ae8c7ca1e0f640057b648681d0a23b6a" Nov 27 16:42:57 crc kubenswrapper[4954]: E1127 16:42:57.578311 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"626d31197c9491db97f312abc22403c3ae8c7ca1e0f640057b648681d0a23b6a\": container with ID starting with 626d31197c9491db97f312abc22403c3ae8c7ca1e0f640057b648681d0a23b6a not found: ID does not exist" containerID="626d31197c9491db97f312abc22403c3ae8c7ca1e0f640057b648681d0a23b6a" Nov 27 16:42:57 crc kubenswrapper[4954]: I1127 16:42:57.578349 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"626d31197c9491db97f312abc22403c3ae8c7ca1e0f640057b648681d0a23b6a"} err="failed to get container status \"626d31197c9491db97f312abc22403c3ae8c7ca1e0f640057b648681d0a23b6a\": rpc error: code = NotFound desc = could not find container \"626d31197c9491db97f312abc22403c3ae8c7ca1e0f640057b648681d0a23b6a\": container with ID starting with 626d31197c9491db97f312abc22403c3ae8c7ca1e0f640057b648681d0a23b6a not found: ID does not exist" Nov 27 16:42:58 crc kubenswrapper[4954]: I1127 16:42:58.670688 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Nov 27 16:43:06 crc kubenswrapper[4954]: I1127 16:43:06.611236 4954 generic.go:334] "Generic (PLEG): container finished" podID="a08ef380-6670-415c-9861-71c9161f1a4c" containerID="759a4732493d7e795dce581798cff3449b618dae5b18e27e0bf25d64cdccbadb" exitCode=0 Nov 27 16:43:06 crc kubenswrapper[4954]: I1127 16:43:06.611334 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qmz7n" event={"ID":"a08ef380-6670-415c-9861-71c9161f1a4c","Type":"ContainerDied","Data":"759a4732493d7e795dce581798cff3449b618dae5b18e27e0bf25d64cdccbadb"} Nov 27 16:43:06 crc kubenswrapper[4954]: I1127 16:43:06.612324 4954 scope.go:117] "RemoveContainer" containerID="759a4732493d7e795dce581798cff3449b618dae5b18e27e0bf25d64cdccbadb" Nov 27 16:43:07 crc kubenswrapper[4954]: I1127 16:43:07.627459 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qmz7n" event={"ID":"a08ef380-6670-415c-9861-71c9161f1a4c","Type":"ContainerStarted","Data":"95b540ec2024011ded83698ecf256909a35c153124eab98ef1a77229aa45f322"} Nov 27 16:43:07 crc kubenswrapper[4954]: I1127 16:43:07.628319 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-qmz7n" Nov 27 16:43:07 crc kubenswrapper[4954]: I1127 16:43:07.630850 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-qmz7n" Nov 27 16:43:08 crc kubenswrapper[4954]: I1127 16:43:08.435903 4954 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Nov 27 16:43:08 crc kubenswrapper[4954]: I1127 16:43:08.625093 4954 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 27 16:43:11 crc kubenswrapper[4954]: I1127 16:43:11.340097 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Nov 27 16:43:12 crc kubenswrapper[4954]: I1127 16:43:12.375274 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Nov 27 16:43:12 crc kubenswrapper[4954]: I1127 16:43:12.563734 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-68b6dd9b65-xznzc"] Nov 27 16:43:13 crc kubenswrapper[4954]: I1127 16:43:13.048686 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-68b6dd9b65-xznzc"] Nov 27 16:43:13 crc kubenswrapper[4954]: I1127 16:43:13.714099 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" event={"ID":"6112a2bd-b96f-4eec-96d4-b18561e93902","Type":"ContainerStarted","Data":"807eda08e8ab477e7e0dc660457d2669dc130773c3cc4889d6f4a0f41a9beb2d"} Nov 27 16:43:13 crc kubenswrapper[4954]: I1127 16:43:13.714773 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" Nov 27 16:43:13 crc kubenswrapper[4954]: I1127 16:43:13.715106 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" event={"ID":"6112a2bd-b96f-4eec-96d4-b18561e93902","Type":"ContainerStarted","Data":"cc5ffac5b95cc450e36d5b536865e0e6613beb85f27d56656cf91315e478d577"} Nov 27 16:43:13 crc kubenswrapper[4954]: I1127 16:43:13.720636 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Nov 27 16:43:13 crc kubenswrapper[4954]: I1127 16:43:13.722845 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Nov 27 16:43:13 crc kubenswrapper[4954]: I1127 16:43:13.722934 4954 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="41b583b4e519c316397643e7a5bde3da27ebe17860c1272b686b91e9113ec7e4" exitCode=137 Nov 27 16:43:13 crc kubenswrapper[4954]: I1127 16:43:13.723001 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"41b583b4e519c316397643e7a5bde3da27ebe17860c1272b686b91e9113ec7e4"} Nov 27 16:43:13 crc kubenswrapper[4954]: I1127 16:43:13.723044 4954 scope.go:117] "RemoveContainer" containerID="6dbb0d73cb9bddb6148625592ed1aac95ead1e2349f92fb8aba36ec714ed618e" Nov 27 16:43:13 crc kubenswrapper[4954]: I1127 16:43:13.756409 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" podStartSLOduration=97.756389201 podStartE2EDuration="1m37.756389201s" podCreationTimestamp="2025-11-27 16:41:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:43:13.755339976 +0000 UTC m=+305.772780286" watchObservedRunningTime="2025-11-27 16:43:13.756389201 +0000 UTC m=+305.773829501" Nov 27 16:43:13 crc 
kubenswrapper[4954]: I1127 16:43:13.895646 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-68b6dd9b65-xznzc" Nov 27 16:43:14 crc kubenswrapper[4954]: I1127 16:43:14.731224 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Nov 27 16:43:14 crc kubenswrapper[4954]: I1127 16:43:14.733795 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c0e8bcc07692311d2425056308c0e6a9702a924da06c197836509eaa5c164502"} Nov 27 16:43:14 crc kubenswrapper[4954]: I1127 16:43:14.771213 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Nov 27 16:43:14 crc kubenswrapper[4954]: I1127 16:43:14.990833 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Nov 27 16:43:17 crc kubenswrapper[4954]: I1127 16:43:17.665881 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Nov 27 16:43:19 crc kubenswrapper[4954]: I1127 16:43:19.203551 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Nov 27 16:43:19 crc kubenswrapper[4954]: I1127 16:43:19.219177 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Nov 27 16:43:19 crc kubenswrapper[4954]: I1127 16:43:19.227279 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Nov 27 16:43:20 crc kubenswrapper[4954]: I1127 16:43:20.223526 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 16:43:23 crc kubenswrapper[4954]: I1127 16:43:23.175623 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 16:43:23 crc kubenswrapper[4954]: I1127 16:43:23.181882 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 16:43:27 crc kubenswrapper[4954]: I1127 16:43:27.130274 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Nov 27 16:43:30 crc kubenswrapper[4954]: I1127 16:43:30.227959 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 16:43:32 crc kubenswrapper[4954]: I1127 16:43:32.425547 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8wlxw"] Nov 27 16:43:32 crc kubenswrapper[4954]: I1127 16:43:32.425968 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-f6f2h"] Nov 27 16:43:32 crc kubenswrapper[4954]: I1127 16:43:32.426149 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f6f2h" podUID="6e521fb1-0565-4f66-a6f0-1b78942e408e" 
containerName="route-controller-manager" containerID="cri-o://f467d62914eade0f151113915f0669ca492deef458ab407c5bef188eaf9a166c" gracePeriod=30 Nov 27 16:43:32 crc kubenswrapper[4954]: I1127 16:43:32.426333 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-8wlxw" podUID="24f553f1-7b7b-4d3e-addf-2b5d1039f176" containerName="controller-manager" containerID="cri-o://ce7951a9306b662396c84e314cad126080d4ed8fb027a5c3883f10c25c66cea7" gracePeriod=30 Nov 27 16:43:32 crc kubenswrapper[4954]: I1127 16:43:32.873143 4954 generic.go:334] "Generic (PLEG): container finished" podID="6e521fb1-0565-4f66-a6f0-1b78942e408e" containerID="f467d62914eade0f151113915f0669ca492deef458ab407c5bef188eaf9a166c" exitCode=0 Nov 27 16:43:32 crc kubenswrapper[4954]: I1127 16:43:32.873189 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f6f2h" event={"ID":"6e521fb1-0565-4f66-a6f0-1b78942e408e","Type":"ContainerDied","Data":"f467d62914eade0f151113915f0669ca492deef458ab407c5bef188eaf9a166c"} Nov 27 16:43:32 crc kubenswrapper[4954]: I1127 16:43:32.873543 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f6f2h" event={"ID":"6e521fb1-0565-4f66-a6f0-1b78942e408e","Type":"ContainerDied","Data":"2fc534f8705309f2620166ee7fc3142efa6f4e04a29d482671b009aec4225021"} Nov 27 16:43:32 crc kubenswrapper[4954]: I1127 16:43:32.873559 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2fc534f8705309f2620166ee7fc3142efa6f4e04a29d482671b009aec4225021" Nov 27 16:43:32 crc kubenswrapper[4954]: I1127 16:43:32.876036 4954 generic.go:334] "Generic (PLEG): container finished" podID="24f553f1-7b7b-4d3e-addf-2b5d1039f176" containerID="ce7951a9306b662396c84e314cad126080d4ed8fb027a5c3883f10c25c66cea7" exitCode=0 Nov 27 16:43:32 crc kubenswrapper[4954]: I1127 16:43:32.876076 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-8wlxw" event={"ID":"24f553f1-7b7b-4d3e-addf-2b5d1039f176","Type":"ContainerDied","Data":"ce7951a9306b662396c84e314cad126080d4ed8fb027a5c3883f10c25c66cea7"} Nov 27 16:43:32 crc kubenswrapper[4954]: I1127 16:43:32.876101 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-8wlxw" event={"ID":"24f553f1-7b7b-4d3e-addf-2b5d1039f176","Type":"ContainerDied","Data":"10568d07ed2af5fafeeec0a95d7590fcc65eebb5b22bb8f0a64a67abe1e7fb30"} Nov 27 16:43:32 crc kubenswrapper[4954]: I1127 16:43:32.876111 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10568d07ed2af5fafeeec0a95d7590fcc65eebb5b22bb8f0a64a67abe1e7fb30" Nov 27 16:43:32 crc kubenswrapper[4954]: I1127 16:43:32.878939 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f6f2h" Nov 27 16:43:32 crc kubenswrapper[4954]: I1127 16:43:32.882841 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-8wlxw" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.017028 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6thft\" (UniqueName: \"kubernetes.io/projected/6e521fb1-0565-4f66-a6f0-1b78942e408e-kube-api-access-6thft\") pod \"6e521fb1-0565-4f66-a6f0-1b78942e408e\" (UID: \"6e521fb1-0565-4f66-a6f0-1b78942e408e\") " Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.017097 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6e521fb1-0565-4f66-a6f0-1b78942e408e-client-ca\") pod \"6e521fb1-0565-4f66-a6f0-1b78942e408e\" (UID: \"6e521fb1-0565-4f66-a6f0-1b78942e408e\") " Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.017125 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24f553f1-7b7b-4d3e-addf-2b5d1039f176-config\") pod \"24f553f1-7b7b-4d3e-addf-2b5d1039f176\" (UID: \"24f553f1-7b7b-4d3e-addf-2b5d1039f176\") " Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.017255 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/24f553f1-7b7b-4d3e-addf-2b5d1039f176-proxy-ca-bundles\") pod \"24f553f1-7b7b-4d3e-addf-2b5d1039f176\" (UID: \"24f553f1-7b7b-4d3e-addf-2b5d1039f176\") " Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.017276 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68w5b\" (UniqueName: \"kubernetes.io/projected/24f553f1-7b7b-4d3e-addf-2b5d1039f176-kube-api-access-68w5b\") pod \"24f553f1-7b7b-4d3e-addf-2b5d1039f176\" (UID: \"24f553f1-7b7b-4d3e-addf-2b5d1039f176\") " Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.017325 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24f553f1-7b7b-4d3e-addf-2b5d1039f176-client-ca\") pod \"24f553f1-7b7b-4d3e-addf-2b5d1039f176\" (UID: \"24f553f1-7b7b-4d3e-addf-2b5d1039f176\") " Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.017352 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e521fb1-0565-4f66-a6f0-1b78942e408e-serving-cert\") pod \"6e521fb1-0565-4f66-a6f0-1b78942e408e\" (UID: \"6e521fb1-0565-4f66-a6f0-1b78942e408e\") " Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.017384 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24f553f1-7b7b-4d3e-addf-2b5d1039f176-serving-cert\") pod \"24f553f1-7b7b-4d3e-addf-2b5d1039f176\" (UID: \"24f553f1-7b7b-4d3e-addf-2b5d1039f176\") " Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.017411 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e521fb1-0565-4f66-a6f0-1b78942e408e-config\") pod \"6e521fb1-0565-4f66-a6f0-1b78942e408e\" (UID: \"6e521fb1-0565-4f66-a6f0-1b78942e408e\") " Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.018239 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e521fb1-0565-4f66-a6f0-1b78942e408e-client-ca" (OuterVolumeSpecName: "client-ca") pod "6e521fb1-0565-4f66-a6f0-1b78942e408e" (UID: 
"6e521fb1-0565-4f66-a6f0-1b78942e408e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.018347 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e521fb1-0565-4f66-a6f0-1b78942e408e-config" (OuterVolumeSpecName: "config") pod "6e521fb1-0565-4f66-a6f0-1b78942e408e" (UID: "6e521fb1-0565-4f66-a6f0-1b78942e408e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.019131 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24f553f1-7b7b-4d3e-addf-2b5d1039f176-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "24f553f1-7b7b-4d3e-addf-2b5d1039f176" (UID: "24f553f1-7b7b-4d3e-addf-2b5d1039f176"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.019156 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24f553f1-7b7b-4d3e-addf-2b5d1039f176-config" (OuterVolumeSpecName: "config") pod "24f553f1-7b7b-4d3e-addf-2b5d1039f176" (UID: "24f553f1-7b7b-4d3e-addf-2b5d1039f176"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.019302 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24f553f1-7b7b-4d3e-addf-2b5d1039f176-client-ca" (OuterVolumeSpecName: "client-ca") pod "24f553f1-7b7b-4d3e-addf-2b5d1039f176" (UID: "24f553f1-7b7b-4d3e-addf-2b5d1039f176"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.025154 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24f553f1-7b7b-4d3e-addf-2b5d1039f176-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "24f553f1-7b7b-4d3e-addf-2b5d1039f176" (UID: "24f553f1-7b7b-4d3e-addf-2b5d1039f176"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.025315 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e521fb1-0565-4f66-a6f0-1b78942e408e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6e521fb1-0565-4f66-a6f0-1b78942e408e" (UID: "6e521fb1-0565-4f66-a6f0-1b78942e408e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.025391 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e521fb1-0565-4f66-a6f0-1b78942e408e-kube-api-access-6thft" (OuterVolumeSpecName: "kube-api-access-6thft") pod "6e521fb1-0565-4f66-a6f0-1b78942e408e" (UID: "6e521fb1-0565-4f66-a6f0-1b78942e408e"). InnerVolumeSpecName "kube-api-access-6thft". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.025591 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24f553f1-7b7b-4d3e-addf-2b5d1039f176-kube-api-access-68w5b" (OuterVolumeSpecName: "kube-api-access-68w5b") pod "24f553f1-7b7b-4d3e-addf-2b5d1039f176" (UID: "24f553f1-7b7b-4d3e-addf-2b5d1039f176"). 
InnerVolumeSpecName "kube-api-access-68w5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.118712 4954 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6e521fb1-0565-4f66-a6f0-1b78942e408e-client-ca\") on node \"crc\" DevicePath \"\"" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.118748 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24f553f1-7b7b-4d3e-addf-2b5d1039f176-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.118762 4954 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/24f553f1-7b7b-4d3e-addf-2b5d1039f176-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.118776 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68w5b\" (UniqueName: \"kubernetes.io/projected/24f553f1-7b7b-4d3e-addf-2b5d1039f176-kube-api-access-68w5b\") on node \"crc\" DevicePath \"\"" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.118788 4954 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24f553f1-7b7b-4d3e-addf-2b5d1039f176-client-ca\") on node \"crc\" DevicePath \"\"" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.118801 4954 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e521fb1-0565-4f66-a6f0-1b78942e408e-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.118814 4954 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24f553f1-7b7b-4d3e-addf-2b5d1039f176-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.118826 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e521fb1-0565-4f66-a6f0-1b78942e408e-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.118840 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6thft\" (UniqueName: \"kubernetes.io/projected/6e521fb1-0565-4f66-a6f0-1b78942e408e-kube-api-access-6thft\") on node \"crc\" DevicePath \"\"" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.491708 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7f6b5bdb65-8bbfd"] Nov 27 16:43:33 crc kubenswrapper[4954]: E1127 16:43:33.492299 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e521fb1-0565-4f66-a6f0-1b78942e408e" containerName="route-controller-manager" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.492328 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e521fb1-0565-4f66-a6f0-1b78942e408e" containerName="route-controller-manager" Nov 27 16:43:33 crc kubenswrapper[4954]: E1127 16:43:33.492360 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24f553f1-7b7b-4d3e-addf-2b5d1039f176" containerName="controller-manager" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.492375 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="24f553f1-7b7b-4d3e-addf-2b5d1039f176" containerName="controller-manager" Nov 27 16:43:33 crc kubenswrapper[4954]: E1127 16:43:33.492401 4954 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.492417 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.492634 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="24f553f1-7b7b-4d3e-addf-2b5d1039f176" containerName="controller-manager" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.492656 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.492677 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e521fb1-0565-4f66-a6f0-1b78942e408e" containerName="route-controller-manager" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.493503 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7f6b5bdb65-8bbfd" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.500296 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5fdf45db69-fr7nb"] Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.501896 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5fdf45db69-fr7nb" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.513313 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7f6b5bdb65-8bbfd"] Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.521875 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5fdf45db69-fr7nb"] Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.625553 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16e08fcc-f990-4f48-9f5f-3b0925cbf5c7-client-ca\") pod \"controller-manager-7f6b5bdb65-8bbfd\" (UID: \"16e08fcc-f990-4f48-9f5f-3b0925cbf5c7\") " pod="openshift-controller-manager/controller-manager-7f6b5bdb65-8bbfd" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.625695 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6faccf57-f78d-41c3-b176-691802bdc64b-client-ca\") pod \"route-controller-manager-5fdf45db69-fr7nb\" (UID: \"6faccf57-f78d-41c3-b176-691802bdc64b\") " pod="openshift-route-controller-manager/route-controller-manager-5fdf45db69-fr7nb" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.625795 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16e08fcc-f990-4f48-9f5f-3b0925cbf5c7-serving-cert\") pod \"controller-manager-7f6b5bdb65-8bbfd\" (UID: \"16e08fcc-f990-4f48-9f5f-3b0925cbf5c7\") " pod="openshift-controller-manager/controller-manager-7f6b5bdb65-8bbfd" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.625902 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6faccf57-f78d-41c3-b176-691802bdc64b-serving-cert\") pod 
\"route-controller-manager-5fdf45db69-fr7nb\" (UID: \"6faccf57-f78d-41c3-b176-691802bdc64b\") " pod="openshift-route-controller-manager/route-controller-manager-5fdf45db69-fr7nb" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.626033 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/16e08fcc-f990-4f48-9f5f-3b0925cbf5c7-proxy-ca-bundles\") pod \"controller-manager-7f6b5bdb65-8bbfd\" (UID: \"16e08fcc-f990-4f48-9f5f-3b0925cbf5c7\") " pod="openshift-controller-manager/controller-manager-7f6b5bdb65-8bbfd" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.626067 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6faccf57-f78d-41c3-b176-691802bdc64b-config\") pod \"route-controller-manager-5fdf45db69-fr7nb\" (UID: \"6faccf57-f78d-41c3-b176-691802bdc64b\") " pod="openshift-route-controller-manager/route-controller-manager-5fdf45db69-fr7nb" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.626100 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16e08fcc-f990-4f48-9f5f-3b0925cbf5c7-config\") pod \"controller-manager-7f6b5bdb65-8bbfd\" (UID: \"16e08fcc-f990-4f48-9f5f-3b0925cbf5c7\") " pod="openshift-controller-manager/controller-manager-7f6b5bdb65-8bbfd" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.626224 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd4vd\" (UniqueName: \"kubernetes.io/projected/16e08fcc-f990-4f48-9f5f-3b0925cbf5c7-kube-api-access-zd4vd\") pod \"controller-manager-7f6b5bdb65-8bbfd\" (UID: \"16e08fcc-f990-4f48-9f5f-3b0925cbf5c7\") " pod="openshift-controller-manager/controller-manager-7f6b5bdb65-8bbfd" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.626303 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68lsz\" (UniqueName: \"kubernetes.io/projected/6faccf57-f78d-41c3-b176-691802bdc64b-kube-api-access-68lsz\") pod \"route-controller-manager-5fdf45db69-fr7nb\" (UID: \"6faccf57-f78d-41c3-b176-691802bdc64b\") " pod="openshift-route-controller-manager/route-controller-manager-5fdf45db69-fr7nb" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.727465 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16e08fcc-f990-4f48-9f5f-3b0925cbf5c7-serving-cert\") pod \"controller-manager-7f6b5bdb65-8bbfd\" (UID: \"16e08fcc-f990-4f48-9f5f-3b0925cbf5c7\") " pod="openshift-controller-manager/controller-manager-7f6b5bdb65-8bbfd" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.727638 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6faccf57-f78d-41c3-b176-691802bdc64b-serving-cert\") pod \"route-controller-manager-5fdf45db69-fr7nb\" (UID: \"6faccf57-f78d-41c3-b176-691802bdc64b\") " pod="openshift-route-controller-manager/route-controller-manager-5fdf45db69-fr7nb" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.727750 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/16e08fcc-f990-4f48-9f5f-3b0925cbf5c7-proxy-ca-bundles\") pod 
\"controller-manager-7f6b5bdb65-8bbfd\" (UID: \"16e08fcc-f990-4f48-9f5f-3b0925cbf5c7\") " pod="openshift-controller-manager/controller-manager-7f6b5bdb65-8bbfd" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.727890 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6faccf57-f78d-41c3-b176-691802bdc64b-config\") pod \"route-controller-manager-5fdf45db69-fr7nb\" (UID: \"6faccf57-f78d-41c3-b176-691802bdc64b\") " pod="openshift-route-controller-manager/route-controller-manager-5fdf45db69-fr7nb" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.728847 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16e08fcc-f990-4f48-9f5f-3b0925cbf5c7-config\") pod \"controller-manager-7f6b5bdb65-8bbfd\" (UID: \"16e08fcc-f990-4f48-9f5f-3b0925cbf5c7\") " pod="openshift-controller-manager/controller-manager-7f6b5bdb65-8bbfd" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.728941 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd4vd\" (UniqueName: \"kubernetes.io/projected/16e08fcc-f990-4f48-9f5f-3b0925cbf5c7-kube-api-access-zd4vd\") pod \"controller-manager-7f6b5bdb65-8bbfd\" (UID: \"16e08fcc-f990-4f48-9f5f-3b0925cbf5c7\") " pod="openshift-controller-manager/controller-manager-7f6b5bdb65-8bbfd" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.729032 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68lsz\" (UniqueName: \"kubernetes.io/projected/6faccf57-f78d-41c3-b176-691802bdc64b-kube-api-access-68lsz\") pod \"route-controller-manager-5fdf45db69-fr7nb\" (UID: \"6faccf57-f78d-41c3-b176-691802bdc64b\") " pod="openshift-route-controller-manager/route-controller-manager-5fdf45db69-fr7nb" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.729100 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16e08fcc-f990-4f48-9f5f-3b0925cbf5c7-client-ca\") pod \"controller-manager-7f6b5bdb65-8bbfd\" (UID: \"16e08fcc-f990-4f48-9f5f-3b0925cbf5c7\") " pod="openshift-controller-manager/controller-manager-7f6b5bdb65-8bbfd" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.729152 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6faccf57-f78d-41c3-b176-691802bdc64b-client-ca\") pod \"route-controller-manager-5fdf45db69-fr7nb\" (UID: \"6faccf57-f78d-41c3-b176-691802bdc64b\") " pod="openshift-route-controller-manager/route-controller-manager-5fdf45db69-fr7nb" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.730263 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16e08fcc-f990-4f48-9f5f-3b0925cbf5c7-client-ca\") pod \"controller-manager-7f6b5bdb65-8bbfd\" (UID: \"16e08fcc-f990-4f48-9f5f-3b0925cbf5c7\") " pod="openshift-controller-manager/controller-manager-7f6b5bdb65-8bbfd" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.730439 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/16e08fcc-f990-4f48-9f5f-3b0925cbf5c7-proxy-ca-bundles\") pod \"controller-manager-7f6b5bdb65-8bbfd\" (UID: \"16e08fcc-f990-4f48-9f5f-3b0925cbf5c7\") " pod="openshift-controller-manager/controller-manager-7f6b5bdb65-8bbfd" Nov 27 
16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.730570 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6faccf57-f78d-41c3-b176-691802bdc64b-client-ca\") pod \"route-controller-manager-5fdf45db69-fr7nb\" (UID: \"6faccf57-f78d-41c3-b176-691802bdc64b\") " pod="openshift-route-controller-manager/route-controller-manager-5fdf45db69-fr7nb" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.730736 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6faccf57-f78d-41c3-b176-691802bdc64b-config\") pod \"route-controller-manager-5fdf45db69-fr7nb\" (UID: \"6faccf57-f78d-41c3-b176-691802bdc64b\") " pod="openshift-route-controller-manager/route-controller-manager-5fdf45db69-fr7nb" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.731154 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16e08fcc-f990-4f48-9f5f-3b0925cbf5c7-config\") pod \"controller-manager-7f6b5bdb65-8bbfd\" (UID: \"16e08fcc-f990-4f48-9f5f-3b0925cbf5c7\") " pod="openshift-controller-manager/controller-manager-7f6b5bdb65-8bbfd" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.735723 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6faccf57-f78d-41c3-b176-691802bdc64b-serving-cert\") pod \"route-controller-manager-5fdf45db69-fr7nb\" (UID: \"6faccf57-f78d-41c3-b176-691802bdc64b\") " pod="openshift-route-controller-manager/route-controller-manager-5fdf45db69-fr7nb" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.736785 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16e08fcc-f990-4f48-9f5f-3b0925cbf5c7-serving-cert\") pod \"controller-manager-7f6b5bdb65-8bbfd\" (UID: \"16e08fcc-f990-4f48-9f5f-3b0925cbf5c7\") " pod="openshift-controller-manager/controller-manager-7f6b5bdb65-8bbfd" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.757235 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68lsz\" (UniqueName: \"kubernetes.io/projected/6faccf57-f78d-41c3-b176-691802bdc64b-kube-api-access-68lsz\") pod \"route-controller-manager-5fdf45db69-fr7nb\" (UID: \"6faccf57-f78d-41c3-b176-691802bdc64b\") " pod="openshift-route-controller-manager/route-controller-manager-5fdf45db69-fr7nb" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.770382 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd4vd\" (UniqueName: \"kubernetes.io/projected/16e08fcc-f990-4f48-9f5f-3b0925cbf5c7-kube-api-access-zd4vd\") pod \"controller-manager-7f6b5bdb65-8bbfd\" (UID: \"16e08fcc-f990-4f48-9f5f-3b0925cbf5c7\") " pod="openshift-controller-manager/controller-manager-7f6b5bdb65-8bbfd" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.820256 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7f6b5bdb65-8bbfd" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.832744 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5fdf45db69-fr7nb" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.899941 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-8wlxw" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.900424 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f6f2h" Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.942399 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8wlxw"] Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.953852 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8wlxw"] Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.959073 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-f6f2h"] Nov 27 16:43:33 crc kubenswrapper[4954]: I1127 16:43:33.965113 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-f6f2h"] Nov 27 16:43:34 crc kubenswrapper[4954]: I1127 16:43:34.128393 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7f6b5bdb65-8bbfd"] Nov 27 16:43:34 crc kubenswrapper[4954]: W1127 16:43:34.136302 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16e08fcc_f990_4f48_9f5f_3b0925cbf5c7.slice/crio-d961cd3e5c28c3577e9f3aad27b4c54d953e16c7806bdf4f88ed6dc002d2b986 WatchSource:0}: Error finding container d961cd3e5c28c3577e9f3aad27b4c54d953e16c7806bdf4f88ed6dc002d2b986: Status 404 returned error can't find the container with id d961cd3e5c28c3577e9f3aad27b4c54d953e16c7806bdf4f88ed6dc002d2b986 Nov 27 16:43:34 crc kubenswrapper[4954]: I1127 16:43:34.150760 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5fdf45db69-fr7nb"] Nov 27 16:43:34 crc kubenswrapper[4954]: W1127 16:43:34.174387 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6faccf57_f78d_41c3_b176_691802bdc64b.slice/crio-890e60a2d073d402650d4b61b15618cfa8482eae6f0b5a885fdeb9023033008c WatchSource:0}: Error finding container 890e60a2d073d402650d4b61b15618cfa8482eae6f0b5a885fdeb9023033008c: Status 404 returned error can't find the container with id 890e60a2d073d402650d4b61b15618cfa8482eae6f0b5a885fdeb9023033008c Nov 27 16:43:34 crc kubenswrapper[4954]: I1127 16:43:34.670393 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24f553f1-7b7b-4d3e-addf-2b5d1039f176" path="/var/lib/kubelet/pods/24f553f1-7b7b-4d3e-addf-2b5d1039f176/volumes" Nov 27 16:43:34 crc kubenswrapper[4954]: I1127 16:43:34.671372 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e521fb1-0565-4f66-a6f0-1b78942e408e" path="/var/lib/kubelet/pods/6e521fb1-0565-4f66-a6f0-1b78942e408e/volumes" Nov 27 16:43:34 crc kubenswrapper[4954]: I1127 16:43:34.906199 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f6b5bdb65-8bbfd" event={"ID":"16e08fcc-f990-4f48-9f5f-3b0925cbf5c7","Type":"ContainerStarted","Data":"e2e5e4b4361999165d9eafbc30cf919effa4cd5383d4ed85b186c999547fd6c0"} Nov 27 16:43:34 crc kubenswrapper[4954]: I1127 16:43:34.906256 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-7f6b5bdb65-8bbfd" event={"ID":"16e08fcc-f990-4f48-9f5f-3b0925cbf5c7","Type":"ContainerStarted","Data":"d961cd3e5c28c3577e9f3aad27b4c54d953e16c7806bdf4f88ed6dc002d2b986"} Nov 27 16:43:34 crc kubenswrapper[4954]: I1127 16:43:34.906764 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7f6b5bdb65-8bbfd" Nov 27 16:43:34 crc kubenswrapper[4954]: I1127 16:43:34.908385 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5fdf45db69-fr7nb" event={"ID":"6faccf57-f78d-41c3-b176-691802bdc64b","Type":"ContainerStarted","Data":"642ad17e42f1fd5886933d369e39858fc8bcfb90eb80bba2d17a081cae87e7af"} Nov 27 16:43:34 crc kubenswrapper[4954]: I1127 16:43:34.908432 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5fdf45db69-fr7nb" event={"ID":"6faccf57-f78d-41c3-b176-691802bdc64b","Type":"ContainerStarted","Data":"890e60a2d073d402650d4b61b15618cfa8482eae6f0b5a885fdeb9023033008c"} Nov 27 16:43:34 crc kubenswrapper[4954]: I1127 16:43:34.908450 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5fdf45db69-fr7nb" Nov 27 16:43:34 crc kubenswrapper[4954]: I1127 16:43:34.912717 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7f6b5bdb65-8bbfd" Nov 27 16:43:34 crc kubenswrapper[4954]: I1127 16:43:34.925694 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5fdf45db69-fr7nb" Nov 27 16:43:34 crc kubenswrapper[4954]: I1127 16:43:34.938869 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7f6b5bdb65-8bbfd" podStartSLOduration=2.938851807 podStartE2EDuration="2.938851807s" podCreationTimestamp="2025-11-27 16:43:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:43:34.935534285 +0000 UTC m=+326.952974605" watchObservedRunningTime="2025-11-27 16:43:34.938851807 +0000 UTC m=+326.956292107" Nov 27 16:43:34 crc kubenswrapper[4954]: I1127 16:43:34.965030 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5fdf45db69-fr7nb" podStartSLOduration=2.965011723 podStartE2EDuration="2.965011723s" podCreationTimestamp="2025-11-27 16:43:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:43:34.963074193 +0000 UTC m=+326.980514493" watchObservedRunningTime="2025-11-27 16:43:34.965011723 +0000 UTC m=+326.982452023" Nov 27 16:44:21 crc kubenswrapper[4954]: I1127 16:44:21.351637 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wdwtv"] Nov 27 16:44:21 crc kubenswrapper[4954]: I1127 16:44:21.352386 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wdwtv" podUID="51999cf2-62d7-4ee2-ae9f-b1ac606facb5" containerName="registry-server" containerID="cri-o://6a893d5f338d04ee1f070a24dc8a1044201013e4b91cbcb55746bdd764669c3b" gracePeriod=30 Nov 27 16:44:21 crc 
kubenswrapper[4954]: I1127 16:44:21.370955 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4jtcn"] Nov 27 16:44:21 crc kubenswrapper[4954]: I1127 16:44:21.371264 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4jtcn" podUID="09166f72-95b5-44d5-b265-705e11740e0c" containerName="registry-server" containerID="cri-o://799bc3be0805eb8f38ed3bd26d773dd52e0a1c406655de516bc15c847206adcf" gracePeriod=30 Nov 27 16:44:21 crc kubenswrapper[4954]: I1127 16:44:21.375777 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qmz7n"] Nov 27 16:44:21 crc kubenswrapper[4954]: I1127 16:44:21.376035 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-qmz7n" podUID="a08ef380-6670-415c-9861-71c9161f1a4c" containerName="marketplace-operator" containerID="cri-o://95b540ec2024011ded83698ecf256909a35c153124eab98ef1a77229aa45f322" gracePeriod=30 Nov 27 16:44:21 crc kubenswrapper[4954]: I1127 16:44:21.381829 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6vln6"] Nov 27 16:44:21 crc kubenswrapper[4954]: I1127 16:44:21.392158 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vhmbf"] Nov 27 16:44:21 crc kubenswrapper[4954]: I1127 16:44:21.392436 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vhmbf" podUID="c8c116b3-5000-4043-a04f-ee79ff08a37d" containerName="registry-server" containerID="cri-o://5a5e78672daf61ac2e94c996f178c22e8574436d2f1345a48ec7b123734cddbd" gracePeriod=30 Nov 27 16:44:21 crc kubenswrapper[4954]: I1127 16:44:21.392914 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6vln6" podUID="8ff1ec67-d5a8-4612-874b-4324db52c148" containerName="registry-server" containerID="cri-o://d6fcfaae76ae1a07837320c26260bc0b19e294c26ea88960aa8bc7de5069e23e" gracePeriod=30 Nov 27 16:44:21 crc kubenswrapper[4954]: I1127 16:44:21.401696 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-txfqr"] Nov 27 16:44:21 crc kubenswrapper[4954]: I1127 16:44:21.414874 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-txfqr"] Nov 27 16:44:21 crc kubenswrapper[4954]: I1127 16:44:21.402500 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-txfqr" Nov 27 16:44:21 crc kubenswrapper[4954]: I1127 16:44:21.535896 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8215930a-947b-45d7-9c4e-9d867d3f234e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-txfqr\" (UID: \"8215930a-947b-45d7-9c4e-9d867d3f234e\") " pod="openshift-marketplace/marketplace-operator-79b997595-txfqr" Nov 27 16:44:21 crc kubenswrapper[4954]: I1127 16:44:21.535943 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2h6f\" (UniqueName: \"kubernetes.io/projected/8215930a-947b-45d7-9c4e-9d867d3f234e-kube-api-access-h2h6f\") pod \"marketplace-operator-79b997595-txfqr\" (UID: \"8215930a-947b-45d7-9c4e-9d867d3f234e\") " pod="openshift-marketplace/marketplace-operator-79b997595-txfqr" Nov 27 16:44:21 crc kubenswrapper[4954]: I1127 16:44:21.535974 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8215930a-947b-45d7-9c4e-9d867d3f234e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-txfqr\" (UID: \"8215930a-947b-45d7-9c4e-9d867d3f234e\") " pod="openshift-marketplace/marketplace-operator-79b997595-txfqr" Nov 27 16:44:21 crc kubenswrapper[4954]: I1127 16:44:21.551415 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7f6b5bdb65-8bbfd"] Nov 27 16:44:21 crc kubenswrapper[4954]: I1127 16:44:21.551721 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7f6b5bdb65-8bbfd" podUID="16e08fcc-f990-4f48-9f5f-3b0925cbf5c7" containerName="controller-manager" containerID="cri-o://e2e5e4b4361999165d9eafbc30cf919effa4cd5383d4ed85b186c999547fd6c0" gracePeriod=30 Nov 27 16:44:21 crc kubenswrapper[4954]: I1127 16:44:21.636706 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8215930a-947b-45d7-9c4e-9d867d3f234e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-txfqr\" (UID: \"8215930a-947b-45d7-9c4e-9d867d3f234e\") " pod="openshift-marketplace/marketplace-operator-79b997595-txfqr" Nov 27 16:44:21 crc kubenswrapper[4954]: I1127 16:44:21.636757 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2h6f\" (UniqueName: \"kubernetes.io/projected/8215930a-947b-45d7-9c4e-9d867d3f234e-kube-api-access-h2h6f\") pod \"marketplace-operator-79b997595-txfqr\" (UID: \"8215930a-947b-45d7-9c4e-9d867d3f234e\") " pod="openshift-marketplace/marketplace-operator-79b997595-txfqr" Nov 27 16:44:21 crc kubenswrapper[4954]: I1127 16:44:21.636788 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8215930a-947b-45d7-9c4e-9d867d3f234e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-txfqr\" (UID: \"8215930a-947b-45d7-9c4e-9d867d3f234e\") " pod="openshift-marketplace/marketplace-operator-79b997595-txfqr" Nov 27 16:44:21 crc kubenswrapper[4954]: I1127 16:44:21.638845 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8215930a-947b-45d7-9c4e-9d867d3f234e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-txfqr\" (UID: \"8215930a-947b-45d7-9c4e-9d867d3f234e\") " pod="openshift-marketplace/marketplace-operator-79b997595-txfqr" Nov 27 16:44:21 crc kubenswrapper[4954]: I1127 16:44:21.643277 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8215930a-947b-45d7-9c4e-9d867d3f234e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-txfqr\" (UID: \"8215930a-947b-45d7-9c4e-9d867d3f234e\") " pod="openshift-marketplace/marketplace-operator-79b997595-txfqr" Nov 27 16:44:21 crc kubenswrapper[4954]: I1127 16:44:21.653270 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2h6f\" (UniqueName: \"kubernetes.io/projected/8215930a-947b-45d7-9c4e-9d867d3f234e-kube-api-access-h2h6f\") pod \"marketplace-operator-79b997595-txfqr\" (UID: \"8215930a-947b-45d7-9c4e-9d867d3f234e\") " pod="openshift-marketplace/marketplace-operator-79b997595-txfqr" Nov 27 16:44:21 crc kubenswrapper[4954]: I1127 16:44:21.667514 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5fdf45db69-fr7nb"] Nov 27 16:44:21 crc kubenswrapper[4954]: I1127 16:44:21.667951 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5fdf45db69-fr7nb" podUID="6faccf57-f78d-41c3-b176-691802bdc64b" containerName="route-controller-manager" containerID="cri-o://642ad17e42f1fd5886933d369e39858fc8bcfb90eb80bba2d17a081cae87e7af" gracePeriod=30 Nov 27 16:44:21 crc kubenswrapper[4954]: I1127 16:44:21.738901 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-txfqr" Nov 27 16:44:21 crc kubenswrapper[4954]: I1127 16:44:21.900835 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qmz7n" Nov 27 16:44:21 crc kubenswrapper[4954]: I1127 16:44:21.966561 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a08ef380-6670-415c-9861-71c9161f1a4c-marketplace-operator-metrics\") pod \"a08ef380-6670-415c-9861-71c9161f1a4c\" (UID: \"a08ef380-6670-415c-9861-71c9161f1a4c\") " Nov 27 16:44:21 crc kubenswrapper[4954]: I1127 16:44:21.966657 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a08ef380-6670-415c-9861-71c9161f1a4c-marketplace-trusted-ca\") pod \"a08ef380-6670-415c-9861-71c9161f1a4c\" (UID: \"a08ef380-6670-415c-9861-71c9161f1a4c\") " Nov 27 16:44:21 crc kubenswrapper[4954]: I1127 16:44:21.966727 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnd8x\" (UniqueName: \"kubernetes.io/projected/a08ef380-6670-415c-9861-71c9161f1a4c-kube-api-access-lnd8x\") pod \"a08ef380-6670-415c-9861-71c9161f1a4c\" (UID: \"a08ef380-6670-415c-9861-71c9161f1a4c\") " Nov 27 16:44:21 crc kubenswrapper[4954]: I1127 16:44:21.971002 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a08ef380-6670-415c-9861-71c9161f1a4c-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "a08ef380-6670-415c-9861-71c9161f1a4c" (UID: "a08ef380-6670-415c-9861-71c9161f1a4c"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:44:21 crc kubenswrapper[4954]: I1127 16:44:21.979417 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a08ef380-6670-415c-9861-71c9161f1a4c-kube-api-access-lnd8x" (OuterVolumeSpecName: "kube-api-access-lnd8x") pod "a08ef380-6670-415c-9861-71c9161f1a4c" (UID: "a08ef380-6670-415c-9861-71c9161f1a4c"). InnerVolumeSpecName "kube-api-access-lnd8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:44:21 crc kubenswrapper[4954]: I1127 16:44:21.985989 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a08ef380-6670-415c-9861-71c9161f1a4c-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "a08ef380-6670-415c-9861-71c9161f1a4c" (UID: "a08ef380-6670-415c-9861-71c9161f1a4c"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.023230 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wdwtv" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.054504 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4jtcn" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.069477 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51999cf2-62d7-4ee2-ae9f-b1ac606facb5-catalog-content\") pod \"51999cf2-62d7-4ee2-ae9f-b1ac606facb5\" (UID: \"51999cf2-62d7-4ee2-ae9f-b1ac606facb5\") " Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.069526 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pl6wz\" (UniqueName: \"kubernetes.io/projected/51999cf2-62d7-4ee2-ae9f-b1ac606facb5-kube-api-access-pl6wz\") pod \"51999cf2-62d7-4ee2-ae9f-b1ac606facb5\" (UID: \"51999cf2-62d7-4ee2-ae9f-b1ac606facb5\") " Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.069559 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51999cf2-62d7-4ee2-ae9f-b1ac606facb5-utilities\") pod \"51999cf2-62d7-4ee2-ae9f-b1ac606facb5\" (UID: \"51999cf2-62d7-4ee2-ae9f-b1ac606facb5\") " Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.069781 4954 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a08ef380-6670-415c-9861-71c9161f1a4c-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.069798 4954 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a08ef380-6670-415c-9861-71c9161f1a4c-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.069807 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnd8x\" (UniqueName: \"kubernetes.io/projected/a08ef380-6670-415c-9861-71c9161f1a4c-kube-api-access-lnd8x\") on node \"crc\" DevicePath \"\"" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.070518 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51999cf2-62d7-4ee2-ae9f-b1ac606facb5-utilities" (OuterVolumeSpecName: "utilities") pod "51999cf2-62d7-4ee2-ae9f-b1ac606facb5" (UID: "51999cf2-62d7-4ee2-ae9f-b1ac606facb5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.074646 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51999cf2-62d7-4ee2-ae9f-b1ac606facb5-kube-api-access-pl6wz" (OuterVolumeSpecName: "kube-api-access-pl6wz") pod "51999cf2-62d7-4ee2-ae9f-b1ac606facb5" (UID: "51999cf2-62d7-4ee2-ae9f-b1ac606facb5"). InnerVolumeSpecName "kube-api-access-pl6wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.142912 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51999cf2-62d7-4ee2-ae9f-b1ac606facb5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "51999cf2-62d7-4ee2-ae9f-b1ac606facb5" (UID: "51999cf2-62d7-4ee2-ae9f-b1ac606facb5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.158903 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6vln6" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.170570 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09166f72-95b5-44d5-b265-705e11740e0c-catalog-content\") pod \"09166f72-95b5-44d5-b265-705e11740e0c\" (UID: \"09166f72-95b5-44d5-b265-705e11740e0c\") " Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.170723 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8v4w\" (UniqueName: \"kubernetes.io/projected/09166f72-95b5-44d5-b265-705e11740e0c-kube-api-access-n8v4w\") pod \"09166f72-95b5-44d5-b265-705e11740e0c\" (UID: \"09166f72-95b5-44d5-b265-705e11740e0c\") " Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.170754 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09166f72-95b5-44d5-b265-705e11740e0c-utilities\") pod \"09166f72-95b5-44d5-b265-705e11740e0c\" (UID: \"09166f72-95b5-44d5-b265-705e11740e0c\") " Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.171000 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51999cf2-62d7-4ee2-ae9f-b1ac606facb5-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.171020 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pl6wz\" (UniqueName: \"kubernetes.io/projected/51999cf2-62d7-4ee2-ae9f-b1ac606facb5-kube-api-access-pl6wz\") on node \"crc\" DevicePath \"\"" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.171034 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51999cf2-62d7-4ee2-ae9f-b1ac606facb5-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.172316 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09166f72-95b5-44d5-b265-705e11740e0c-utilities" (OuterVolumeSpecName: "utilities") pod "09166f72-95b5-44d5-b265-705e11740e0c" (UID: "09166f72-95b5-44d5-b265-705e11740e0c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.174182 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09166f72-95b5-44d5-b265-705e11740e0c-kube-api-access-n8v4w" (OuterVolumeSpecName: "kube-api-access-n8v4w") pod "09166f72-95b5-44d5-b265-705e11740e0c" (UID: "09166f72-95b5-44d5-b265-705e11740e0c"). InnerVolumeSpecName "kube-api-access-n8v4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.197792 4954 generic.go:334] "Generic (PLEG): container finished" podID="a08ef380-6670-415c-9861-71c9161f1a4c" containerID="95b540ec2024011ded83698ecf256909a35c153124eab98ef1a77229aa45f322" exitCode=0 Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.197952 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qmz7n" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.198245 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qmz7n" event={"ID":"a08ef380-6670-415c-9861-71c9161f1a4c","Type":"ContainerDied","Data":"95b540ec2024011ded83698ecf256909a35c153124eab98ef1a77229aa45f322"} Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.198290 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qmz7n" event={"ID":"a08ef380-6670-415c-9861-71c9161f1a4c","Type":"ContainerDied","Data":"ed1a0de66b1b47772a8c20e0f6bf4d953b3f42ac4f7572ce4541c9394e166e2a"} Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.198309 4954 scope.go:117] "RemoveContainer" containerID="95b540ec2024011ded83698ecf256909a35c153124eab98ef1a77229aa45f322" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.201544 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7f6b5bdb65-8bbfd" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.202100 4954 generic.go:334] "Generic (PLEG): container finished" podID="51999cf2-62d7-4ee2-ae9f-b1ac606facb5" containerID="6a893d5f338d04ee1f070a24dc8a1044201013e4b91cbcb55746bdd764669c3b" exitCode=0 Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.202178 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wdwtv" event={"ID":"51999cf2-62d7-4ee2-ae9f-b1ac606facb5","Type":"ContainerDied","Data":"6a893d5f338d04ee1f070a24dc8a1044201013e4b91cbcb55746bdd764669c3b"} Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.202208 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wdwtv" event={"ID":"51999cf2-62d7-4ee2-ae9f-b1ac606facb5","Type":"ContainerDied","Data":"dbee31fffe8ec5d1b5663b123013bdaea6f0076a7e4ff708c3b8de0da6f49674"} Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.202355 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wdwtv" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.204255 4954 generic.go:334] "Generic (PLEG): container finished" podID="6faccf57-f78d-41c3-b176-691802bdc64b" containerID="642ad17e42f1fd5886933d369e39858fc8bcfb90eb80bba2d17a081cae87e7af" exitCode=0 Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.204330 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5fdf45db69-fr7nb" event={"ID":"6faccf57-f78d-41c3-b176-691802bdc64b","Type":"ContainerDied","Data":"642ad17e42f1fd5886933d369e39858fc8bcfb90eb80bba2d17a081cae87e7af"} Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.209974 4954 generic.go:334] "Generic (PLEG): container finished" podID="16e08fcc-f990-4f48-9f5f-3b0925cbf5c7" containerID="e2e5e4b4361999165d9eafbc30cf919effa4cd5383d4ed85b186c999547fd6c0" exitCode=0 Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.210037 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f6b5bdb65-8bbfd" event={"ID":"16e08fcc-f990-4f48-9f5f-3b0925cbf5c7","Type":"ContainerDied","Data":"e2e5e4b4361999165d9eafbc30cf919effa4cd5383d4ed85b186c999547fd6c0"} Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.210074 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f6b5bdb65-8bbfd" event={"ID":"16e08fcc-f990-4f48-9f5f-3b0925cbf5c7","Type":"ContainerDied","Data":"d961cd3e5c28c3577e9f3aad27b4c54d953e16c7806bdf4f88ed6dc002d2b986"} Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.210144 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7f6b5bdb65-8bbfd" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.210513 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vhmbf" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.214497 4954 generic.go:334] "Generic (PLEG): container finished" podID="c8c116b3-5000-4043-a04f-ee79ff08a37d" containerID="5a5e78672daf61ac2e94c996f178c22e8574436d2f1345a48ec7b123734cddbd" exitCode=0 Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.214547 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vhmbf" event={"ID":"c8c116b3-5000-4043-a04f-ee79ff08a37d","Type":"ContainerDied","Data":"5a5e78672daf61ac2e94c996f178c22e8574436d2f1345a48ec7b123734cddbd"} Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.214600 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vhmbf" event={"ID":"c8c116b3-5000-4043-a04f-ee79ff08a37d","Type":"ContainerDied","Data":"4574a47d6a648dc95d204965b30bfc53c2729857947c9677b4674d7c1cea0dd6"} Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.224836 4954 generic.go:334] "Generic (PLEG): container finished" podID="09166f72-95b5-44d5-b265-705e11740e0c" containerID="799bc3be0805eb8f38ed3bd26d773dd52e0a1c406655de516bc15c847206adcf" exitCode=0 Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.224893 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4jtcn" event={"ID":"09166f72-95b5-44d5-b265-705e11740e0c","Type":"ContainerDied","Data":"799bc3be0805eb8f38ed3bd26d773dd52e0a1c406655de516bc15c847206adcf"} Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.224915 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4jtcn" event={"ID":"09166f72-95b5-44d5-b265-705e11740e0c","Type":"ContainerDied","Data":"cc8964ee1e94a28cee87ab7c2e8ebf919281a1a716021a01fa7490231323f5e7"} Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.224999 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4jtcn" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.231180 4954 scope.go:117] "RemoveContainer" containerID="759a4732493d7e795dce581798cff3449b618dae5b18e27e0bf25d64cdccbadb" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.231276 4954 generic.go:334] "Generic (PLEG): container finished" podID="8ff1ec67-d5a8-4612-874b-4324db52c148" containerID="d6fcfaae76ae1a07837320c26260bc0b19e294c26ea88960aa8bc7de5069e23e" exitCode=0 Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.231327 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6vln6" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.231326 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6vln6" event={"ID":"8ff1ec67-d5a8-4612-874b-4324db52c148","Type":"ContainerDied","Data":"d6fcfaae76ae1a07837320c26260bc0b19e294c26ea88960aa8bc7de5069e23e"} Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.231361 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6vln6" event={"ID":"8ff1ec67-d5a8-4612-874b-4324db52c148","Type":"ContainerDied","Data":"cb63ff36edceef9b443a5b14a212aabd343a7ced968ed05dbb143f3052b0e326"} Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.241279 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09166f72-95b5-44d5-b265-705e11740e0c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "09166f72-95b5-44d5-b265-705e11740e0c" (UID: "09166f72-95b5-44d5-b265-705e11740e0c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.245210 4954 scope.go:117] "RemoveContainer" containerID="95b540ec2024011ded83698ecf256909a35c153124eab98ef1a77229aa45f322" Nov 27 16:44:22 crc kubenswrapper[4954]: E1127 16:44:22.245535 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95b540ec2024011ded83698ecf256909a35c153124eab98ef1a77229aa45f322\": container with ID starting with 95b540ec2024011ded83698ecf256909a35c153124eab98ef1a77229aa45f322 not found: ID does not exist" containerID="95b540ec2024011ded83698ecf256909a35c153124eab98ef1a77229aa45f322" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.245658 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95b540ec2024011ded83698ecf256909a35c153124eab98ef1a77229aa45f322"} err="failed to get container status \"95b540ec2024011ded83698ecf256909a35c153124eab98ef1a77229aa45f322\": rpc error: code = NotFound desc = could not find container \"95b540ec2024011ded83698ecf256909a35c153124eab98ef1a77229aa45f322\": container with ID starting with 95b540ec2024011ded83698ecf256909a35c153124eab98ef1a77229aa45f322 not found: ID does not exist" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.245758 4954 scope.go:117] "RemoveContainer" containerID="759a4732493d7e795dce581798cff3449b618dae5b18e27e0bf25d64cdccbadb" Nov 27 16:44:22 crc kubenswrapper[4954]: E1127 16:44:22.246884 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"759a4732493d7e795dce581798cff3449b618dae5b18e27e0bf25d64cdccbadb\": container with ID starting with 759a4732493d7e795dce581798cff3449b618dae5b18e27e0bf25d64cdccbadb not found: ID does not exist" containerID="759a4732493d7e795dce581798cff3449b618dae5b18e27e0bf25d64cdccbadb" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.246914 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"759a4732493d7e795dce581798cff3449b618dae5b18e27e0bf25d64cdccbadb"} err="failed to get container status \"759a4732493d7e795dce581798cff3449b618dae5b18e27e0bf25d64cdccbadb\": rpc error: code = NotFound desc = could not find container \"759a4732493d7e795dce581798cff3449b618dae5b18e27e0bf25d64cdccbadb\": container with ID starting with 
759a4732493d7e795dce581798cff3449b618dae5b18e27e0bf25d64cdccbadb not found: ID does not exist" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.246935 4954 scope.go:117] "RemoveContainer" containerID="6a893d5f338d04ee1f070a24dc8a1044201013e4b91cbcb55746bdd764669c3b" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.266792 4954 scope.go:117] "RemoveContainer" containerID="1b6a430a39d2f8e92183723141e67e097e003d1d283c2d5b546733a37de1f6d7" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.272042 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8c116b3-5000-4043-a04f-ee79ff08a37d-utilities\") pod \"c8c116b3-5000-4043-a04f-ee79ff08a37d\" (UID: \"c8c116b3-5000-4043-a04f-ee79ff08a37d\") " Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.272160 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8c116b3-5000-4043-a04f-ee79ff08a37d-catalog-content\") pod \"c8c116b3-5000-4043-a04f-ee79ff08a37d\" (UID: \"c8c116b3-5000-4043-a04f-ee79ff08a37d\") " Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.272216 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ff1ec67-d5a8-4612-874b-4324db52c148-utilities\") pod \"8ff1ec67-d5a8-4612-874b-4324db52c148\" (UID: \"8ff1ec67-d5a8-4612-874b-4324db52c148\") " Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.272265 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7f5sl\" (UniqueName: \"kubernetes.io/projected/c8c116b3-5000-4043-a04f-ee79ff08a37d-kube-api-access-7f5sl\") pod \"c8c116b3-5000-4043-a04f-ee79ff08a37d\" (UID: \"c8c116b3-5000-4043-a04f-ee79ff08a37d\") " Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.272335 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16e08fcc-f990-4f48-9f5f-3b0925cbf5c7-config\") pod \"16e08fcc-f990-4f48-9f5f-3b0925cbf5c7\" (UID: \"16e08fcc-f990-4f48-9f5f-3b0925cbf5c7\") " Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.272357 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ff1ec67-d5a8-4612-874b-4324db52c148-catalog-content\") pod \"8ff1ec67-d5a8-4612-874b-4324db52c148\" (UID: \"8ff1ec67-d5a8-4612-874b-4324db52c148\") " Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.272385 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/16e08fcc-f990-4f48-9f5f-3b0925cbf5c7-proxy-ca-bundles\") pod \"16e08fcc-f990-4f48-9f5f-3b0925cbf5c7\" (UID: \"16e08fcc-f990-4f48-9f5f-3b0925cbf5c7\") " Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.272403 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65plc\" (UniqueName: \"kubernetes.io/projected/8ff1ec67-d5a8-4612-874b-4324db52c148-kube-api-access-65plc\") pod \"8ff1ec67-d5a8-4612-874b-4324db52c148\" (UID: \"8ff1ec67-d5a8-4612-874b-4324db52c148\") " Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.272421 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16e08fcc-f990-4f48-9f5f-3b0925cbf5c7-serving-cert\") pod 
\"16e08fcc-f990-4f48-9f5f-3b0925cbf5c7\" (UID: \"16e08fcc-f990-4f48-9f5f-3b0925cbf5c7\") " Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.272441 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zd4vd\" (UniqueName: \"kubernetes.io/projected/16e08fcc-f990-4f48-9f5f-3b0925cbf5c7-kube-api-access-zd4vd\") pod \"16e08fcc-f990-4f48-9f5f-3b0925cbf5c7\" (UID: \"16e08fcc-f990-4f48-9f5f-3b0925cbf5c7\") " Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.272466 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16e08fcc-f990-4f48-9f5f-3b0925cbf5c7-client-ca\") pod \"16e08fcc-f990-4f48-9f5f-3b0925cbf5c7\" (UID: \"16e08fcc-f990-4f48-9f5f-3b0925cbf5c7\") " Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.272692 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8v4w\" (UniqueName: \"kubernetes.io/projected/09166f72-95b5-44d5-b265-705e11740e0c-kube-api-access-n8v4w\") on node \"crc\" DevicePath \"\"" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.272711 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09166f72-95b5-44d5-b265-705e11740e0c-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.272721 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09166f72-95b5-44d5-b265-705e11740e0c-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.273911 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16e08fcc-f990-4f48-9f5f-3b0925cbf5c7-client-ca" (OuterVolumeSpecName: "client-ca") pod "16e08fcc-f990-4f48-9f5f-3b0925cbf5c7" (UID: "16e08fcc-f990-4f48-9f5f-3b0925cbf5c7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.274718 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qmz7n"] Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.274778 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8c116b3-5000-4043-a04f-ee79ff08a37d-utilities" (OuterVolumeSpecName: "utilities") pod "c8c116b3-5000-4043-a04f-ee79ff08a37d" (UID: "c8c116b3-5000-4043-a04f-ee79ff08a37d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.277064 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16e08fcc-f990-4f48-9f5f-3b0925cbf5c7-config" (OuterVolumeSpecName: "config") pod "16e08fcc-f990-4f48-9f5f-3b0925cbf5c7" (UID: "16e08fcc-f990-4f48-9f5f-3b0925cbf5c7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.278689 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ff1ec67-d5a8-4612-874b-4324db52c148-utilities" (OuterVolumeSpecName: "utilities") pod "8ff1ec67-d5a8-4612-874b-4324db52c148" (UID: "8ff1ec67-d5a8-4612-874b-4324db52c148"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.280517 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qmz7n"] Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.280874 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16e08fcc-f990-4f48-9f5f-3b0925cbf5c7-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "16e08fcc-f990-4f48-9f5f-3b0925cbf5c7" (UID: "16e08fcc-f990-4f48-9f5f-3b0925cbf5c7"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.281340 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16e08fcc-f990-4f48-9f5f-3b0925cbf5c7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "16e08fcc-f990-4f48-9f5f-3b0925cbf5c7" (UID: "16e08fcc-f990-4f48-9f5f-3b0925cbf5c7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.284647 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ff1ec67-d5a8-4612-874b-4324db52c148-kube-api-access-65plc" (OuterVolumeSpecName: "kube-api-access-65plc") pod "8ff1ec67-d5a8-4612-874b-4324db52c148" (UID: "8ff1ec67-d5a8-4612-874b-4324db52c148"). InnerVolumeSpecName "kube-api-access-65plc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.287332 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16e08fcc-f990-4f48-9f5f-3b0925cbf5c7-kube-api-access-zd4vd" (OuterVolumeSpecName: "kube-api-access-zd4vd") pod "16e08fcc-f990-4f48-9f5f-3b0925cbf5c7" (UID: "16e08fcc-f990-4f48-9f5f-3b0925cbf5c7"). InnerVolumeSpecName "kube-api-access-zd4vd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.287474 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8c116b3-5000-4043-a04f-ee79ff08a37d-kube-api-access-7f5sl" (OuterVolumeSpecName: "kube-api-access-7f5sl") pod "c8c116b3-5000-4043-a04f-ee79ff08a37d" (UID: "c8c116b3-5000-4043-a04f-ee79ff08a37d"). InnerVolumeSpecName "kube-api-access-7f5sl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.287515 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wdwtv"] Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.291146 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5fdf45db69-fr7nb" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.301859 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wdwtv"] Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.305719 4954 scope.go:117] "RemoveContainer" containerID="ee04ad1cf7c499ce4b9950f3f4e94c1f5af0befffdc17b93a406b37528a8edc1" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.329049 4954 scope.go:117] "RemoveContainer" containerID="6a893d5f338d04ee1f070a24dc8a1044201013e4b91cbcb55746bdd764669c3b" Nov 27 16:44:22 crc kubenswrapper[4954]: E1127 16:44:22.329414 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a893d5f338d04ee1f070a24dc8a1044201013e4b91cbcb55746bdd764669c3b\": container with ID starting with 6a893d5f338d04ee1f070a24dc8a1044201013e4b91cbcb55746bdd764669c3b not found: ID does not exist" containerID="6a893d5f338d04ee1f070a24dc8a1044201013e4b91cbcb55746bdd764669c3b" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.329451 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a893d5f338d04ee1f070a24dc8a1044201013e4b91cbcb55746bdd764669c3b"} err="failed to get container status \"6a893d5f338d04ee1f070a24dc8a1044201013e4b91cbcb55746bdd764669c3b\": rpc error: code = NotFound desc = could not find container \"6a893d5f338d04ee1f070a24dc8a1044201013e4b91cbcb55746bdd764669c3b\": container with ID starting with 6a893d5f338d04ee1f070a24dc8a1044201013e4b91cbcb55746bdd764669c3b not found: ID does not exist" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.329473 4954 scope.go:117] "RemoveContainer" containerID="1b6a430a39d2f8e92183723141e67e097e003d1d283c2d5b546733a37de1f6d7" Nov 27 16:44:22 crc kubenswrapper[4954]: E1127 16:44:22.329694 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b6a430a39d2f8e92183723141e67e097e003d1d283c2d5b546733a37de1f6d7\": container with ID starting with 1b6a430a39d2f8e92183723141e67e097e003d1d283c2d5b546733a37de1f6d7 not found: ID does not exist" containerID="1b6a430a39d2f8e92183723141e67e097e003d1d283c2d5b546733a37de1f6d7" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.329714 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b6a430a39d2f8e92183723141e67e097e003d1d283c2d5b546733a37de1f6d7"} err="failed to get container status \"1b6a430a39d2f8e92183723141e67e097e003d1d283c2d5b546733a37de1f6d7\": rpc error: code = NotFound desc = could not find container \"1b6a430a39d2f8e92183723141e67e097e003d1d283c2d5b546733a37de1f6d7\": container with ID starting with 1b6a430a39d2f8e92183723141e67e097e003d1d283c2d5b546733a37de1f6d7 not found: ID does not exist" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.329726 4954 scope.go:117] "RemoveContainer" containerID="ee04ad1cf7c499ce4b9950f3f4e94c1f5af0befffdc17b93a406b37528a8edc1" Nov 27 16:44:22 crc kubenswrapper[4954]: E1127 16:44:22.329994 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee04ad1cf7c499ce4b9950f3f4e94c1f5af0befffdc17b93a406b37528a8edc1\": container with ID starting with ee04ad1cf7c499ce4b9950f3f4e94c1f5af0befffdc17b93a406b37528a8edc1 not found: ID does not exist" 
containerID="ee04ad1cf7c499ce4b9950f3f4e94c1f5af0befffdc17b93a406b37528a8edc1" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.330014 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee04ad1cf7c499ce4b9950f3f4e94c1f5af0befffdc17b93a406b37528a8edc1"} err="failed to get container status \"ee04ad1cf7c499ce4b9950f3f4e94c1f5af0befffdc17b93a406b37528a8edc1\": rpc error: code = NotFound desc = could not find container \"ee04ad1cf7c499ce4b9950f3f4e94c1f5af0befffdc17b93a406b37528a8edc1\": container with ID starting with ee04ad1cf7c499ce4b9950f3f4e94c1f5af0befffdc17b93a406b37528a8edc1 not found: ID does not exist" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.330050 4954 scope.go:117] "RemoveContainer" containerID="e2e5e4b4361999165d9eafbc30cf919effa4cd5383d4ed85b186c999547fd6c0" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.331276 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ff1ec67-d5a8-4612-874b-4324db52c148-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8ff1ec67-d5a8-4612-874b-4324db52c148" (UID: "8ff1ec67-d5a8-4612-874b-4324db52c148"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.349175 4954 scope.go:117] "RemoveContainer" containerID="e2e5e4b4361999165d9eafbc30cf919effa4cd5383d4ed85b186c999547fd6c0" Nov 27 16:44:22 crc kubenswrapper[4954]: E1127 16:44:22.349567 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2e5e4b4361999165d9eafbc30cf919effa4cd5383d4ed85b186c999547fd6c0\": container with ID starting with e2e5e4b4361999165d9eafbc30cf919effa4cd5383d4ed85b186c999547fd6c0 not found: ID does not exist" containerID="e2e5e4b4361999165d9eafbc30cf919effa4cd5383d4ed85b186c999547fd6c0" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.349637 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2e5e4b4361999165d9eafbc30cf919effa4cd5383d4ed85b186c999547fd6c0"} err="failed to get container status \"e2e5e4b4361999165d9eafbc30cf919effa4cd5383d4ed85b186c999547fd6c0\": rpc error: code = NotFound desc = could not find container \"e2e5e4b4361999165d9eafbc30cf919effa4cd5383d4ed85b186c999547fd6c0\": container with ID starting with e2e5e4b4361999165d9eafbc30cf919effa4cd5383d4ed85b186c999547fd6c0 not found: ID does not exist" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.349658 4954 scope.go:117] "RemoveContainer" containerID="5a5e78672daf61ac2e94c996f178c22e8574436d2f1345a48ec7b123734cddbd" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.363914 4954 scope.go:117] "RemoveContainer" containerID="f5a21adf079952ca6b460e066f6a04b83ff8f1463c64b79344348fe3907579d1" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.373081 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6faccf57-f78d-41c3-b176-691802bdc64b-config\") pod \"6faccf57-f78d-41c3-b176-691802bdc64b\" (UID: \"6faccf57-f78d-41c3-b176-691802bdc64b\") " Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.373140 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68lsz\" (UniqueName: \"kubernetes.io/projected/6faccf57-f78d-41c3-b176-691802bdc64b-kube-api-access-68lsz\") pod \"6faccf57-f78d-41c3-b176-691802bdc64b\" (UID: 
\"6faccf57-f78d-41c3-b176-691802bdc64b\") " Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.373677 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6faccf57-f78d-41c3-b176-691802bdc64b-serving-cert\") pod \"6faccf57-f78d-41c3-b176-691802bdc64b\" (UID: \"6faccf57-f78d-41c3-b176-691802bdc64b\") " Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.373727 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6faccf57-f78d-41c3-b176-691802bdc64b-client-ca\") pod \"6faccf57-f78d-41c3-b176-691802bdc64b\" (UID: \"6faccf57-f78d-41c3-b176-691802bdc64b\") " Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.373888 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6faccf57-f78d-41c3-b176-691802bdc64b-config" (OuterVolumeSpecName: "config") pod "6faccf57-f78d-41c3-b176-691802bdc64b" (UID: "6faccf57-f78d-41c3-b176-691802bdc64b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.374251 4954 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/16e08fcc-f990-4f48-9f5f-3b0925cbf5c7-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.374271 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65plc\" (UniqueName: \"kubernetes.io/projected/8ff1ec67-d5a8-4612-874b-4324db52c148-kube-api-access-65plc\") on node \"crc\" DevicePath \"\"" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.374284 4954 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16e08fcc-f990-4f48-9f5f-3b0925cbf5c7-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.374293 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zd4vd\" (UniqueName: \"kubernetes.io/projected/16e08fcc-f990-4f48-9f5f-3b0925cbf5c7-kube-api-access-zd4vd\") on node \"crc\" DevicePath \"\"" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.374303 4954 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16e08fcc-f990-4f48-9f5f-3b0925cbf5c7-client-ca\") on node \"crc\" DevicePath \"\"" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.374291 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6faccf57-f78d-41c3-b176-691802bdc64b-client-ca" (OuterVolumeSpecName: "client-ca") pod "6faccf57-f78d-41c3-b176-691802bdc64b" (UID: "6faccf57-f78d-41c3-b176-691802bdc64b"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.374311 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8c116b3-5000-4043-a04f-ee79ff08a37d-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.374376 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6faccf57-f78d-41c3-b176-691802bdc64b-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.374389 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ff1ec67-d5a8-4612-874b-4324db52c148-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.374400 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7f5sl\" (UniqueName: \"kubernetes.io/projected/c8c116b3-5000-4043-a04f-ee79ff08a37d-kube-api-access-7f5sl\") on node \"crc\" DevicePath \"\"" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.374431 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16e08fcc-f990-4f48-9f5f-3b0925cbf5c7-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.374442 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ff1ec67-d5a8-4612-874b-4324db52c148-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.375967 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6faccf57-f78d-41c3-b176-691802bdc64b-kube-api-access-68lsz" (OuterVolumeSpecName: "kube-api-access-68lsz") pod "6faccf57-f78d-41c3-b176-691802bdc64b" (UID: "6faccf57-f78d-41c3-b176-691802bdc64b"). InnerVolumeSpecName "kube-api-access-68lsz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.376328 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6faccf57-f78d-41c3-b176-691802bdc64b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6faccf57-f78d-41c3-b176-691802bdc64b" (UID: "6faccf57-f78d-41c3-b176-691802bdc64b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.385374 4954 scope.go:117] "RemoveContainer" containerID="d13504b516c3d79521e9ee8d3eb56daa31d8070e45c3daf5ae6f60a4190cab4f" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.402342 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8c116b3-5000-4043-a04f-ee79ff08a37d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c8c116b3-5000-4043-a04f-ee79ff08a37d" (UID: "c8c116b3-5000-4043-a04f-ee79ff08a37d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.414128 4954 scope.go:117] "RemoveContainer" containerID="5a5e78672daf61ac2e94c996f178c22e8574436d2f1345a48ec7b123734cddbd" Nov 27 16:44:22 crc kubenswrapper[4954]: E1127 16:44:22.415196 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a5e78672daf61ac2e94c996f178c22e8574436d2f1345a48ec7b123734cddbd\": container with ID starting with 5a5e78672daf61ac2e94c996f178c22e8574436d2f1345a48ec7b123734cddbd not found: ID does not exist" containerID="5a5e78672daf61ac2e94c996f178c22e8574436d2f1345a48ec7b123734cddbd" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.416692 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a5e78672daf61ac2e94c996f178c22e8574436d2f1345a48ec7b123734cddbd"} err="failed to get container status \"5a5e78672daf61ac2e94c996f178c22e8574436d2f1345a48ec7b123734cddbd\": rpc error: code = NotFound desc = could not find container \"5a5e78672daf61ac2e94c996f178c22e8574436d2f1345a48ec7b123734cddbd\": container with ID starting with 5a5e78672daf61ac2e94c996f178c22e8574436d2f1345a48ec7b123734cddbd not found: ID does not exist" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.416807 4954 scope.go:117] "RemoveContainer" containerID="f5a21adf079952ca6b460e066f6a04b83ff8f1463c64b79344348fe3907579d1" Nov 27 16:44:22 crc kubenswrapper[4954]: E1127 16:44:22.417293 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5a21adf079952ca6b460e066f6a04b83ff8f1463c64b79344348fe3907579d1\": container with ID starting with f5a21adf079952ca6b460e066f6a04b83ff8f1463c64b79344348fe3907579d1 not found: ID does not exist" containerID="f5a21adf079952ca6b460e066f6a04b83ff8f1463c64b79344348fe3907579d1" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.417329 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5a21adf079952ca6b460e066f6a04b83ff8f1463c64b79344348fe3907579d1"} err="failed to get container status \"f5a21adf079952ca6b460e066f6a04b83ff8f1463c64b79344348fe3907579d1\": rpc error: code = NotFound desc = could not find container \"f5a21adf079952ca6b460e066f6a04b83ff8f1463c64b79344348fe3907579d1\": container with ID starting with f5a21adf079952ca6b460e066f6a04b83ff8f1463c64b79344348fe3907579d1 not found: ID does not exist" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.417353 4954 scope.go:117] "RemoveContainer" containerID="d13504b516c3d79521e9ee8d3eb56daa31d8070e45c3daf5ae6f60a4190cab4f" Nov 27 16:44:22 crc kubenswrapper[4954]: E1127 16:44:22.417686 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d13504b516c3d79521e9ee8d3eb56daa31d8070e45c3daf5ae6f60a4190cab4f\": container with ID starting with d13504b516c3d79521e9ee8d3eb56daa31d8070e45c3daf5ae6f60a4190cab4f not found: ID does not exist" containerID="d13504b516c3d79521e9ee8d3eb56daa31d8070e45c3daf5ae6f60a4190cab4f" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.417711 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d13504b516c3d79521e9ee8d3eb56daa31d8070e45c3daf5ae6f60a4190cab4f"} err="failed to get container status \"d13504b516c3d79521e9ee8d3eb56daa31d8070e45c3daf5ae6f60a4190cab4f\": rpc error: code = NotFound desc = could not 
find container \"d13504b516c3d79521e9ee8d3eb56daa31d8070e45c3daf5ae6f60a4190cab4f\": container with ID starting with d13504b516c3d79521e9ee8d3eb56daa31d8070e45c3daf5ae6f60a4190cab4f not found: ID does not exist" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.417725 4954 scope.go:117] "RemoveContainer" containerID="799bc3be0805eb8f38ed3bd26d773dd52e0a1c406655de516bc15c847206adcf" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.434449 4954 scope.go:117] "RemoveContainer" containerID="278b2cadd8a23a424bc70457092ec2c6e497ab3682bfe32e194820a210d09431" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.448440 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-txfqr"] Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.459294 4954 scope.go:117] "RemoveContainer" containerID="47af7403f3bade6dbbca0af27c86b8825ad9fa398e78520456525649cf0bb62a" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.476127 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8c116b3-5000-4043-a04f-ee79ff08a37d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.476323 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68lsz\" (UniqueName: \"kubernetes.io/projected/6faccf57-f78d-41c3-b176-691802bdc64b-kube-api-access-68lsz\") on node \"crc\" DevicePath \"\"" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.476414 4954 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6faccf57-f78d-41c3-b176-691802bdc64b-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.476500 4954 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6faccf57-f78d-41c3-b176-691802bdc64b-client-ca\") on node \"crc\" DevicePath \"\"" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.479303 4954 scope.go:117] "RemoveContainer" containerID="799bc3be0805eb8f38ed3bd26d773dd52e0a1c406655de516bc15c847206adcf" Nov 27 16:44:22 crc kubenswrapper[4954]: E1127 16:44:22.479946 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"799bc3be0805eb8f38ed3bd26d773dd52e0a1c406655de516bc15c847206adcf\": container with ID starting with 799bc3be0805eb8f38ed3bd26d773dd52e0a1c406655de516bc15c847206adcf not found: ID does not exist" containerID="799bc3be0805eb8f38ed3bd26d773dd52e0a1c406655de516bc15c847206adcf" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.480095 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"799bc3be0805eb8f38ed3bd26d773dd52e0a1c406655de516bc15c847206adcf"} err="failed to get container status \"799bc3be0805eb8f38ed3bd26d773dd52e0a1c406655de516bc15c847206adcf\": rpc error: code = NotFound desc = could not find container \"799bc3be0805eb8f38ed3bd26d773dd52e0a1c406655de516bc15c847206adcf\": container with ID starting with 799bc3be0805eb8f38ed3bd26d773dd52e0a1c406655de516bc15c847206adcf not found: ID does not exist" Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.480220 4954 scope.go:117] "RemoveContainer" containerID="278b2cadd8a23a424bc70457092ec2c6e497ab3682bfe32e194820a210d09431" Nov 27 16:44:22 crc kubenswrapper[4954]: E1127 16:44:22.480778 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"278b2cadd8a23a424bc70457092ec2c6e497ab3682bfe32e194820a210d09431\": container with ID starting with 278b2cadd8a23a424bc70457092ec2c6e497ab3682bfe32e194820a210d09431 not found: ID does not exist" containerID="278b2cadd8a23a424bc70457092ec2c6e497ab3682bfe32e194820a210d09431"
Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.480896 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"278b2cadd8a23a424bc70457092ec2c6e497ab3682bfe32e194820a210d09431"} err="failed to get container status \"278b2cadd8a23a424bc70457092ec2c6e497ab3682bfe32e194820a210d09431\": rpc error: code = NotFound desc = could not find container \"278b2cadd8a23a424bc70457092ec2c6e497ab3682bfe32e194820a210d09431\": container with ID starting with 278b2cadd8a23a424bc70457092ec2c6e497ab3682bfe32e194820a210d09431 not found: ID does not exist"
Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.481006 4954 scope.go:117] "RemoveContainer" containerID="47af7403f3bade6dbbca0af27c86b8825ad9fa398e78520456525649cf0bb62a"
Nov 27 16:44:22 crc kubenswrapper[4954]: E1127 16:44:22.481469 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47af7403f3bade6dbbca0af27c86b8825ad9fa398e78520456525649cf0bb62a\": container with ID starting with 47af7403f3bade6dbbca0af27c86b8825ad9fa398e78520456525649cf0bb62a not found: ID does not exist" containerID="47af7403f3bade6dbbca0af27c86b8825ad9fa398e78520456525649cf0bb62a"
Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.481601 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47af7403f3bade6dbbca0af27c86b8825ad9fa398e78520456525649cf0bb62a"} err="failed to get container status \"47af7403f3bade6dbbca0af27c86b8825ad9fa398e78520456525649cf0bb62a\": rpc error: code = NotFound desc = could not find container \"47af7403f3bade6dbbca0af27c86b8825ad9fa398e78520456525649cf0bb62a\": container with ID starting with 47af7403f3bade6dbbca0af27c86b8825ad9fa398e78520456525649cf0bb62a not found: ID does not exist"
Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.481718 4954 scope.go:117] "RemoveContainer" containerID="d6fcfaae76ae1a07837320c26260bc0b19e294c26ea88960aa8bc7de5069e23e"
Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.575553 4954 scope.go:117] "RemoveContainer" containerID="ac8f0f9907924436c312b75d28264cab8e64cd3a23aca576771ebc4374a5b52a"
Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.599756 4954 scope.go:117] "RemoveContainer" containerID="66e63bf400d80f73cd4765d997a4dd245444018f9298173e23641361d5b93c10"
Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.629693 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6vln6"]
Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.637377 4954 scope.go:117] "RemoveContainer" containerID="d6fcfaae76ae1a07837320c26260bc0b19e294c26ea88960aa8bc7de5069e23e"
Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.640339 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6vln6"]
Nov 27 16:44:22 crc kubenswrapper[4954]: E1127 16:44:22.641124 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6fcfaae76ae1a07837320c26260bc0b19e294c26ea88960aa8bc7de5069e23e\": container with ID starting with d6fcfaae76ae1a07837320c26260bc0b19e294c26ea88960aa8bc7de5069e23e not found: ID does not exist" containerID="d6fcfaae76ae1a07837320c26260bc0b19e294c26ea88960aa8bc7de5069e23e"
Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.641233 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6fcfaae76ae1a07837320c26260bc0b19e294c26ea88960aa8bc7de5069e23e"} err="failed to get container status \"d6fcfaae76ae1a07837320c26260bc0b19e294c26ea88960aa8bc7de5069e23e\": rpc error: code = NotFound desc = could not find container \"d6fcfaae76ae1a07837320c26260bc0b19e294c26ea88960aa8bc7de5069e23e\": container with ID starting with d6fcfaae76ae1a07837320c26260bc0b19e294c26ea88960aa8bc7de5069e23e not found: ID does not exist"
Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.641324 4954 scope.go:117] "RemoveContainer" containerID="ac8f0f9907924436c312b75d28264cab8e64cd3a23aca576771ebc4374a5b52a"
Nov 27 16:44:22 crc kubenswrapper[4954]: E1127 16:44:22.641845 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac8f0f9907924436c312b75d28264cab8e64cd3a23aca576771ebc4374a5b52a\": container with ID starting with ac8f0f9907924436c312b75d28264cab8e64cd3a23aca576771ebc4374a5b52a not found: ID does not exist" containerID="ac8f0f9907924436c312b75d28264cab8e64cd3a23aca576771ebc4374a5b52a"
Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.641898 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac8f0f9907924436c312b75d28264cab8e64cd3a23aca576771ebc4374a5b52a"} err="failed to get container status \"ac8f0f9907924436c312b75d28264cab8e64cd3a23aca576771ebc4374a5b52a\": rpc error: code = NotFound desc = could not find container \"ac8f0f9907924436c312b75d28264cab8e64cd3a23aca576771ebc4374a5b52a\": container with ID starting with ac8f0f9907924436c312b75d28264cab8e64cd3a23aca576771ebc4374a5b52a not found: ID does not exist"
Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.641932 4954 scope.go:117] "RemoveContainer" containerID="66e63bf400d80f73cd4765d997a4dd245444018f9298173e23641361d5b93c10"
Nov 27 16:44:22 crc kubenswrapper[4954]: E1127 16:44:22.644118 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66e63bf400d80f73cd4765d997a4dd245444018f9298173e23641361d5b93c10\": container with ID starting with 66e63bf400d80f73cd4765d997a4dd245444018f9298173e23641361d5b93c10 not found: ID does not exist" containerID="66e63bf400d80f73cd4765d997a4dd245444018f9298173e23641361d5b93c10"
Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.644219 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66e63bf400d80f73cd4765d997a4dd245444018f9298173e23641361d5b93c10"} err="failed to get container status \"66e63bf400d80f73cd4765d997a4dd245444018f9298173e23641361d5b93c10\": rpc error: code = NotFound desc = could not find container \"66e63bf400d80f73cd4765d997a4dd245444018f9298173e23641361d5b93c10\": container with ID starting with 66e63bf400d80f73cd4765d997a4dd245444018f9298173e23641361d5b93c10 not found: ID does not exist"
Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.647623 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7f6b5bdb65-8bbfd"]
Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.652624 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7f6b5bdb65-8bbfd"]
Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.657744 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4jtcn"]
Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.660875 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4jtcn"]
Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.668183 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09166f72-95b5-44d5-b265-705e11740e0c" path="/var/lib/kubelet/pods/09166f72-95b5-44d5-b265-705e11740e0c/volumes"
Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.668942 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16e08fcc-f990-4f48-9f5f-3b0925cbf5c7" path="/var/lib/kubelet/pods/16e08fcc-f990-4f48-9f5f-3b0925cbf5c7/volumes"
Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.669465 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51999cf2-62d7-4ee2-ae9f-b1ac606facb5" path="/var/lib/kubelet/pods/51999cf2-62d7-4ee2-ae9f-b1ac606facb5/volumes"
Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.671129 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ff1ec67-d5a8-4612-874b-4324db52c148" path="/var/lib/kubelet/pods/8ff1ec67-d5a8-4612-874b-4324db52c148/volumes"
Nov 27 16:44:22 crc kubenswrapper[4954]: I1127 16:44:22.671918 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a08ef380-6670-415c-9861-71c9161f1a4c" path="/var/lib/kubelet/pods/a08ef380-6670-415c-9861-71c9161f1a4c/volumes"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.251836 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vhmbf"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.265549 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5fdf45db69-fr7nb"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.265572 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5fdf45db69-fr7nb" event={"ID":"6faccf57-f78d-41c3-b176-691802bdc64b","Type":"ContainerDied","Data":"890e60a2d073d402650d4b61b15618cfa8482eae6f0b5a885fdeb9023033008c"}
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.265754 4954 scope.go:117] "RemoveContainer" containerID="642ad17e42f1fd5886933d369e39858fc8bcfb90eb80bba2d17a081cae87e7af"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.276052 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vhmbf"]
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.281522 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vhmbf"]
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.289060 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-txfqr" event={"ID":"8215930a-947b-45d7-9c4e-9d867d3f234e","Type":"ContainerStarted","Data":"f4cc4bd5c2bcae93b79e6d341c71759a61c5880db9c147259888871ca6962bf5"}
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.289322 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-txfqr" event={"ID":"8215930a-947b-45d7-9c4e-9d867d3f234e","Type":"ContainerStarted","Data":"0394febcad8e0bf54d4a5aac0937d4f38358b39777c187c69b2e77d70c184dd6"}
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.291775 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-txfqr"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.295445 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-txfqr"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.309924 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5fdf45db69-fr7nb"]
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.314524 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5fdf45db69-fr7nb"]
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.324203 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-txfqr" podStartSLOduration=2.32418332 podStartE2EDuration="2.32418332s" podCreationTimestamp="2025-11-27 16:44:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:44:23.324110069 +0000 UTC m=+375.341550369" watchObservedRunningTime="2025-11-27 16:44:23.32418332 +0000 UTC m=+375.341623620"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.512647 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-78cc549f78-w4wg2"]
Nov 27 16:44:23 crc kubenswrapper[4954]: E1127 16:44:23.512836 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09166f72-95b5-44d5-b265-705e11740e0c" containerName="extract-utilities"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.512847 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="09166f72-95b5-44d5-b265-705e11740e0c" containerName="extract-utilities"
Nov 27 16:44:23 crc kubenswrapper[4954]: E1127 16:44:23.512861 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51999cf2-62d7-4ee2-ae9f-b1ac606facb5" containerName="extract-utilities"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.512867 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="51999cf2-62d7-4ee2-ae9f-b1ac606facb5" containerName="extract-utilities"
Nov 27 16:44:23 crc kubenswrapper[4954]: E1127 16:44:23.512874 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09166f72-95b5-44d5-b265-705e11740e0c" containerName="registry-server"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.512880 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="09166f72-95b5-44d5-b265-705e11740e0c" containerName="registry-server"
Nov 27 16:44:23 crc kubenswrapper[4954]: E1127 16:44:23.512888 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8c116b3-5000-4043-a04f-ee79ff08a37d" containerName="registry-server"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.512894 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8c116b3-5000-4043-a04f-ee79ff08a37d" containerName="registry-server"
Nov 27 16:44:23 crc kubenswrapper[4954]: E1127 16:44:23.512901 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51999cf2-62d7-4ee2-ae9f-b1ac606facb5" containerName="extract-content"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.512908 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="51999cf2-62d7-4ee2-ae9f-b1ac606facb5" containerName="extract-content"
Nov 27 16:44:23 crc kubenswrapper[4954]: E1127 16:44:23.512923 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8c116b3-5000-4043-a04f-ee79ff08a37d" containerName="extract-utilities"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.512931 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8c116b3-5000-4043-a04f-ee79ff08a37d" containerName="extract-utilities"
Nov 27 16:44:23 crc kubenswrapper[4954]: E1127 16:44:23.512941 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16e08fcc-f990-4f48-9f5f-3b0925cbf5c7" containerName="controller-manager"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.512948 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="16e08fcc-f990-4f48-9f5f-3b0925cbf5c7" containerName="controller-manager"
Nov 27 16:44:23 crc kubenswrapper[4954]: E1127 16:44:23.512957 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ff1ec67-d5a8-4612-874b-4324db52c148" containerName="extract-utilities"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.512964 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ff1ec67-d5a8-4612-874b-4324db52c148" containerName="extract-utilities"
Nov 27 16:44:23 crc kubenswrapper[4954]: E1127 16:44:23.512974 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a08ef380-6670-415c-9861-71c9161f1a4c" containerName="marketplace-operator"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.512981 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="a08ef380-6670-415c-9861-71c9161f1a4c" containerName="marketplace-operator"
Nov 27 16:44:23 crc kubenswrapper[4954]: E1127 16:44:23.512987 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51999cf2-62d7-4ee2-ae9f-b1ac606facb5" containerName="registry-server"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.512992 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="51999cf2-62d7-4ee2-ae9f-b1ac606facb5" containerName="registry-server"
Nov 27 16:44:23 crc kubenswrapper[4954]: E1127 16:44:23.512998 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ff1ec67-d5a8-4612-874b-4324db52c148" containerName="registry-server"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.513003 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ff1ec67-d5a8-4612-874b-4324db52c148" containerName="registry-server"
Nov 27 16:44:23 crc kubenswrapper[4954]: E1127 16:44:23.513012 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6faccf57-f78d-41c3-b176-691802bdc64b" containerName="route-controller-manager"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.513017 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="6faccf57-f78d-41c3-b176-691802bdc64b" containerName="route-controller-manager"
Nov 27 16:44:23 crc kubenswrapper[4954]: E1127 16:44:23.513025 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ff1ec67-d5a8-4612-874b-4324db52c148" containerName="extract-content"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.513031 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ff1ec67-d5a8-4612-874b-4324db52c148" containerName="extract-content"
Nov 27 16:44:23 crc kubenswrapper[4954]: E1127 16:44:23.513039 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09166f72-95b5-44d5-b265-705e11740e0c" containerName="extract-content"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.513044 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="09166f72-95b5-44d5-b265-705e11740e0c" containerName="extract-content"
Nov 27 16:44:23 crc kubenswrapper[4954]: E1127 16:44:23.513050 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8c116b3-5000-4043-a04f-ee79ff08a37d" containerName="extract-content"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.513056 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8c116b3-5000-4043-a04f-ee79ff08a37d" containerName="extract-content"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.513148 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="a08ef380-6670-415c-9861-71c9161f1a4c" containerName="marketplace-operator"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.513159 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="a08ef380-6670-415c-9861-71c9161f1a4c" containerName="marketplace-operator"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.513167 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8c116b3-5000-4043-a04f-ee79ff08a37d" containerName="registry-server"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.513175 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="6faccf57-f78d-41c3-b176-691802bdc64b" containerName="route-controller-manager"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.513183 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="16e08fcc-f990-4f48-9f5f-3b0925cbf5c7" containerName="controller-manager"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.513193 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ff1ec67-d5a8-4612-874b-4324db52c148" containerName="registry-server"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.513202 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="51999cf2-62d7-4ee2-ae9f-b1ac606facb5" containerName="registry-server"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.513210 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="09166f72-95b5-44d5-b265-705e11740e0c" containerName="registry-server"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.513532 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-78cc549f78-w4wg2"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.516411 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.516785 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.517055 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.517748 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.518050 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.518197 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.520261 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d64d4b94b-hhpqq"]
Nov 27 16:44:23 crc kubenswrapper[4954]: E1127 16:44:23.520453 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a08ef380-6670-415c-9861-71c9161f1a4c" containerName="marketplace-operator"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.520470 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="a08ef380-6670-415c-9861-71c9161f1a4c" containerName="marketplace-operator"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.520891 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d64d4b94b-hhpqq"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.522178 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.522703 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.523185 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.523621 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.524421 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.529687 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.529947 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.532533 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-78cc549f78-w4wg2"]
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.538072 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d64d4b94b-hhpqq"]
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.580062 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-r5pxl"]
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.581426 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r5pxl"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.585633 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.586071 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r5pxl"]
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.589063 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d7d273ec-7d1a-4e3f-a408-dfca1adbad1f-client-ca\") pod \"controller-manager-78cc549f78-w4wg2\" (UID: \"d7d273ec-7d1a-4e3f-a408-dfca1adbad1f\") " pod="openshift-controller-manager/controller-manager-78cc549f78-w4wg2"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.589106 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90cb5e74-c169-4b28-ba66-795161ff681e-serving-cert\") pod \"route-controller-manager-5d64d4b94b-hhpqq\" (UID: \"90cb5e74-c169-4b28-ba66-795161ff681e\") " pod="openshift-route-controller-manager/route-controller-manager-5d64d4b94b-hhpqq"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.589136 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7d273ec-7d1a-4e3f-a408-dfca1adbad1f-config\") pod \"controller-manager-78cc549f78-w4wg2\" (UID: \"d7d273ec-7d1a-4e3f-a408-dfca1adbad1f\") " pod="openshift-controller-manager/controller-manager-78cc549f78-w4wg2"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.589159 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90cb5e74-c169-4b28-ba66-795161ff681e-config\") pod \"route-controller-manager-5d64d4b94b-hhpqq\" (UID: \"90cb5e74-c169-4b28-ba66-795161ff681e\") " pod="openshift-route-controller-manager/route-controller-manager-5d64d4b94b-hhpqq"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.589192 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7d273ec-7d1a-4e3f-a408-dfca1adbad1f-serving-cert\") pod \"controller-manager-78cc549f78-w4wg2\" (UID: \"d7d273ec-7d1a-4e3f-a408-dfca1adbad1f\") " pod="openshift-controller-manager/controller-manager-78cc549f78-w4wg2"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.589227 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d7d273ec-7d1a-4e3f-a408-dfca1adbad1f-proxy-ca-bundles\") pod \"controller-manager-78cc549f78-w4wg2\" (UID: \"d7d273ec-7d1a-4e3f-a408-dfca1adbad1f\") " pod="openshift-controller-manager/controller-manager-78cc549f78-w4wg2"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.589260 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdkrg\" (UniqueName: \"kubernetes.io/projected/90cb5e74-c169-4b28-ba66-795161ff681e-kube-api-access-sdkrg\") pod \"route-controller-manager-5d64d4b94b-hhpqq\" (UID: \"90cb5e74-c169-4b28-ba66-795161ff681e\") " pod="openshift-route-controller-manager/route-controller-manager-5d64d4b94b-hhpqq"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.589355 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9cwf\" (UniqueName: \"kubernetes.io/projected/d7d273ec-7d1a-4e3f-a408-dfca1adbad1f-kube-api-access-v9cwf\") pod \"controller-manager-78cc549f78-w4wg2\" (UID: \"d7d273ec-7d1a-4e3f-a408-dfca1adbad1f\") " pod="openshift-controller-manager/controller-manager-78cc549f78-w4wg2"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.589432 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/90cb5e74-c169-4b28-ba66-795161ff681e-client-ca\") pod \"route-controller-manager-5d64d4b94b-hhpqq\" (UID: \"90cb5e74-c169-4b28-ba66-795161ff681e\") " pod="openshift-route-controller-manager/route-controller-manager-5d64d4b94b-hhpqq"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.688097 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.688563 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.690857 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d7d273ec-7d1a-4e3f-a408-dfca1adbad1f-client-ca\") pod \"controller-manager-78cc549f78-w4wg2\" (UID: \"d7d273ec-7d1a-4e3f-a408-dfca1adbad1f\") " pod="openshift-controller-manager/controller-manager-78cc549f78-w4wg2"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.691066 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90cb5e74-c169-4b28-ba66-795161ff681e-serving-cert\") pod \"route-controller-manager-5d64d4b94b-hhpqq\" (UID: \"90cb5e74-c169-4b28-ba66-795161ff681e\") " pod="openshift-route-controller-manager/route-controller-manager-5d64d4b94b-hhpqq"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.691235 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90cb5e74-c169-4b28-ba66-795161ff681e-config\") pod \"route-controller-manager-5d64d4b94b-hhpqq\" (UID: \"90cb5e74-c169-4b28-ba66-795161ff681e\") " pod="openshift-route-controller-manager/route-controller-manager-5d64d4b94b-hhpqq"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.691391 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7d273ec-7d1a-4e3f-a408-dfca1adbad1f-config\") pod \"controller-manager-78cc549f78-w4wg2\" (UID: \"d7d273ec-7d1a-4e3f-a408-dfca1adbad1f\") " pod="openshift-controller-manager/controller-manager-78cc549f78-w4wg2"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.691557 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84522a03-6ce9-4c9d-b5ee-786ec39f6555-utilities\") pod \"redhat-marketplace-r5pxl\" (UID: \"84522a03-6ce9-4c9d-b5ee-786ec39f6555\") " pod="openshift-marketplace/redhat-marketplace-r5pxl"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.691777 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7d273ec-7d1a-4e3f-a408-dfca1adbad1f-serving-cert\") pod \"controller-manager-78cc549f78-w4wg2\" (UID: \"d7d273ec-7d1a-4e3f-a408-dfca1adbad1f\") " pod="openshift-controller-manager/controller-manager-78cc549f78-w4wg2"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.691997 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84522a03-6ce9-4c9d-b5ee-786ec39f6555-catalog-content\") pod \"redhat-marketplace-r5pxl\" (UID: \"84522a03-6ce9-4c9d-b5ee-786ec39f6555\") " pod="openshift-marketplace/redhat-marketplace-r5pxl"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.692245 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d7d273ec-7d1a-4e3f-a408-dfca1adbad1f-proxy-ca-bundles\") pod \"controller-manager-78cc549f78-w4wg2\" (UID: \"d7d273ec-7d1a-4e3f-a408-dfca1adbad1f\") " pod="openshift-controller-manager/controller-manager-78cc549f78-w4wg2"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.692454 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdkrg\" (UniqueName: \"kubernetes.io/projected/90cb5e74-c169-4b28-ba66-795161ff681e-kube-api-access-sdkrg\") pod \"route-controller-manager-5d64d4b94b-hhpqq\" (UID: \"90cb5e74-c169-4b28-ba66-795161ff681e\") " pod="openshift-route-controller-manager/route-controller-manager-5d64d4b94b-hhpqq"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.692706 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9cwf\" (UniqueName: \"kubernetes.io/projected/d7d273ec-7d1a-4e3f-a408-dfca1adbad1f-kube-api-access-v9cwf\") pod \"controller-manager-78cc549f78-w4wg2\" (UID: \"d7d273ec-7d1a-4e3f-a408-dfca1adbad1f\") " pod="openshift-controller-manager/controller-manager-78cc549f78-w4wg2"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.692933 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/90cb5e74-c169-4b28-ba66-795161ff681e-client-ca\") pod \"route-controller-manager-5d64d4b94b-hhpqq\" (UID: \"90cb5e74-c169-4b28-ba66-795161ff681e\") " pod="openshift-route-controller-manager/route-controller-manager-5d64d4b94b-hhpqq"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.693132 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj8jj\" (UniqueName: \"kubernetes.io/projected/84522a03-6ce9-4c9d-b5ee-786ec39f6555-kube-api-access-zj8jj\") pod \"redhat-marketplace-r5pxl\" (UID: \"84522a03-6ce9-4c9d-b5ee-786ec39f6555\") " pod="openshift-marketplace/redhat-marketplace-r5pxl"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.693028 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90cb5e74-c169-4b28-ba66-795161ff681e-config\") pod \"route-controller-manager-5d64d4b94b-hhpqq\" (UID: \"90cb5e74-c169-4b28-ba66-795161ff681e\") " pod="openshift-route-controller-manager/route-controller-manager-5d64d4b94b-hhpqq"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.693503 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7d273ec-7d1a-4e3f-a408-dfca1adbad1f-config\") pod \"controller-manager-78cc549f78-w4wg2\" (UID: \"d7d273ec-7d1a-4e3f-a408-dfca1adbad1f\") " pod="openshift-controller-manager/controller-manager-78cc549f78-w4wg2"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.693841 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d7d273ec-7d1a-4e3f-a408-dfca1adbad1f-proxy-ca-bundles\") pod \"controller-manager-78cc549f78-w4wg2\" (UID: \"d7d273ec-7d1a-4e3f-a408-dfca1adbad1f\") " pod="openshift-controller-manager/controller-manager-78cc549f78-w4wg2"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.696673 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d7d273ec-7d1a-4e3f-a408-dfca1adbad1f-client-ca\") pod \"controller-manager-78cc549f78-w4wg2\" (UID: \"d7d273ec-7d1a-4e3f-a408-dfca1adbad1f\") " pod="openshift-controller-manager/controller-manager-78cc549f78-w4wg2"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.697227 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/90cb5e74-c169-4b28-ba66-795161ff681e-client-ca\") pod \"route-controller-manager-5d64d4b94b-hhpqq\" (UID: \"90cb5e74-c169-4b28-ba66-795161ff681e\") " pod="openshift-route-controller-manager/route-controller-manager-5d64d4b94b-hhpqq"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.705903 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90cb5e74-c169-4b28-ba66-795161ff681e-serving-cert\") pod \"route-controller-manager-5d64d4b94b-hhpqq\" (UID: \"90cb5e74-c169-4b28-ba66-795161ff681e\") " pod="openshift-route-controller-manager/route-controller-manager-5d64d4b94b-hhpqq"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.705980 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7d273ec-7d1a-4e3f-a408-dfca1adbad1f-serving-cert\") pod \"controller-manager-78cc549f78-w4wg2\" (UID: \"d7d273ec-7d1a-4e3f-a408-dfca1adbad1f\") " pod="openshift-controller-manager/controller-manager-78cc549f78-w4wg2"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.708130 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9cwf\" (UniqueName: \"kubernetes.io/projected/d7d273ec-7d1a-4e3f-a408-dfca1adbad1f-kube-api-access-v9cwf\") pod \"controller-manager-78cc549f78-w4wg2\" (UID: \"d7d273ec-7d1a-4e3f-a408-dfca1adbad1f\") " pod="openshift-controller-manager/controller-manager-78cc549f78-w4wg2"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.711640 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdkrg\" (UniqueName: \"kubernetes.io/projected/90cb5e74-c169-4b28-ba66-795161ff681e-kube-api-access-sdkrg\") pod \"route-controller-manager-5d64d4b94b-hhpqq\" (UID: \"90cb5e74-c169-4b28-ba66-795161ff681e\") " pod="openshift-route-controller-manager/route-controller-manager-5d64d4b94b-hhpqq"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.766747 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2t8bp"]
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.771175 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2t8bp"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.774605 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2t8bp"]
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.776757 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.794603 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b582b55-6fc1-4a38-a30e-b192d35acdcc-utilities\") pod \"redhat-operators-2t8bp\" (UID: \"7b582b55-6fc1-4a38-a30e-b192d35acdcc\") " pod="openshift-marketplace/redhat-operators-2t8bp"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.800809 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77xj7\" (UniqueName: \"kubernetes.io/projected/7b582b55-6fc1-4a38-a30e-b192d35acdcc-kube-api-access-77xj7\") pod \"redhat-operators-2t8bp\" (UID: \"7b582b55-6fc1-4a38-a30e-b192d35acdcc\") " pod="openshift-marketplace/redhat-operators-2t8bp"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.800908 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84522a03-6ce9-4c9d-b5ee-786ec39f6555-utilities\") pod \"redhat-marketplace-r5pxl\" (UID: \"84522a03-6ce9-4c9d-b5ee-786ec39f6555\") " pod="openshift-marketplace/redhat-marketplace-r5pxl"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.800950 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b582b55-6fc1-4a38-a30e-b192d35acdcc-catalog-content\") pod \"redhat-operators-2t8bp\" (UID: \"7b582b55-6fc1-4a38-a30e-b192d35acdcc\") " pod="openshift-marketplace/redhat-operators-2t8bp"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.801108 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84522a03-6ce9-4c9d-b5ee-786ec39f6555-catalog-content\") pod \"redhat-marketplace-r5pxl\" (UID: \"84522a03-6ce9-4c9d-b5ee-786ec39f6555\") " pod="openshift-marketplace/redhat-marketplace-r5pxl"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.801504 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zj8jj\" (UniqueName: \"kubernetes.io/projected/84522a03-6ce9-4c9d-b5ee-786ec39f6555-kube-api-access-zj8jj\") pod \"redhat-marketplace-r5pxl\" (UID: \"84522a03-6ce9-4c9d-b5ee-786ec39f6555\") " pod="openshift-marketplace/redhat-marketplace-r5pxl"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.801710 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84522a03-6ce9-4c9d-b5ee-786ec39f6555-utilities\") pod \"redhat-marketplace-r5pxl\" (UID: \"84522a03-6ce9-4c9d-b5ee-786ec39f6555\") " pod="openshift-marketplace/redhat-marketplace-r5pxl"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.802890 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84522a03-6ce9-4c9d-b5ee-786ec39f6555-catalog-content\") pod \"redhat-marketplace-r5pxl\" (UID: \"84522a03-6ce9-4c9d-b5ee-786ec39f6555\") " pod="openshift-marketplace/redhat-marketplace-r5pxl"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.823048 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj8jj\" (UniqueName: \"kubernetes.io/projected/84522a03-6ce9-4c9d-b5ee-786ec39f6555-kube-api-access-zj8jj\") pod \"redhat-marketplace-r5pxl\" (UID: \"84522a03-6ce9-4c9d-b5ee-786ec39f6555\") " pod="openshift-marketplace/redhat-marketplace-r5pxl"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.852219 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-78cc549f78-w4wg2"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.873606 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d64d4b94b-hhpqq"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.902233 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r5pxl"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.902983 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b582b55-6fc1-4a38-a30e-b192d35acdcc-utilities\") pod \"redhat-operators-2t8bp\" (UID: \"7b582b55-6fc1-4a38-a30e-b192d35acdcc\") " pod="openshift-marketplace/redhat-operators-2t8bp"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.903075 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b582b55-6fc1-4a38-a30e-b192d35acdcc-catalog-content\") pod \"redhat-operators-2t8bp\" (UID: \"7b582b55-6fc1-4a38-a30e-b192d35acdcc\") " pod="openshift-marketplace/redhat-operators-2t8bp"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.903118 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77xj7\" (UniqueName: \"kubernetes.io/projected/7b582b55-6fc1-4a38-a30e-b192d35acdcc-kube-api-access-77xj7\") pod \"redhat-operators-2t8bp\" (UID: \"7b582b55-6fc1-4a38-a30e-b192d35acdcc\") " pod="openshift-marketplace/redhat-operators-2t8bp"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.905200 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b582b55-6fc1-4a38-a30e-b192d35acdcc-utilities\") pod \"redhat-operators-2t8bp\" (UID: \"7b582b55-6fc1-4a38-a30e-b192d35acdcc\") " pod="openshift-marketplace/redhat-operators-2t8bp"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.905341 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b582b55-6fc1-4a38-a30e-b192d35acdcc-catalog-content\") pod \"redhat-operators-2t8bp\" (UID: \"7b582b55-6fc1-4a38-a30e-b192d35acdcc\") " pod="openshift-marketplace/redhat-operators-2t8bp"
Nov 27 16:44:23 crc kubenswrapper[4954]: I1127 16:44:23.923444 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77xj7\" (UniqueName: \"kubernetes.io/projected/7b582b55-6fc1-4a38-a30e-b192d35acdcc-kube-api-access-77xj7\") pod \"redhat-operators-2t8bp\" (UID: \"7b582b55-6fc1-4a38-a30e-b192d35acdcc\") " pod="openshift-marketplace/redhat-operators-2t8bp"
Nov 27 16:44:24 crc kubenswrapper[4954]: I1127 16:44:24.096014 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2t8bp"
Nov 27 16:44:24 crc kubenswrapper[4954]: I1127 16:44:24.204097 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r5pxl"]
Nov 27 16:44:24 crc kubenswrapper[4954]: W1127 16:44:24.217158 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84522a03_6ce9_4c9d_b5ee_786ec39f6555.slice/crio-98fc3598937f4821866a8b9c241c85b2040e882006c1ee53f1a715f863c99d51 WatchSource:0}: Error finding container 98fc3598937f4821866a8b9c241c85b2040e882006c1ee53f1a715f863c99d51: Status 404 returned error can't find the container with id 98fc3598937f4821866a8b9c241c85b2040e882006c1ee53f1a715f863c99d51
Nov 27 16:44:24 crc kubenswrapper[4954]: I1127 16:44:24.304919 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r5pxl" event={"ID":"84522a03-6ce9-4c9d-b5ee-786ec39f6555","Type":"ContainerStarted","Data":"98fc3598937f4821866a8b9c241c85b2040e882006c1ee53f1a715f863c99d51"}
Nov 27 16:44:24 crc kubenswrapper[4954]: I1127 16:44:24.319634 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2t8bp"]
Nov 27 16:44:24 crc kubenswrapper[4954]: W1127 16:44:24.332483 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b582b55_6fc1_4a38_a30e_b192d35acdcc.slice/crio-5cc6f2c01f12ddef6c49b64eaea8db15c20f9c1120a026ce3c338bbdc080a913 WatchSource:0}: Error finding container 5cc6f2c01f12ddef6c49b64eaea8db15c20f9c1120a026ce3c338bbdc080a913: Status 404 returned error can't find the container with id 5cc6f2c01f12ddef6c49b64eaea8db15c20f9c1120a026ce3c338bbdc080a913
Nov 27 16:44:24 crc kubenswrapper[4954]: I1127 16:44:24.356414 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-78cc549f78-w4wg2"]
Nov 27 16:44:24 crc kubenswrapper[4954]: W1127 16:44:24.357893 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7d273ec_7d1a_4e3f_a408_dfca1adbad1f.slice/crio-c701d207ee50fe458dccf6539c844cf89b18921ab85b0d9e459e3897ba4f9e12 WatchSource:0}: Error finding container c701d207ee50fe458dccf6539c844cf89b18921ab85b0d9e459e3897ba4f9e12: Status 404 returned error can't find the container with id c701d207ee50fe458dccf6539c844cf89b18921ab85b0d9e459e3897ba4f9e12
Nov 27 16:44:24 crc kubenswrapper[4954]: I1127 16:44:24.376941 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d64d4b94b-hhpqq"]
Nov 27 16:44:24 crc kubenswrapper[4954]: I1127 16:44:24.670493 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6faccf57-f78d-41c3-b176-691802bdc64b" path="/var/lib/kubelet/pods/6faccf57-f78d-41c3-b176-691802bdc64b/volumes"
Nov 27 16:44:24 crc kubenswrapper[4954]: I1127 16:44:24.671571 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8c116b3-5000-4043-a04f-ee79ff08a37d" path="/var/lib/kubelet/pods/c8c116b3-5000-4043-a04f-ee79ff08a37d/volumes"
Nov 27 16:44:25 crc kubenswrapper[4954]: I1127 16:44:25.321011 4954 generic.go:334] "Generic (PLEG): container finished" podID="84522a03-6ce9-4c9d-b5ee-786ec39f6555" containerID="6bdd066574362f2f59645c0a04e28d097808b4cec2822683b5b34646cd32c286" exitCode=0
Nov 27 16:44:25 crc kubenswrapper[4954]: I1127 16:44:25.321075 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r5pxl" event={"ID":"84522a03-6ce9-4c9d-b5ee-786ec39f6555","Type":"ContainerDied","Data":"6bdd066574362f2f59645c0a04e28d097808b4cec2822683b5b34646cd32c286"}
Nov 27 16:44:25 crc kubenswrapper[4954]: I1127 16:44:25.325795 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78cc549f78-w4wg2" event={"ID":"d7d273ec-7d1a-4e3f-a408-dfca1adbad1f","Type":"ContainerStarted","Data":"ef99def182b743cc225e843a0735a3c5d3c9e658c0633aa9c10c572dbbc97ad6"}
Nov 27 16:44:25 crc kubenswrapper[4954]: I1127 16:44:25.325860 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78cc549f78-w4wg2" event={"ID":"d7d273ec-7d1a-4e3f-a408-dfca1adbad1f","Type":"ContainerStarted","Data":"c701d207ee50fe458dccf6539c844cf89b18921ab85b0d9e459e3897ba4f9e12"}
Nov 27 16:44:25 crc kubenswrapper[4954]: I1127 16:44:25.325988 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-78cc549f78-w4wg2"
Nov 27 16:44:25 crc kubenswrapper[4954]: I1127 16:44:25.330565 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-78cc549f78-w4wg2"
Nov 27 16:44:25 crc kubenswrapper[4954]: I1127 16:44:25.330789 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d64d4b94b-hhpqq" event={"ID":"90cb5e74-c169-4b28-ba66-795161ff681e","Type":"ContainerStarted","Data":"9c0424ab18871016b14508ffdfc5cff575e7acf7f369e10982b16df85fcba065"}
Nov 27 16:44:25 crc kubenswrapper[4954]: I1127 16:44:25.330868 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d64d4b94b-hhpqq" event={"ID":"90cb5e74-c169-4b28-ba66-795161ff681e","Type":"ContainerStarted","Data":"369d06ee9c5bd44adf5d7f7c1263c1239c6b2099cef2fc126fb74e50a95c2863"}
Nov 27 16:44:25 crc kubenswrapper[4954]: I1127 16:44:25.331008 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5d64d4b94b-hhpqq"
Nov 27 16:44:25 crc kubenswrapper[4954]: I1127 16:44:25.332457 4954 generic.go:334] "Generic (PLEG): container finished" podID="7b582b55-6fc1-4a38-a30e-b192d35acdcc" containerID="48f30ac4b2f49ee86d994a36717e21cb425d48eda923bcafca1ba3b4d173a214" exitCode=0
Nov 27 16:44:25 crc kubenswrapper[4954]: I1127 16:44:25.332591 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2t8bp" event={"ID":"7b582b55-6fc1-4a38-a30e-b192d35acdcc","Type":"ContainerDied","Data":"48f30ac4b2f49ee86d994a36717e21cb425d48eda923bcafca1ba3b4d173a214"}
Nov 27 16:44:25 crc kubenswrapper[4954]: I1127 16:44:25.332624 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2t8bp" event={"ID":"7b582b55-6fc1-4a38-a30e-b192d35acdcc","Type":"ContainerStarted","Data":"5cc6f2c01f12ddef6c49b64eaea8db15c20f9c1120a026ce3c338bbdc080a913"}
Nov 27 16:44:25 crc kubenswrapper[4954]: I1127 16:44:25.338120 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5d64d4b94b-hhpqq"
Nov 27 16:44:25 crc kubenswrapper[4954]: I1127 16:44:25.362699 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-78cc549f78-w4wg2" podStartSLOduration=4.362677863 podStartE2EDuration="4.362677863s" podCreationTimestamp="2025-11-27 16:44:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:44:25.360624932 +0000 UTC m=+377.378065232" watchObservedRunningTime="2025-11-27 16:44:25.362677863 +0000 UTC m=+377.380118163"
Nov 27 16:44:25 crc kubenswrapper[4954]: I1127 16:44:25.409035 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5d64d4b94b-hhpqq" podStartSLOduration=4.409011304 podStartE2EDuration="4.409011304s" podCreationTimestamp="2025-11-27 16:44:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:44:25.403100645 +0000 UTC m=+377.420540945" watchObservedRunningTime="2025-11-27 16:44:25.409011304 +0000 UTC m=+377.426451614"
Nov 27 16:44:25 crc kubenswrapper[4954]: I1127 16:44:25.966141 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xj2hb"]
Nov 27 16:44:25 crc kubenswrapper[4954]: I1127 16:44:25.972178 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xj2hb"
Nov 27 16:44:25 crc kubenswrapper[4954]: I1127 16:44:25.975552 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Nov 27 16:44:25 crc kubenswrapper[4954]: I1127 16:44:25.991817 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xj2hb"]
Nov 27 16:44:26 crc kubenswrapper[4954]: I1127 16:44:26.036985 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56ec19b6-189a-4163-ae87-1c95809ad7d3-catalog-content\") pod \"community-operators-xj2hb\" (UID: \"56ec19b6-189a-4163-ae87-1c95809ad7d3\") " pod="openshift-marketplace/community-operators-xj2hb"
Nov 27 16:44:26 crc kubenswrapper[4954]: I1127 16:44:26.037026 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56ec19b6-189a-4163-ae87-1c95809ad7d3-utilities\") pod \"community-operators-xj2hb\" (UID: \"56ec19b6-189a-4163-ae87-1c95809ad7d3\") " pod="openshift-marketplace/community-operators-xj2hb"
Nov 27 16:44:26 crc kubenswrapper[4954]: I1127 16:44:26.037062 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbjtr\" (UniqueName: \"kubernetes.io/projected/56ec19b6-189a-4163-ae87-1c95809ad7d3-kube-api-access-cbjtr\") pod \"community-operators-xj2hb\" (UID: \"56ec19b6-189a-4163-ae87-1c95809ad7d3\") " pod="openshift-marketplace/community-operators-xj2hb"
Nov 27 16:44:26 crc kubenswrapper[4954]: I1127 16:44:26.138115 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56ec19b6-189a-4163-ae87-1c95809ad7d3-catalog-content\") pod \"community-operators-xj2hb\" (UID: \"56ec19b6-189a-4163-ae87-1c95809ad7d3\") " pod="openshift-marketplace/community-operators-xj2hb"
Nov 27 16:44:26 crc kubenswrapper[4954]: I1127 16:44:26.138522 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56ec19b6-189a-4163-ae87-1c95809ad7d3-utilities\") pod \"community-operators-xj2hb\" (UID: \"56ec19b6-189a-4163-ae87-1c95809ad7d3\") " pod="openshift-marketplace/community-operators-xj2hb"
Nov 27 16:44:26 crc kubenswrapper[4954]: I1127 16:44:26.138555 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbjtr\" (UniqueName: \"kubernetes.io/projected/56ec19b6-189a-4163-ae87-1c95809ad7d3-kube-api-access-cbjtr\") pod \"community-operators-xj2hb\" (UID: \"56ec19b6-189a-4163-ae87-1c95809ad7d3\") " pod="openshift-marketplace/community-operators-xj2hb"
Nov 27 16:44:26 crc kubenswrapper[4954]: I1127 16:44:26.138805 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56ec19b6-189a-4163-ae87-1c95809ad7d3-catalog-content\") pod \"community-operators-xj2hb\" (UID: \"56ec19b6-189a-4163-ae87-1c95809ad7d3\") " pod="openshift-marketplace/community-operators-xj2hb"
Nov 27 16:44:26 crc kubenswrapper[4954]: I1127 16:44:26.139001 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56ec19b6-189a-4163-ae87-1c95809ad7d3-utilities\") pod \"community-operators-xj2hb\" (UID: \"56ec19b6-189a-4163-ae87-1c95809ad7d3\") " pod="openshift-marketplace/community-operators-xj2hb"
Nov 27 16:44:26 crc kubenswrapper[4954]: I1127 16:44:26.162536 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbjtr\" (UniqueName: \"kubernetes.io/projected/56ec19b6-189a-4163-ae87-1c95809ad7d3-kube-api-access-cbjtr\") pod \"community-operators-xj2hb\" (UID: \"56ec19b6-189a-4163-ae87-1c95809ad7d3\") " pod="openshift-marketplace/community-operators-xj2hb"
Nov 27 16:44:26 crc kubenswrapper[4954]: I1127 16:44:26.166926 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9vxhf"]
Nov 27 16:44:26 crc kubenswrapper[4954]: I1127 16:44:26.168190 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9vxhf"
Nov 27 16:44:26 crc kubenswrapper[4954]: I1127 16:44:26.170681 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Nov 27 16:44:26 crc kubenswrapper[4954]: I1127 16:44:26.182284 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9vxhf"]
Nov 27 16:44:26 crc kubenswrapper[4954]: I1127 16:44:26.239878 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdvlz\" (UniqueName: \"kubernetes.io/projected/a06676d3-037c-4529-926c-0624a5e647ee-kube-api-access-wdvlz\") pod \"certified-operators-9vxhf\" (UID: \"a06676d3-037c-4529-926c-0624a5e647ee\") " pod="openshift-marketplace/certified-operators-9vxhf"
Nov 27 16:44:26 crc kubenswrapper[4954]: I1127 16:44:26.239922 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a06676d3-037c-4529-926c-0624a5e647ee-catalog-content\") pod \"certified-operators-9vxhf\" (UID: \"a06676d3-037c-4529-926c-0624a5e647ee\") " pod="openshift-marketplace/certified-operators-9vxhf"
Nov 27 16:44:26 crc kubenswrapper[4954]: I1127 16:44:26.239955 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a06676d3-037c-4529-926c-0624a5e647ee-utilities\") pod \"certified-operators-9vxhf\" (UID: \"a06676d3-037c-4529-926c-0624a5e647ee\") " pod="openshift-marketplace/certified-operators-9vxhf"
Nov 27 16:44:26 crc kubenswrapper[4954]: I1127 16:44:26.314669 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xj2hb"
Nov 27 16:44:26 crc kubenswrapper[4954]: I1127 16:44:26.341277 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2t8bp" event={"ID":"7b582b55-6fc1-4a38-a30e-b192d35acdcc","Type":"ContainerStarted","Data":"1ea98d6b93c681827dcfc95507dcb0f18ec1f734e1e59f1cc45cc97a1df2cba1"}
Nov 27 16:44:26 crc kubenswrapper[4954]: I1127 16:44:26.342334 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdvlz\" (UniqueName: \"kubernetes.io/projected/a06676d3-037c-4529-926c-0624a5e647ee-kube-api-access-wdvlz\") pod \"certified-operators-9vxhf\" (UID: \"a06676d3-037c-4529-926c-0624a5e647ee\") " pod="openshift-marketplace/certified-operators-9vxhf"
Nov 27 16:44:26 crc kubenswrapper[4954]: I1127 16:44:26.342382 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a06676d3-037c-4529-926c-0624a5e647ee-catalog-content\") pod \"certified-operators-9vxhf\" (UID: \"a06676d3-037c-4529-926c-0624a5e647ee\") " pod="openshift-marketplace/certified-operators-9vxhf"
Nov 27 16:44:26 crc kubenswrapper[4954]: I1127 16:44:26.342421 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a06676d3-037c-4529-926c-0624a5e647ee-utilities\") pod \"certified-operators-9vxhf\" (UID: \"a06676d3-037c-4529-926c-0624a5e647ee\") " pod="openshift-marketplace/certified-operators-9vxhf"
Nov 27 16:44:26 crc kubenswrapper[4954]: I1127 16:44:26.344169 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a06676d3-037c-4529-926c-0624a5e647ee-catalog-content\") pod \"certified-operators-9vxhf\" (UID: \"a06676d3-037c-4529-926c-0624a5e647ee\") " pod="openshift-marketplace/certified-operators-9vxhf"
Nov 27 16:44:26 crc kubenswrapper[4954]: I1127 16:44:26.345429 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r5pxl" event={"ID":"84522a03-6ce9-4c9d-b5ee-786ec39f6555","Type":"ContainerStarted","Data":"336ce7da55c0f3b5c27e093171046e83a9638239f00fcd5d569136b1998ef047"}
Nov 27 16:44:26 crc kubenswrapper[4954]: I1127 16:44:26.349266 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a06676d3-037c-4529-926c-0624a5e647ee-utilities\") pod \"certified-operators-9vxhf\" (UID: \"a06676d3-037c-4529-926c-0624a5e647ee\") " pod="openshift-marketplace/certified-operators-9vxhf"
Nov 27 16:44:26 crc kubenswrapper[4954]: I1127 16:44:26.375954 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdvlz\" (UniqueName: \"kubernetes.io/projected/a06676d3-037c-4529-926c-0624a5e647ee-kube-api-access-wdvlz\") pod \"certified-operators-9vxhf\" (UID: \"a06676d3-037c-4529-926c-0624a5e647ee\") " pod="openshift-marketplace/certified-operators-9vxhf"
Nov 27 16:44:26 crc kubenswrapper[4954]: I1127 16:44:26.516125 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9vxhf"
Nov 27 16:44:26 crc kubenswrapper[4954]: I1127 16:44:26.581500 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xj2hb"]
Nov 27 16:44:26 crc kubenswrapper[4954]: W1127 16:44:26.596245 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56ec19b6_189a_4163_ae87_1c95809ad7d3.slice/crio-67ef586721f3fcbe720b4f69bfb490c1e575563df749263db37a92b49fdbb956 WatchSource:0}: Error finding container 67ef586721f3fcbe720b4f69bfb490c1e575563df749263db37a92b49fdbb956: Status 404 returned error can't find the container with id 67ef586721f3fcbe720b4f69bfb490c1e575563df749263db37a92b49fdbb956
Nov 27 16:44:26 crc kubenswrapper[4954]: I1127 16:44:26.761250 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9vxhf"]
Nov 27 16:44:26 crc kubenswrapper[4954]: W1127 16:44:26.763205 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda06676d3_037c_4529_926c_0624a5e647ee.slice/crio-3eef2279536f334cf04b6dc06e9005731655afa96d9989e4e87ea61527a160bd WatchSource:0}: Error finding container 3eef2279536f334cf04b6dc06e9005731655afa96d9989e4e87ea61527a160bd: Status 404 returned error can't find the container with id 3eef2279536f334cf04b6dc06e9005731655afa96d9989e4e87ea61527a160bd
Nov 27 16:44:27 crc kubenswrapper[4954]: I1127 16:44:27.354557 4954 generic.go:334] "Generic (PLEG): container finished" podID="84522a03-6ce9-4c9d-b5ee-786ec39f6555" containerID="336ce7da55c0f3b5c27e093171046e83a9638239f00fcd5d569136b1998ef047" exitCode=0
Nov 27 16:44:27 crc kubenswrapper[4954]: I1127 16:44:27.354700 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r5pxl" event={"ID":"84522a03-6ce9-4c9d-b5ee-786ec39f6555","Type":"ContainerDied","Data":"336ce7da55c0f3b5c27e093171046e83a9638239f00fcd5d569136b1998ef047"}
Nov 27 16:44:27 crc kubenswrapper[4954]: I1127 16:44:27.367009 4954 generic.go:334] "Generic (PLEG): container finished" podID="a06676d3-037c-4529-926c-0624a5e647ee" containerID="50beb75d332574a3ec820721f77f4e2eedb1a9b55e00aed3f4dd12593b75a129" exitCode=0
Nov 27 16:44:27 crc kubenswrapper[4954]: I1127 16:44:27.367171 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9vxhf" event={"ID":"a06676d3-037c-4529-926c-0624a5e647ee","Type":"ContainerDied","Data":"50beb75d332574a3ec820721f77f4e2eedb1a9b55e00aed3f4dd12593b75a129"}
Nov 27 16:44:27 crc kubenswrapper[4954]: I1127 16:44:27.367204 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9vxhf" event={"ID":"a06676d3-037c-4529-926c-0624a5e647ee","Type":"ContainerStarted","Data":"3eef2279536f334cf04b6dc06e9005731655afa96d9989e4e87ea61527a160bd"}
Nov 27 16:44:27 crc kubenswrapper[4954]: I1127 16:44:27.368927 4954 generic.go:334] "Generic (PLEG): container finished" podID="56ec19b6-189a-4163-ae87-1c95809ad7d3" containerID="7a28ae4bb973b00028fc575332194baab40674c23155de8b86cacec236ff5fc1" exitCode=0
Nov 27 16:44:27 crc kubenswrapper[4954]: I1127 16:44:27.369021 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xj2hb" event={"ID":"56ec19b6-189a-4163-ae87-1c95809ad7d3","Type":"ContainerDied","Data":"7a28ae4bb973b00028fc575332194baab40674c23155de8b86cacec236ff5fc1"}
Nov 27 16:44:27 crc kubenswrapper[4954]: I1127 16:44:27.369061 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xj2hb" event={"ID":"56ec19b6-189a-4163-ae87-1c95809ad7d3","Type":"ContainerStarted","Data":"67ef586721f3fcbe720b4f69bfb490c1e575563df749263db37a92b49fdbb956"}
Nov 27 16:44:27 crc kubenswrapper[4954]: I1127 16:44:27.381508 4954 generic.go:334] "Generic (PLEG): container finished" podID="7b582b55-6fc1-4a38-a30e-b192d35acdcc" containerID="1ea98d6b93c681827dcfc95507dcb0f18ec1f734e1e59f1cc45cc97a1df2cba1" exitCode=0
Nov 27 16:44:27 crc kubenswrapper[4954]: I1127 16:44:27.381828 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2t8bp" event={"ID":"7b582b55-6fc1-4a38-a30e-b192d35acdcc","Type":"ContainerDied","Data":"1ea98d6b93c681827dcfc95507dcb0f18ec1f734e1e59f1cc45cc97a1df2cba1"}
Nov 27 16:44:28 crc kubenswrapper[4954]: I1127 16:44:28.389547 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r5pxl" event={"ID":"84522a03-6ce9-4c9d-b5ee-786ec39f6555","Type":"ContainerStarted","Data":"364d2677579858a1c44d0f113c569e1ce2448384671290e826686331ec25b676"}
Nov 27 16:44:28 crc kubenswrapper[4954]: I1127 16:44:28.391744 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9vxhf" event={"ID":"a06676d3-037c-4529-926c-0624a5e647ee","Type":"ContainerStarted","Data":"4b81dc0230e582f97c5efce2b2fcc7fdab25e8a98d5b393c244eac79921d4ed1"}
Nov 27 16:44:28 crc kubenswrapper[4954]: I1127 16:44:28.394466 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xj2hb" event={"ID":"56ec19b6-189a-4163-ae87-1c95809ad7d3","Type":"ContainerStarted","Data":"ec55bdae7e24215c1811cc1db815c1bab8aa0622c649acd4b28a6dfc908e21c4"}
Nov 27 16:44:28 crc kubenswrapper[4954]: I1127 16:44:28.397421 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2t8bp" event={"ID":"7b582b55-6fc1-4a38-a30e-b192d35acdcc","Type":"ContainerStarted","Data":"7d4ddfd419b15cad0727805577cba70f39fba4e49462fb5ac49732c46bf411b1"}
Nov 27 16:44:28 crc kubenswrapper[4954]: I1127 16:44:28.411351 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-r5pxl" podStartSLOduration=2.933585115 podStartE2EDuration="5.411335456s" podCreationTimestamp="2025-11-27 16:44:23 +0000 UTC" firstStartedPulling="2025-11-27 16:44:25.34019676 +0000 UTC m=+377.357637060" lastFinishedPulling="2025-11-27 16:44:27.817947061 +0000 UTC m=+379.835387401" observedRunningTime="2025-11-27 16:44:28.40868483 +0000 UTC m=+380.426125130" watchObservedRunningTime="2025-11-27 16:44:28.411335456 +0000 UTC m=+380.428775746"
Nov 27 16:44:28 crc kubenswrapper[4954]: I1127 16:44:28.459867 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2t8bp" podStartSLOduration=2.932837968 podStartE2EDuration="5.459848651s" podCreationTimestamp="2025-11-27 16:44:23 +0000 UTC" firstStartedPulling="2025-11-27 16:44:25.334287253 +0000 UTC m=+377.351727573" lastFinishedPulling="2025-11-27 16:44:27.861297946 +0000 UTC m=+379.878738256" observedRunningTime="2025-11-27 16:44:28.455220925 +0000 UTC m=+380.472661225" watchObservedRunningTime="2025-11-27 16:44:28.459848651 +0000 UTC m=+380.477288951"
Nov 27 16:44:29 crc kubenswrapper[4954]: I1127 16:44:29.406662 4954 generic.go:334] "Generic (PLEG): container finished" podID="a06676d3-037c-4529-926c-0624a5e647ee" containerID="4b81dc0230e582f97c5efce2b2fcc7fdab25e8a98d5b393c244eac79921d4ed1" exitCode=0
Nov 27 16:44:29 crc kubenswrapper[4954]: I1127 16:44:29.406753 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9vxhf" event={"ID":"a06676d3-037c-4529-926c-0624a5e647ee","Type":"ContainerDied","Data":"4b81dc0230e582f97c5efce2b2fcc7fdab25e8a98d5b393c244eac79921d4ed1"}
Nov 27 16:44:29 crc kubenswrapper[4954]: I1127 16:44:29.410300 4954 generic.go:334] "Generic (PLEG): container finished" podID="56ec19b6-189a-4163-ae87-1c95809ad7d3" containerID="ec55bdae7e24215c1811cc1db815c1bab8aa0622c649acd4b28a6dfc908e21c4" exitCode=0
Nov 27 16:44:29 crc kubenswrapper[4954]: I1127 16:44:29.410482 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xj2hb" event={"ID":"56ec19b6-189a-4163-ae87-1c95809ad7d3","Type":"ContainerDied","Data":"ec55bdae7e24215c1811cc1db815c1bab8aa0622c649acd4b28a6dfc908e21c4"}
Nov 27 16:44:30 crc kubenswrapper[4954]: I1127 16:44:30.417603 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9vxhf" event={"ID":"a06676d3-037c-4529-926c-0624a5e647ee","Type":"ContainerStarted","Data":"8d2ccdb65436241817b996b051390baf0afd93713e19e0402094d465cb4ebf6e"}
Nov 27 16:44:30 crc kubenswrapper[4954]: I1127 16:44:30.419888 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xj2hb" event={"ID":"56ec19b6-189a-4163-ae87-1c95809ad7d3","Type":"ContainerStarted","Data":"c2651cd2cce9aed481210f3565ecf544e62aaea8a53ee8d26ff30f169886bee1"}
Nov 27 16:44:30 crc kubenswrapper[4954]: I1127 16:44:30.444646 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9vxhf" podStartSLOduration=1.974702276 podStartE2EDuration="4.444632s" podCreationTimestamp="2025-11-27 16:44:26 +0000 UTC" firstStartedPulling="2025-11-27 16:44:27.369358741 +0000 UTC m=+379.386799041" lastFinishedPulling="2025-11-27 16:44:29.839288475 +0000 UTC m=+381.856728765" observedRunningTime="2025-11-27 16:44:30.443006819 +0000 UTC m=+382.460447119" watchObservedRunningTime="2025-11-27 16:44:30.444632 +0000 UTC m=+382.462072300"
Nov 27 16:44:30 crc kubenswrapper[4954]: I1127 16:44:30.467330 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xj2hb" podStartSLOduration=2.817676803 podStartE2EDuration="5.467312837s" podCreationTimestamp="2025-11-27 16:44:25 +0000 UTC" firstStartedPulling="2025-11-27 16:44:27.370471808 +0000 UTC m=+379.387912138" lastFinishedPulling="2025-11-27 16:44:30.020107872 +0000 UTC m=+382.037548172" observedRunningTime="2025-11-27 16:44:30.466868146 +0000 UTC m=+382.484308476" watchObservedRunningTime="2025-11-27 16:44:30.467312837 +0000 UTC m=+382.484753137"
Nov 27 16:44:33 crc kubenswrapper[4954]: I1127 16:44:33.902500 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-r5pxl"
Nov 27 16:44:33 crc kubenswrapper[4954]: I1127 16:44:33.903476 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-r5pxl"
Nov 27 16:44:33 crc kubenswrapper[4954]: I1127 16:44:33.986222 4954
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-r5pxl" Nov 27 16:44:34 crc kubenswrapper[4954]: I1127 16:44:34.098337 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2t8bp" Nov 27 16:44:34 crc kubenswrapper[4954]: I1127 16:44:34.098397 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2t8bp" Nov 27 16:44:34 crc kubenswrapper[4954]: I1127 16:44:34.163753 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2t8bp" Nov 27 16:44:34 crc kubenswrapper[4954]: I1127 16:44:34.492466 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-r5pxl" Nov 27 16:44:34 crc kubenswrapper[4954]: I1127 16:44:34.500830 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2t8bp" Nov 27 16:44:36 crc kubenswrapper[4954]: I1127 16:44:36.315464 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xj2hb" Nov 27 16:44:36 crc kubenswrapper[4954]: I1127 16:44:36.315863 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xj2hb" Nov 27 16:44:36 crc kubenswrapper[4954]: I1127 16:44:36.364694 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xj2hb" Nov 27 16:44:36 crc kubenswrapper[4954]: I1127 16:44:36.506793 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xj2hb" Nov 27 16:44:36 crc kubenswrapper[4954]: I1127 16:44:36.516757 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9vxhf" Nov 27 16:44:36 crc kubenswrapper[4954]: I1127 16:44:36.516804 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9vxhf" Nov 27 16:44:36 crc kubenswrapper[4954]: I1127 16:44:36.587270 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9vxhf" Nov 27 16:44:38 crc kubenswrapper[4954]: I1127 16:44:38.467746 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9vxhf" Nov 27 16:44:53 crc kubenswrapper[4954]: I1127 16:44:53.688328 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:44:53 crc kubenswrapper[4954]: I1127 16:44:53.689330 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 16:45:00 crc kubenswrapper[4954]: I1127 16:45:00.204987 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404365-nqxnw"] Nov 27 16:45:00 crc 
kubenswrapper[4954]: I1127 16:45:00.206448 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404365-nqxnw" Nov 27 16:45:00 crc kubenswrapper[4954]: I1127 16:45:00.209515 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 27 16:45:00 crc kubenswrapper[4954]: I1127 16:45:00.211659 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 27 16:45:00 crc kubenswrapper[4954]: I1127 16:45:00.223215 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404365-nqxnw"] Nov 27 16:45:00 crc kubenswrapper[4954]: I1127 16:45:00.386449 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bc84ac35-9748-4149-a879-fd4aa19ab5fd-secret-volume\") pod \"collect-profiles-29404365-nqxnw\" (UID: \"bc84ac35-9748-4149-a879-fd4aa19ab5fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404365-nqxnw" Nov 27 16:45:00 crc kubenswrapper[4954]: I1127 16:45:00.386563 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc84ac35-9748-4149-a879-fd4aa19ab5fd-config-volume\") pod \"collect-profiles-29404365-nqxnw\" (UID: \"bc84ac35-9748-4149-a879-fd4aa19ab5fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404365-nqxnw" Nov 27 16:45:00 crc kubenswrapper[4954]: I1127 16:45:00.386684 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8nm2\" (UniqueName: \"kubernetes.io/projected/bc84ac35-9748-4149-a879-fd4aa19ab5fd-kube-api-access-h8nm2\") pod \"collect-profiles-29404365-nqxnw\" (UID: \"bc84ac35-9748-4149-a879-fd4aa19ab5fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404365-nqxnw" Nov 27 16:45:00 crc kubenswrapper[4954]: I1127 16:45:00.487932 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bc84ac35-9748-4149-a879-fd4aa19ab5fd-secret-volume\") pod \"collect-profiles-29404365-nqxnw\" (UID: \"bc84ac35-9748-4149-a879-fd4aa19ab5fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404365-nqxnw" Nov 27 16:45:00 crc kubenswrapper[4954]: I1127 16:45:00.488000 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc84ac35-9748-4149-a879-fd4aa19ab5fd-config-volume\") pod \"collect-profiles-29404365-nqxnw\" (UID: \"bc84ac35-9748-4149-a879-fd4aa19ab5fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404365-nqxnw" Nov 27 16:45:00 crc kubenswrapper[4954]: I1127 16:45:00.488044 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8nm2\" (UniqueName: \"kubernetes.io/projected/bc84ac35-9748-4149-a879-fd4aa19ab5fd-kube-api-access-h8nm2\") pod \"collect-profiles-29404365-nqxnw\" (UID: \"bc84ac35-9748-4149-a879-fd4aa19ab5fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404365-nqxnw" Nov 27 16:45:00 crc kubenswrapper[4954]: I1127 16:45:00.489953 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" 
(UniqueName: \"kubernetes.io/configmap/bc84ac35-9748-4149-a879-fd4aa19ab5fd-config-volume\") pod \"collect-profiles-29404365-nqxnw\" (UID: \"bc84ac35-9748-4149-a879-fd4aa19ab5fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404365-nqxnw" Nov 27 16:45:00 crc kubenswrapper[4954]: I1127 16:45:00.502507 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bc84ac35-9748-4149-a879-fd4aa19ab5fd-secret-volume\") pod \"collect-profiles-29404365-nqxnw\" (UID: \"bc84ac35-9748-4149-a879-fd4aa19ab5fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404365-nqxnw" Nov 27 16:45:00 crc kubenswrapper[4954]: I1127 16:45:00.520464 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8nm2\" (UniqueName: \"kubernetes.io/projected/bc84ac35-9748-4149-a879-fd4aa19ab5fd-kube-api-access-h8nm2\") pod \"collect-profiles-29404365-nqxnw\" (UID: \"bc84ac35-9748-4149-a879-fd4aa19ab5fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404365-nqxnw" Nov 27 16:45:00 crc kubenswrapper[4954]: I1127 16:45:00.540683 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404365-nqxnw" Nov 27 16:45:01 crc kubenswrapper[4954]: I1127 16:45:01.025628 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404365-nqxnw"] Nov 27 16:45:01 crc kubenswrapper[4954]: I1127 16:45:01.643295 4954 generic.go:334] "Generic (PLEG): container finished" podID="bc84ac35-9748-4149-a879-fd4aa19ab5fd" containerID="60d34756ad0eebc3153c05ba00e39e84d43c1f80d5cbfccd301ba32a6a47890a" exitCode=0 Nov 27 16:45:01 crc kubenswrapper[4954]: I1127 16:45:01.643356 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404365-nqxnw" event={"ID":"bc84ac35-9748-4149-a879-fd4aa19ab5fd","Type":"ContainerDied","Data":"60d34756ad0eebc3153c05ba00e39e84d43c1f80d5cbfccd301ba32a6a47890a"} Nov 27 16:45:01 crc kubenswrapper[4954]: I1127 16:45:01.643386 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404365-nqxnw" event={"ID":"bc84ac35-9748-4149-a879-fd4aa19ab5fd","Type":"ContainerStarted","Data":"e4971bc39b2fc8300ef72a03843235b0258a569f2ab5caebb1b82786500d0df5"} Nov 27 16:45:02 crc kubenswrapper[4954]: I1127 16:45:02.970778 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404365-nqxnw" Nov 27 16:45:03 crc kubenswrapper[4954]: I1127 16:45:03.126728 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bc84ac35-9748-4149-a879-fd4aa19ab5fd-secret-volume\") pod \"bc84ac35-9748-4149-a879-fd4aa19ab5fd\" (UID: \"bc84ac35-9748-4149-a879-fd4aa19ab5fd\") " Nov 27 16:45:03 crc kubenswrapper[4954]: I1127 16:45:03.126820 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8nm2\" (UniqueName: \"kubernetes.io/projected/bc84ac35-9748-4149-a879-fd4aa19ab5fd-kube-api-access-h8nm2\") pod \"bc84ac35-9748-4149-a879-fd4aa19ab5fd\" (UID: \"bc84ac35-9748-4149-a879-fd4aa19ab5fd\") " Nov 27 16:45:03 crc kubenswrapper[4954]: I1127 16:45:03.126866 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc84ac35-9748-4149-a879-fd4aa19ab5fd-config-volume\") pod \"bc84ac35-9748-4149-a879-fd4aa19ab5fd\" (UID: \"bc84ac35-9748-4149-a879-fd4aa19ab5fd\") " Nov 27 16:45:03 crc kubenswrapper[4954]: I1127 16:45:03.127995 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc84ac35-9748-4149-a879-fd4aa19ab5fd-config-volume" (OuterVolumeSpecName: "config-volume") pod "bc84ac35-9748-4149-a879-fd4aa19ab5fd" (UID: "bc84ac35-9748-4149-a879-fd4aa19ab5fd"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:45:03 crc kubenswrapper[4954]: I1127 16:45:03.135443 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc84ac35-9748-4149-a879-fd4aa19ab5fd-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bc84ac35-9748-4149-a879-fd4aa19ab5fd" (UID: "bc84ac35-9748-4149-a879-fd4aa19ab5fd"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:45:03 crc kubenswrapper[4954]: I1127 16:45:03.135870 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc84ac35-9748-4149-a879-fd4aa19ab5fd-kube-api-access-h8nm2" (OuterVolumeSpecName: "kube-api-access-h8nm2") pod "bc84ac35-9748-4149-a879-fd4aa19ab5fd" (UID: "bc84ac35-9748-4149-a879-fd4aa19ab5fd"). InnerVolumeSpecName "kube-api-access-h8nm2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:45:03 crc kubenswrapper[4954]: I1127 16:45:03.229936 4954 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bc84ac35-9748-4149-a879-fd4aa19ab5fd-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 27 16:45:03 crc kubenswrapper[4954]: I1127 16:45:03.230016 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8nm2\" (UniqueName: \"kubernetes.io/projected/bc84ac35-9748-4149-a879-fd4aa19ab5fd-kube-api-access-h8nm2\") on node \"crc\" DevicePath \"\"" Nov 27 16:45:03 crc kubenswrapper[4954]: I1127 16:45:03.230038 4954 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc84ac35-9748-4149-a879-fd4aa19ab5fd-config-volume\") on node \"crc\" DevicePath \"\"" Nov 27 16:45:03 crc kubenswrapper[4954]: I1127 16:45:03.661503 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404365-nqxnw" event={"ID":"bc84ac35-9748-4149-a879-fd4aa19ab5fd","Type":"ContainerDied","Data":"e4971bc39b2fc8300ef72a03843235b0258a569f2ab5caebb1b82786500d0df5"} Nov 27 16:45:03 crc kubenswrapper[4954]: I1127 16:45:03.662041 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4971bc39b2fc8300ef72a03843235b0258a569f2ab5caebb1b82786500d0df5" Nov 27 16:45:03 crc kubenswrapper[4954]: I1127 16:45:03.661627 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404365-nqxnw" Nov 27 16:45:23 crc kubenswrapper[4954]: I1127 16:45:23.687297 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:45:23 crc kubenswrapper[4954]: I1127 16:45:23.687912 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 16:45:23 crc kubenswrapper[4954]: I1127 16:45:23.687977 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-699qq" Nov 27 16:45:23 crc kubenswrapper[4954]: I1127 16:45:23.688638 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"84093c2be17ea05c39ef1d4a336c22ac7f26980534017a56732b693c785209f4"} pod="openshift-machine-config-operator/machine-config-daemon-699qq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 27 16:45:23 crc kubenswrapper[4954]: I1127 16:45:23.688685 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" containerID="cri-o://84093c2be17ea05c39ef1d4a336c22ac7f26980534017a56732b693c785209f4" gracePeriod=600 Nov 27 16:45:24 crc kubenswrapper[4954]: I1127 16:45:24.811466 4954 generic.go:334] "Generic (PLEG): container finished" 
podID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerID="84093c2be17ea05c39ef1d4a336c22ac7f26980534017a56732b693c785209f4" exitCode=0 Nov 27 16:45:24 crc kubenswrapper[4954]: I1127 16:45:24.811571 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" event={"ID":"33a80574-7c60-4f19-985b-3ee313cb7bcd","Type":"ContainerDied","Data":"84093c2be17ea05c39ef1d4a336c22ac7f26980534017a56732b693c785209f4"} Nov 27 16:45:24 crc kubenswrapper[4954]: I1127 16:45:24.812539 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" event={"ID":"33a80574-7c60-4f19-985b-3ee313cb7bcd","Type":"ContainerStarted","Data":"81c634e87acc9a6979f7829ea3fe2fc04e26098031020602e4626c4eb40e2aad"} Nov 27 16:45:24 crc kubenswrapper[4954]: I1127 16:45:24.812571 4954 scope.go:117] "RemoveContainer" containerID="abf93a27d369fc02df1a4508748705f9bbad044d52db659f35896e60e7a8bdf9" Nov 27 16:47:08 crc kubenswrapper[4954]: I1127 16:47:08.850070 4954 scope.go:117] "RemoveContainer" containerID="ce7951a9306b662396c84e314cad126080d4ed8fb027a5c3883f10c25c66cea7" Nov 27 16:47:08 crc kubenswrapper[4954]: I1127 16:47:08.889226 4954 scope.go:117] "RemoveContainer" containerID="f467d62914eade0f151113915f0669ca492deef458ab407c5bef188eaf9a166c" Nov 27 16:47:08 crc kubenswrapper[4954]: I1127 16:47:08.915459 4954 scope.go:117] "RemoveContainer" containerID="6547e759deff8e30fc1e79e0eedb93c09d151c2ebec7c5b5be06cef23d58ee02" Nov 27 16:47:23 crc kubenswrapper[4954]: I1127 16:47:23.687918 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:47:23 crc kubenswrapper[4954]: I1127 16:47:23.688831 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 16:47:53 crc kubenswrapper[4954]: I1127 16:47:53.687892 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:47:53 crc kubenswrapper[4954]: I1127 16:47:53.689114 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 16:48:00 crc kubenswrapper[4954]: I1127 16:48:00.251273 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-hqbv8"] Nov 27 16:48:00 crc kubenswrapper[4954]: E1127 16:48:00.251855 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc84ac35-9748-4149-a879-fd4aa19ab5fd" containerName="collect-profiles" Nov 27 16:48:00 crc kubenswrapper[4954]: I1127 16:48:00.251868 4954 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="bc84ac35-9748-4149-a879-fd4aa19ab5fd" containerName="collect-profiles" Nov 27 16:48:00 crc kubenswrapper[4954]: I1127 16:48:00.251963 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc84ac35-9748-4149-a879-fd4aa19ab5fd" containerName="collect-profiles" Nov 27 16:48:00 crc kubenswrapper[4954]: I1127 16:48:00.252366 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-hqbv8" Nov 27 16:48:00 crc kubenswrapper[4954]: I1127 16:48:00.276859 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-hqbv8"] Nov 27 16:48:00 crc kubenswrapper[4954]: I1127 16:48:00.330300 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/049553a9-f3bf-4e78-8e44-4209f1a1cef6-installation-pull-secrets\") pod \"image-registry-66df7c8f76-hqbv8\" (UID: \"049553a9-f3bf-4e78-8e44-4209f1a1cef6\") " pod="openshift-image-registry/image-registry-66df7c8f76-hqbv8" Nov 27 16:48:00 crc kubenswrapper[4954]: I1127 16:48:00.330356 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/049553a9-f3bf-4e78-8e44-4209f1a1cef6-trusted-ca\") pod \"image-registry-66df7c8f76-hqbv8\" (UID: \"049553a9-f3bf-4e78-8e44-4209f1a1cef6\") " pod="openshift-image-registry/image-registry-66df7c8f76-hqbv8" Nov 27 16:48:00 crc kubenswrapper[4954]: I1127 16:48:00.330387 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrzkx\" (UniqueName: \"kubernetes.io/projected/049553a9-f3bf-4e78-8e44-4209f1a1cef6-kube-api-access-nrzkx\") pod \"image-registry-66df7c8f76-hqbv8\" (UID: \"049553a9-f3bf-4e78-8e44-4209f1a1cef6\") " pod="openshift-image-registry/image-registry-66df7c8f76-hqbv8" Nov 27 16:48:00 crc kubenswrapper[4954]: I1127 16:48:00.330406 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/049553a9-f3bf-4e78-8e44-4209f1a1cef6-registry-certificates\") pod \"image-registry-66df7c8f76-hqbv8\" (UID: \"049553a9-f3bf-4e78-8e44-4209f1a1cef6\") " pod="openshift-image-registry/image-registry-66df7c8f76-hqbv8" Nov 27 16:48:00 crc kubenswrapper[4954]: I1127 16:48:00.330444 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/049553a9-f3bf-4e78-8e44-4209f1a1cef6-bound-sa-token\") pod \"image-registry-66df7c8f76-hqbv8\" (UID: \"049553a9-f3bf-4e78-8e44-4209f1a1cef6\") " pod="openshift-image-registry/image-registry-66df7c8f76-hqbv8" Nov 27 16:48:00 crc kubenswrapper[4954]: I1127 16:48:00.330483 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-hqbv8\" (UID: \"049553a9-f3bf-4e78-8e44-4209f1a1cef6\") " pod="openshift-image-registry/image-registry-66df7c8f76-hqbv8" Nov 27 16:48:00 crc kubenswrapper[4954]: I1127 16:48:00.330505 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/049553a9-f3bf-4e78-8e44-4209f1a1cef6-registry-tls\") pod \"image-registry-66df7c8f76-hqbv8\" (UID: \"049553a9-f3bf-4e78-8e44-4209f1a1cef6\") " pod="openshift-image-registry/image-registry-66df7c8f76-hqbv8" Nov 27 16:48:00 crc kubenswrapper[4954]: I1127 16:48:00.330526 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/049553a9-f3bf-4e78-8e44-4209f1a1cef6-ca-trust-extracted\") pod \"image-registry-66df7c8f76-hqbv8\" (UID: \"049553a9-f3bf-4e78-8e44-4209f1a1cef6\") " pod="openshift-image-registry/image-registry-66df7c8f76-hqbv8" Nov 27 16:48:00 crc kubenswrapper[4954]: I1127 16:48:00.356860 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-hqbv8\" (UID: \"049553a9-f3bf-4e78-8e44-4209f1a1cef6\") " pod="openshift-image-registry/image-registry-66df7c8f76-hqbv8" Nov 27 16:48:00 crc kubenswrapper[4954]: I1127 16:48:00.431883 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/049553a9-f3bf-4e78-8e44-4209f1a1cef6-bound-sa-token\") pod \"image-registry-66df7c8f76-hqbv8\" (UID: \"049553a9-f3bf-4e78-8e44-4209f1a1cef6\") " pod="openshift-image-registry/image-registry-66df7c8f76-hqbv8" Nov 27 16:48:00 crc kubenswrapper[4954]: I1127 16:48:00.432360 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/049553a9-f3bf-4e78-8e44-4209f1a1cef6-registry-tls\") pod \"image-registry-66df7c8f76-hqbv8\" (UID: \"049553a9-f3bf-4e78-8e44-4209f1a1cef6\") " pod="openshift-image-registry/image-registry-66df7c8f76-hqbv8" Nov 27 16:48:00 crc kubenswrapper[4954]: I1127 16:48:00.432476 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/049553a9-f3bf-4e78-8e44-4209f1a1cef6-ca-trust-extracted\") pod \"image-registry-66df7c8f76-hqbv8\" (UID: \"049553a9-f3bf-4e78-8e44-4209f1a1cef6\") " pod="openshift-image-registry/image-registry-66df7c8f76-hqbv8" Nov 27 16:48:00 crc kubenswrapper[4954]: I1127 16:48:00.432661 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/049553a9-f3bf-4e78-8e44-4209f1a1cef6-installation-pull-secrets\") pod \"image-registry-66df7c8f76-hqbv8\" (UID: \"049553a9-f3bf-4e78-8e44-4209f1a1cef6\") " pod="openshift-image-registry/image-registry-66df7c8f76-hqbv8" Nov 27 16:48:00 crc kubenswrapper[4954]: I1127 16:48:00.432808 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/049553a9-f3bf-4e78-8e44-4209f1a1cef6-trusted-ca\") pod \"image-registry-66df7c8f76-hqbv8\" (UID: \"049553a9-f3bf-4e78-8e44-4209f1a1cef6\") " pod="openshift-image-registry/image-registry-66df7c8f76-hqbv8" Nov 27 16:48:00 crc kubenswrapper[4954]: I1127 16:48:00.432927 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrzkx\" (UniqueName: \"kubernetes.io/projected/049553a9-f3bf-4e78-8e44-4209f1a1cef6-kube-api-access-nrzkx\") pod \"image-registry-66df7c8f76-hqbv8\" (UID: \"049553a9-f3bf-4e78-8e44-4209f1a1cef6\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-hqbv8" Nov 27 16:48:00 crc kubenswrapper[4954]: I1127 16:48:00.433036 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/049553a9-f3bf-4e78-8e44-4209f1a1cef6-registry-certificates\") pod \"image-registry-66df7c8f76-hqbv8\" (UID: \"049553a9-f3bf-4e78-8e44-4209f1a1cef6\") " pod="openshift-image-registry/image-registry-66df7c8f76-hqbv8" Nov 27 16:48:00 crc kubenswrapper[4954]: I1127 16:48:00.433652 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/049553a9-f3bf-4e78-8e44-4209f1a1cef6-ca-trust-extracted\") pod \"image-registry-66df7c8f76-hqbv8\" (UID: \"049553a9-f3bf-4e78-8e44-4209f1a1cef6\") " pod="openshift-image-registry/image-registry-66df7c8f76-hqbv8" Nov 27 16:48:00 crc kubenswrapper[4954]: I1127 16:48:00.434678 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/049553a9-f3bf-4e78-8e44-4209f1a1cef6-registry-certificates\") pod \"image-registry-66df7c8f76-hqbv8\" (UID: \"049553a9-f3bf-4e78-8e44-4209f1a1cef6\") " pod="openshift-image-registry/image-registry-66df7c8f76-hqbv8" Nov 27 16:48:00 crc kubenswrapper[4954]: I1127 16:48:00.435019 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/049553a9-f3bf-4e78-8e44-4209f1a1cef6-trusted-ca\") pod \"image-registry-66df7c8f76-hqbv8\" (UID: \"049553a9-f3bf-4e78-8e44-4209f1a1cef6\") " pod="openshift-image-registry/image-registry-66df7c8f76-hqbv8" Nov 27 16:48:00 crc kubenswrapper[4954]: I1127 16:48:00.439811 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/049553a9-f3bf-4e78-8e44-4209f1a1cef6-installation-pull-secrets\") pod \"image-registry-66df7c8f76-hqbv8\" (UID: \"049553a9-f3bf-4e78-8e44-4209f1a1cef6\") " pod="openshift-image-registry/image-registry-66df7c8f76-hqbv8" Nov 27 16:48:00 crc kubenswrapper[4954]: I1127 16:48:00.441358 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/049553a9-f3bf-4e78-8e44-4209f1a1cef6-registry-tls\") pod \"image-registry-66df7c8f76-hqbv8\" (UID: \"049553a9-f3bf-4e78-8e44-4209f1a1cef6\") " pod="openshift-image-registry/image-registry-66df7c8f76-hqbv8" Nov 27 16:48:00 crc kubenswrapper[4954]: I1127 16:48:00.452524 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/049553a9-f3bf-4e78-8e44-4209f1a1cef6-bound-sa-token\") pod \"image-registry-66df7c8f76-hqbv8\" (UID: \"049553a9-f3bf-4e78-8e44-4209f1a1cef6\") " pod="openshift-image-registry/image-registry-66df7c8f76-hqbv8" Nov 27 16:48:00 crc kubenswrapper[4954]: I1127 16:48:00.454471 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrzkx\" (UniqueName: \"kubernetes.io/projected/049553a9-f3bf-4e78-8e44-4209f1a1cef6-kube-api-access-nrzkx\") pod \"image-registry-66df7c8f76-hqbv8\" (UID: \"049553a9-f3bf-4e78-8e44-4209f1a1cef6\") " pod="openshift-image-registry/image-registry-66df7c8f76-hqbv8" Nov 27 16:48:00 crc kubenswrapper[4954]: I1127 16:48:00.572944 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-hqbv8" Nov 27 16:48:01 crc kubenswrapper[4954]: I1127 16:48:01.106022 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-hqbv8"] Nov 27 16:48:01 crc kubenswrapper[4954]: I1127 16:48:01.175176 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-hqbv8" event={"ID":"049553a9-f3bf-4e78-8e44-4209f1a1cef6","Type":"ContainerStarted","Data":"b827a0772942818e742be42c3d56a362e7b77cbc36e5bd9bb4bb859754f439c5"} Nov 27 16:48:02 crc kubenswrapper[4954]: I1127 16:48:02.187141 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-hqbv8" event={"ID":"049553a9-f3bf-4e78-8e44-4209f1a1cef6","Type":"ContainerStarted","Data":"c7ab5e90e0e4dd491514ca30f86d16d4b837355e9f9f20edbcaecf280b5455f6"} Nov 27 16:48:02 crc kubenswrapper[4954]: I1127 16:48:02.189862 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-hqbv8" Nov 27 16:48:02 crc kubenswrapper[4954]: I1127 16:48:02.224157 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-hqbv8" podStartSLOduration=2.224121452 podStartE2EDuration="2.224121452s" podCreationTimestamp="2025-11-27 16:48:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:48:02.2220092 +0000 UTC m=+594.239449550" watchObservedRunningTime="2025-11-27 16:48:02.224121452 +0000 UTC m=+594.241561752" Nov 27 16:48:08 crc kubenswrapper[4954]: I1127 16:48:08.993763 4954 scope.go:117] "RemoveContainer" containerID="1a7f5493e4c79ca3f27e8085b89da595ae48972a91b2e7b072d1cb44fc153403" Nov 27 16:48:09 crc kubenswrapper[4954]: I1127 16:48:09.026535 4954 scope.go:117] "RemoveContainer" containerID="029f0d1f4d989692e56e0cb2fd1ee651dd41455f80232b45bb38a073df2139e7" Nov 27 16:48:20 crc kubenswrapper[4954]: I1127 16:48:20.582011 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-hqbv8" Nov 27 16:48:20 crc kubenswrapper[4954]: I1127 16:48:20.678780 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-n2fzm"] Nov 27 16:48:23 crc kubenswrapper[4954]: I1127 16:48:23.688545 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:48:23 crc kubenswrapper[4954]: I1127 16:48:23.691729 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 16:48:23 crc kubenswrapper[4954]: I1127 16:48:23.692020 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-699qq" Nov 27 16:48:23 crc kubenswrapper[4954]: I1127 16:48:23.693459 4954 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"81c634e87acc9a6979f7829ea3fe2fc04e26098031020602e4626c4eb40e2aad"} pod="openshift-machine-config-operator/machine-config-daemon-699qq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 27 16:48:23 crc kubenswrapper[4954]: I1127 16:48:23.693828 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" containerID="cri-o://81c634e87acc9a6979f7829ea3fe2fc04e26098031020602e4626c4eb40e2aad" gracePeriod=600 Nov 27 16:48:24 crc kubenswrapper[4954]: I1127 16:48:24.405463 4954 generic.go:334] "Generic (PLEG): container finished" podID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerID="81c634e87acc9a6979f7829ea3fe2fc04e26098031020602e4626c4eb40e2aad" exitCode=0 Nov 27 16:48:24 crc kubenswrapper[4954]: I1127 16:48:24.405626 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" event={"ID":"33a80574-7c60-4f19-985b-3ee313cb7bcd","Type":"ContainerDied","Data":"81c634e87acc9a6979f7829ea3fe2fc04e26098031020602e4626c4eb40e2aad"} Nov 27 16:48:24 crc kubenswrapper[4954]: I1127 16:48:24.406132 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" event={"ID":"33a80574-7c60-4f19-985b-3ee313cb7bcd","Type":"ContainerStarted","Data":"f253421b54ffaa5b8245af0010b5935a685f03c65cd9227baccbf0b03f627cdd"} Nov 27 16:48:24 crc kubenswrapper[4954]: I1127 16:48:24.406173 4954 scope.go:117] "RemoveContainer" containerID="84093c2be17ea05c39ef1d4a336c22ac7f26980534017a56732b693c785209f4" Nov 27 16:48:45 crc kubenswrapper[4954]: I1127 16:48:45.756835 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" podUID="7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd" containerName="registry" containerID="cri-o://f3fc4a4331f0d1f3a288fcd7e7dfc47a8a3010c1c4235825e890159ac8b6a8c4" gracePeriod=30 Nov 27 16:48:46 crc kubenswrapper[4954]: I1127 16:48:46.215298 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" Nov 27 16:48:46 crc kubenswrapper[4954]: I1127 16:48:46.347711 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd-bound-sa-token\") pod \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " Nov 27 16:48:46 crc kubenswrapper[4954]: I1127 16:48:46.347809 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd-ca-trust-extracted\") pod \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " Nov 27 16:48:46 crc kubenswrapper[4954]: I1127 16:48:46.347853 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd-registry-tls\") pod \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " Nov 27 16:48:46 crc kubenswrapper[4954]: I1127 16:48:46.347878 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjttn\" (UniqueName: \"kubernetes.io/projected/7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd-kube-api-access-xjttn\") pod \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " Nov 27 16:48:46 crc kubenswrapper[4954]: I1127 16:48:46.347937 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd-installation-pull-secrets\") pod \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " Nov 27 16:48:46 crc kubenswrapper[4954]: I1127 16:48:46.347969 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd-trusted-ca\") pod \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " Nov 27 16:48:46 crc kubenswrapper[4954]: I1127 16:48:46.347999 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd-registry-certificates\") pod \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " Nov 27 16:48:46 crc kubenswrapper[4954]: I1127 16:48:46.348313 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\" (UID: \"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd\") " Nov 27 16:48:46 crc kubenswrapper[4954]: I1127 16:48:46.349861 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:48:46 crc kubenswrapper[4954]: I1127 16:48:46.350718 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:48:46 crc kubenswrapper[4954]: I1127 16:48:46.357688 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:48:46 crc kubenswrapper[4954]: I1127 16:48:46.360012 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:48:46 crc kubenswrapper[4954]: I1127 16:48:46.360710 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd-kube-api-access-xjttn" (OuterVolumeSpecName: "kube-api-access-xjttn") pod "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd"). InnerVolumeSpecName "kube-api-access-xjttn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:48:46 crc kubenswrapper[4954]: I1127 16:48:46.370343 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 27 16:48:46 crc kubenswrapper[4954]: I1127 16:48:46.374502 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:48:46 crc kubenswrapper[4954]: I1127 16:48:46.389524 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd" (UID: "7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:48:46 crc kubenswrapper[4954]: I1127 16:48:46.449966 4954 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 27 16:48:46 crc kubenswrapper[4954]: I1127 16:48:46.450029 4954 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 27 16:48:46 crc kubenswrapper[4954]: I1127 16:48:46.450051 4954 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd-registry-tls\") on node \"crc\" DevicePath \"\"" Nov 27 16:48:46 crc kubenswrapper[4954]: I1127 16:48:46.450069 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjttn\" (UniqueName: \"kubernetes.io/projected/7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd-kube-api-access-xjttn\") on node \"crc\" DevicePath \"\"" Nov 27 16:48:46 crc kubenswrapper[4954]: I1127 16:48:46.450095 4954 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Nov 27 16:48:46 crc kubenswrapper[4954]: I1127 16:48:46.450113 4954 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 27 16:48:46 crc kubenswrapper[4954]: I1127 16:48:46.450129 4954 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd-registry-certificates\") on node \"crc\" DevicePath \"\"" Nov 27 16:48:46 crc kubenswrapper[4954]: I1127 16:48:46.582805 4954 generic.go:334] "Generic (PLEG): container finished" podID="7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd" containerID="f3fc4a4331f0d1f3a288fcd7e7dfc47a8a3010c1c4235825e890159ac8b6a8c4" exitCode=0 Nov 27 16:48:46 crc kubenswrapper[4954]: I1127 16:48:46.582878 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" event={"ID":"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd","Type":"ContainerDied","Data":"f3fc4a4331f0d1f3a288fcd7e7dfc47a8a3010c1c4235825e890159ac8b6a8c4"} Nov 27 16:48:46 crc kubenswrapper[4954]: I1127 16:48:46.582940 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm" event={"ID":"7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd","Type":"ContainerDied","Data":"3f9baf829b8310f171198e0d01ddb40e075fb95c7e6415706bba933e15254d3e"} Nov 27 16:48:46 crc kubenswrapper[4954]: I1127 16:48:46.582989 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-n2fzm"
Nov 27 16:48:46 crc kubenswrapper[4954]: I1127 16:48:46.583042 4954 scope.go:117] "RemoveContainer" containerID="f3fc4a4331f0d1f3a288fcd7e7dfc47a8a3010c1c4235825e890159ac8b6a8c4"
Nov 27 16:48:46 crc kubenswrapper[4954]: I1127 16:48:46.611629 4954 scope.go:117] "RemoveContainer" containerID="f3fc4a4331f0d1f3a288fcd7e7dfc47a8a3010c1c4235825e890159ac8b6a8c4"
Nov 27 16:48:46 crc kubenswrapper[4954]: E1127 16:48:46.612315 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3fc4a4331f0d1f3a288fcd7e7dfc47a8a3010c1c4235825e890159ac8b6a8c4\": container with ID starting with f3fc4a4331f0d1f3a288fcd7e7dfc47a8a3010c1c4235825e890159ac8b6a8c4 not found: ID does not exist" containerID="f3fc4a4331f0d1f3a288fcd7e7dfc47a8a3010c1c4235825e890159ac8b6a8c4"
Nov 27 16:48:46 crc kubenswrapper[4954]: I1127 16:48:46.612391 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3fc4a4331f0d1f3a288fcd7e7dfc47a8a3010c1c4235825e890159ac8b6a8c4"} err="failed to get container status \"f3fc4a4331f0d1f3a288fcd7e7dfc47a8a3010c1c4235825e890159ac8b6a8c4\": rpc error: code = NotFound desc = could not find container \"f3fc4a4331f0d1f3a288fcd7e7dfc47a8a3010c1c4235825e890159ac8b6a8c4\": container with ID starting with f3fc4a4331f0d1f3a288fcd7e7dfc47a8a3010c1c4235825e890159ac8b6a8c4 not found: ID does not exist"
Nov 27 16:48:46 crc kubenswrapper[4954]: I1127 16:48:46.647341 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-n2fzm"]
Nov 27 16:48:46 crc kubenswrapper[4954]: I1127 16:48:46.655539 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-n2fzm"]
Nov 27 16:48:46 crc kubenswrapper[4954]: I1127 16:48:46.677676 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd" path="/var/lib/kubelet/pods/7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd/volumes"
Nov 27 16:50:42 crc kubenswrapper[4954]: I1127 16:50:42.274650 4954 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Nov 27 16:50:53 crc kubenswrapper[4954]: I1127 16:50:53.687773 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 27 16:50:53 crc kubenswrapper[4954]: I1127 16:50:53.689026 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 27 16:51:23 crc kubenswrapper[4954]: I1127 16:51:23.688104 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 27 16:51:23 crc kubenswrapper[4954]: I1127 16:51:23.689242 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 27 16:51:37 crc kubenswrapper[4954]: I1127 16:51:37.938572 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-2t966"]
Nov 27 16:51:37 crc kubenswrapper[4954]: E1127 16:51:37.939557 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd" containerName="registry"
Nov 27 16:51:37 crc kubenswrapper[4954]: I1127 16:51:37.939574 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd" containerName="registry"
Nov 27 16:51:37 crc kubenswrapper[4954]: I1127 16:51:37.939759 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c21cf7d-2d3b-4db6-9d18-c3303fbfa0bd" containerName="registry"
Nov 27 16:51:37 crc kubenswrapper[4954]: I1127 16:51:37.940255 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-2t966"
Nov 27 16:51:37 crc kubenswrapper[4954]: I1127 16:51:37.943865 4954 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-wc8sh"
Nov 27 16:51:37 crc kubenswrapper[4954]: I1127 16:51:37.944907 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-96dn8"]
Nov 27 16:51:37 crc kubenswrapper[4954]: I1127 16:51:37.945907 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-96dn8"
Nov 27 16:51:37 crc kubenswrapper[4954]: I1127 16:51:37.964080 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Nov 27 16:51:37 crc kubenswrapper[4954]: I1127 16:51:37.964243 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Nov 27 16:51:37 crc kubenswrapper[4954]: I1127 16:51:37.964700 4954 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-rqrf8"
Nov 27 16:51:37 crc kubenswrapper[4954]: I1127 16:51:37.981489 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-2t966"]
Nov 27 16:51:37 crc kubenswrapper[4954]: I1127 16:51:37.988506 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-ghjsr"]
Nov 27 16:51:37 crc kubenswrapper[4954]: I1127 16:51:37.990822 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-ghjsr"
Nov 27 16:51:37 crc kubenswrapper[4954]: I1127 16:51:37.996136 4954 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-nj47t"
Nov 27 16:51:38 crc kubenswrapper[4954]: I1127 16:51:38.008682 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-96dn8"]
Nov 27 16:51:38 crc kubenswrapper[4954]: I1127 16:51:38.020395 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-ghjsr"]
Nov 27 16:51:38 crc kubenswrapper[4954]: I1127 16:51:38.111178 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfcf7\" (UniqueName: \"kubernetes.io/projected/f6f24261-8d7e-454f-8d20-2a35f12114c6-kube-api-access-lfcf7\") pod \"cert-manager-cainjector-7f985d654d-96dn8\" (UID: \"f6f24261-8d7e-454f-8d20-2a35f12114c6\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-96dn8"
Nov 27 16:51:38 crc kubenswrapper[4954]: I1127 16:51:38.111274 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mzg8\" (UniqueName: \"kubernetes.io/projected/04065317-2688-429e-8362-970a4f083d14-kube-api-access-9mzg8\") pod \"cert-manager-webhook-5655c58dd6-ghjsr\" (UID: \"04065317-2688-429e-8362-970a4f083d14\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-ghjsr"
Nov 27 16:51:38 crc kubenswrapper[4954]: I1127 16:51:38.111537 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvkw7\" (UniqueName: \"kubernetes.io/projected/51fb16a6-3c9e-4cca-a603-8b71f0b91ee1-kube-api-access-rvkw7\") pod \"cert-manager-5b446d88c5-2t966\" (UID: \"51fb16a6-3c9e-4cca-a603-8b71f0b91ee1\") " pod="cert-manager/cert-manager-5b446d88c5-2t966"
Nov 27 16:51:38 crc kubenswrapper[4954]: I1127 16:51:38.213764 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfcf7\" (UniqueName: \"kubernetes.io/projected/f6f24261-8d7e-454f-8d20-2a35f12114c6-kube-api-access-lfcf7\") pod \"cert-manager-cainjector-7f985d654d-96dn8\" (UID: \"f6f24261-8d7e-454f-8d20-2a35f12114c6\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-96dn8"
Nov 27 16:51:38 crc kubenswrapper[4954]: I1127 16:51:38.213851 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mzg8\" (UniqueName: \"kubernetes.io/projected/04065317-2688-429e-8362-970a4f083d14-kube-api-access-9mzg8\") pod \"cert-manager-webhook-5655c58dd6-ghjsr\" (UID: \"04065317-2688-429e-8362-970a4f083d14\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-ghjsr"
Nov 27 16:51:38 crc kubenswrapper[4954]: I1127 16:51:38.213916 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvkw7\" (UniqueName: \"kubernetes.io/projected/51fb16a6-3c9e-4cca-a603-8b71f0b91ee1-kube-api-access-rvkw7\") pod \"cert-manager-5b446d88c5-2t966\" (UID: \"51fb16a6-3c9e-4cca-a603-8b71f0b91ee1\") " pod="cert-manager/cert-manager-5b446d88c5-2t966"
Nov 27 16:51:38 crc kubenswrapper[4954]: I1127 16:51:38.234592 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvkw7\" (UniqueName: \"kubernetes.io/projected/51fb16a6-3c9e-4cca-a603-8b71f0b91ee1-kube-api-access-rvkw7\") pod \"cert-manager-5b446d88c5-2t966\" (UID: \"51fb16a6-3c9e-4cca-a603-8b71f0b91ee1\") " pod="cert-manager/cert-manager-5b446d88c5-2t966"
Nov 27 16:51:38 crc kubenswrapper[4954]: I1127 16:51:38.234922 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfcf7\" (UniqueName: \"kubernetes.io/projected/f6f24261-8d7e-454f-8d20-2a35f12114c6-kube-api-access-lfcf7\") pod \"cert-manager-cainjector-7f985d654d-96dn8\" (UID: \"f6f24261-8d7e-454f-8d20-2a35f12114c6\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-96dn8"
Nov 27 16:51:38 crc kubenswrapper[4954]: I1127 16:51:38.243535 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mzg8\" (UniqueName: \"kubernetes.io/projected/04065317-2688-429e-8362-970a4f083d14-kube-api-access-9mzg8\") pod \"cert-manager-webhook-5655c58dd6-ghjsr\" (UID: \"04065317-2688-429e-8362-970a4f083d14\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-ghjsr"
Nov 27 16:51:38 crc kubenswrapper[4954]: I1127 16:51:38.261977 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-2t966"
Nov 27 16:51:38 crc kubenswrapper[4954]: I1127 16:51:38.274661 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-96dn8"
Nov 27 16:51:38 crc kubenswrapper[4954]: I1127 16:51:38.314645 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-ghjsr"
Nov 27 16:51:38 crc kubenswrapper[4954]: I1127 16:51:38.491467 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-2t966"]
Nov 27 16:51:38 crc kubenswrapper[4954]: I1127 16:51:38.504305 4954 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Nov 27 16:51:38 crc kubenswrapper[4954]: I1127 16:51:38.532427 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-96dn8"]
Nov 27 16:51:38 crc kubenswrapper[4954]: I1127 16:51:38.586073 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-ghjsr"]
Nov 27 16:51:38 crc kubenswrapper[4954]: W1127 16:51:38.587417 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04065317_2688_429e_8362_970a4f083d14.slice/crio-6124669836e42349b1bcd39c5c6b6a0c318e581102d0a96b464933143a326aa4 WatchSource:0}: Error finding container 6124669836e42349b1bcd39c5c6b6a0c318e581102d0a96b464933143a326aa4: Status 404 returned error can't find the container with id 6124669836e42349b1bcd39c5c6b6a0c318e581102d0a96b464933143a326aa4
Nov 27 16:51:38 crc kubenswrapper[4954]: I1127 16:51:38.910184 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-ghjsr" event={"ID":"04065317-2688-429e-8362-970a4f083d14","Type":"ContainerStarted","Data":"6124669836e42349b1bcd39c5c6b6a0c318e581102d0a96b464933143a326aa4"}
Nov 27 16:51:38 crc kubenswrapper[4954]: I1127 16:51:38.911543 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-96dn8" event={"ID":"f6f24261-8d7e-454f-8d20-2a35f12114c6","Type":"ContainerStarted","Data":"4e344063098c8fdc697caa518b7d81c15fb89be12ec071014eea4cc1edfc49d4"}
Nov 27 16:51:38 crc kubenswrapper[4954]: I1127 16:51:38.913015 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-2t966" event={"ID":"51fb16a6-3c9e-4cca-a603-8b71f0b91ee1","Type":"ContainerStarted","Data":"193094b820f0a03d3803a11a384b7aaed89c11e4e7a4204c13ad72afa2de449d"}
Nov 27 16:51:42 crc kubenswrapper[4954]: I1127 16:51:42.959593 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-ghjsr" event={"ID":"04065317-2688-429e-8362-970a4f083d14","Type":"ContainerStarted","Data":"cd2b0d8b6896240685a9c121abec24ba2d924b41d718440878c0ae856ae2a199"}
Nov 27 16:51:42 crc kubenswrapper[4954]: I1127 16:51:42.960044 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-ghjsr"
Nov 27 16:51:42 crc kubenswrapper[4954]: I1127 16:51:42.962417 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-96dn8" event={"ID":"f6f24261-8d7e-454f-8d20-2a35f12114c6","Type":"ContainerStarted","Data":"b15260a4c5c4ad0ef25d73def47d6beb55a8147863765fd8e40a889e556b770d"}
Nov 27 16:51:42 crc kubenswrapper[4954]: I1127 16:51:42.964898 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-2t966" event={"ID":"51fb16a6-3c9e-4cca-a603-8b71f0b91ee1","Type":"ContainerStarted","Data":"82f266b3edc31046250aac140450157571e926c8e89259785d5a71f9fa37a257"}
Nov 27 16:51:43 crc kubenswrapper[4954]: I1127 16:51:43.022082 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-2t966" podStartSLOduration=2.054919264 podStartE2EDuration="6.022052454s" podCreationTimestamp="2025-11-27 16:51:37 +0000 UTC" firstStartedPulling="2025-11-27 16:51:38.503988356 +0000 UTC m=+810.521428656" lastFinishedPulling="2025-11-27 16:51:42.471121506 +0000 UTC m=+814.488561846" observedRunningTime="2025-11-27 16:51:43.017993184 +0000 UTC m=+815.035433494" watchObservedRunningTime="2025-11-27 16:51:43.022052454 +0000 UTC m=+815.039492764"
Nov 27 16:51:43 crc kubenswrapper[4954]: I1127 16:51:43.024185 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-ghjsr" podStartSLOduration=2.140001527 podStartE2EDuration="6.024172846s" podCreationTimestamp="2025-11-27 16:51:37 +0000 UTC" firstStartedPulling="2025-11-27 16:51:38.590051134 +0000 UTC m=+810.607491434" lastFinishedPulling="2025-11-27 16:51:42.474222413 +0000 UTC m=+814.491662753" observedRunningTime="2025-11-27 16:51:42.994987334 +0000 UTC m=+815.012427644" watchObservedRunningTime="2025-11-27 16:51:43.024172846 +0000 UTC m=+815.041613166"
Nov 27 16:51:43 crc kubenswrapper[4954]: I1127 16:51:43.053764 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-96dn8" podStartSLOduration=2.153664446 podStartE2EDuration="6.053736837s" podCreationTimestamp="2025-11-27 16:51:37 +0000 UTC" firstStartedPulling="2025-11-27 16:51:38.540441367 +0000 UTC m=+810.557881667" lastFinishedPulling="2025-11-27 16:51:42.440513718 +0000 UTC m=+814.457954058" observedRunningTime="2025-11-27 16:51:43.049373709 +0000 UTC m=+815.066814019" watchObservedRunningTime="2025-11-27 16:51:43.053736837 +0000 UTC m=+815.071177137"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.319779 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-ghjsr"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.399428 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-d5zbp"]
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.412366 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerName="ovn-controller" containerID="cri-o://edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3" gracePeriod=30
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.412438 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerName="nbdb" containerID="cri-o://19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2" gracePeriod=30
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.412707 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerName="northd" containerID="cri-o://87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5" gracePeriod=30
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.412792 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5" gracePeriod=30
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.412889 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerName="kube-rbac-proxy-node" containerID="cri-o://625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b" gracePeriod=30
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.412975 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerName="ovn-acl-logging" containerID="cri-o://7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7" gracePeriod=30
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.413207 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerName="sbdb" containerID="cri-o://ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8" gracePeriod=30
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.458342 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerName="ovnkube-controller" containerID="cri-o://c247e7205296545100af4336d0c953a09c73efcd735a2e4e9d901ff312eb55a1" gracePeriod=30
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.752839 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5zbp_c9c365fc-0cba-4fcf-b721-30de2b908a56/ovnkube-controller/3.log"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.756697 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5zbp_c9c365fc-0cba-4fcf-b721-30de2b908a56/ovn-acl-logging/0.log"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.757440 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5zbp_c9c365fc-0cba-4fcf-b721-30de2b908a56/ovn-controller/0.log"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.758252 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.835850 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-grg78"]
Nov 27 16:51:48 crc kubenswrapper[4954]: E1127 16:51:48.836217 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerName="ovn-acl-logging"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.836250 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerName="ovn-acl-logging"
Nov 27 16:51:48 crc kubenswrapper[4954]: E1127 16:51:48.836276 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerName="ovnkube-controller"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.836293 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerName="ovnkube-controller"
Nov 27 16:51:48 crc kubenswrapper[4954]: E1127 16:51:48.836314 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerName="ovnkube-controller"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.836326 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerName="ovnkube-controller"
Nov 27 16:51:48 crc kubenswrapper[4954]: E1127 16:51:48.836344 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerName="northd"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.836358 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerName="northd"
Nov 27 16:51:48 crc kubenswrapper[4954]: E1127 16:51:48.836374 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerName="ovnkube-controller"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.836387 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerName="ovnkube-controller"
Nov 27 16:51:48 crc kubenswrapper[4954]: E1127 16:51:48.836401 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerName="kube-rbac-proxy-ovn-metrics"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.836414 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerName="kube-rbac-proxy-ovn-metrics"
Nov 27 16:51:48 crc kubenswrapper[4954]: E1127 16:51:48.836433 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerName="ovnkube-controller"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.836450 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerName="ovnkube-controller"
Nov 27 16:51:48 crc kubenswrapper[4954]: E1127 16:51:48.836478 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerName="kube-rbac-proxy-node"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.836492 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerName="kube-rbac-proxy-node"
Nov 27 16:51:48 crc kubenswrapper[4954]: E1127 16:51:48.836511 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerName="sbdb"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.836524 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerName="sbdb"
Nov 27 16:51:48 crc kubenswrapper[4954]: E1127 16:51:48.836544 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerName="ovn-controller"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.836561 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerName="ovn-controller"
Nov 27 16:51:48 crc kubenswrapper[4954]: E1127 16:51:48.836590 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerName="kubecfg-setup"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.836638 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerName="kubecfg-setup"
Nov 27 16:51:48 crc kubenswrapper[4954]: E1127 16:51:48.836667 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerName="nbdb"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.836685 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerName="nbdb"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.836945 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerName="nbdb"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.836992 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerName="kube-rbac-proxy-ovn-metrics"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.837020 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerName="ovnkube-controller"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.837041 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerName="kube-rbac-proxy-node"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.837065 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerName="ovn-controller"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.837096 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerName="sbdb"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.837123 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerName="northd"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.837143 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerName="ovnkube-controller"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.837166 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerName="ovn-acl-logging"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.837192 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerName="ovnkube-controller"
Nov 27 16:51:48 crc kubenswrapper[4954]: E1127 16:51:48.837423 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerName="ovnkube-controller"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.837480 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerName="ovnkube-controller"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.837815 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerName="ovnkube-controller"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.838195 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerName="ovnkube-controller"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.841217 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.910471 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-host-slash\") pod \"c9c365fc-0cba-4fcf-b721-30de2b908a56\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") "
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.910531 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-etc-openvswitch\") pod \"c9c365fc-0cba-4fcf-b721-30de2b908a56\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") "
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.910567 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c9c365fc-0cba-4fcf-b721-30de2b908a56-ovnkube-config\") pod \"c9c365fc-0cba-4fcf-b721-30de2b908a56\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") "
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.910615 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-run-openvswitch\") pod \"c9c365fc-0cba-4fcf-b721-30de2b908a56\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") "
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.910613 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-host-slash" (OuterVolumeSpecName: "host-slash") pod "c9c365fc-0cba-4fcf-b721-30de2b908a56" (UID: "c9c365fc-0cba-4fcf-b721-30de2b908a56"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.910721 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "c9c365fc-0cba-4fcf-b721-30de2b908a56" (UID: "c9c365fc-0cba-4fcf-b721-30de2b908a56"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.910739 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "c9c365fc-0cba-4fcf-b721-30de2b908a56" (UID: "c9c365fc-0cba-4fcf-b721-30de2b908a56"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.910885 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "c9c365fc-0cba-4fcf-b721-30de2b908a56" (UID: "c9c365fc-0cba-4fcf-b721-30de2b908a56"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.911237 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9c365fc-0cba-4fcf-b721-30de2b908a56-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "c9c365fc-0cba-4fcf-b721-30de2b908a56" (UID: "c9c365fc-0cba-4fcf-b721-30de2b908a56"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.911280 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-host-cni-bin\") pod \"c9c365fc-0cba-4fcf-b721-30de2b908a56\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") "
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.911332 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-host-var-lib-cni-networks-ovn-kubernetes\") pod \"c9c365fc-0cba-4fcf-b721-30de2b908a56\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") "
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.911359 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c9c365fc-0cba-4fcf-b721-30de2b908a56-ovn-node-metrics-cert\") pod \"c9c365fc-0cba-4fcf-b721-30de2b908a56\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") "
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.911404 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "c9c365fc-0cba-4fcf-b721-30de2b908a56" (UID: "c9c365fc-0cba-4fcf-b721-30de2b908a56"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.911435 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27hxv\" (UniqueName: \"kubernetes.io/projected/c9c365fc-0cba-4fcf-b721-30de2b908a56-kube-api-access-27hxv\") pod \"c9c365fc-0cba-4fcf-b721-30de2b908a56\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") "
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.911464 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-host-cni-netd\") pod \"c9c365fc-0cba-4fcf-b721-30de2b908a56\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") "
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.912342 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-var-lib-openvswitch\") pod \"c9c365fc-0cba-4fcf-b721-30de2b908a56\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") "
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.912369 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-host-kubelet\") pod \"c9c365fc-0cba-4fcf-b721-30de2b908a56\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") "
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.911563 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "c9c365fc-0cba-4fcf-b721-30de2b908a56" (UID: "c9c365fc-0cba-4fcf-b721-30de2b908a56"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.912378 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "c9c365fc-0cba-4fcf-b721-30de2b908a56" (UID: "c9c365fc-0cba-4fcf-b721-30de2b908a56"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.912415 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "c9c365fc-0cba-4fcf-b721-30de2b908a56" (UID: "c9c365fc-0cba-4fcf-b721-30de2b908a56"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.912390 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-run-systemd\") pod \"c9c365fc-0cba-4fcf-b721-30de2b908a56\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") "
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.912628 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-log-socket" (OuterVolumeSpecName: "log-socket") pod "c9c365fc-0cba-4fcf-b721-30de2b908a56" (UID: "c9c365fc-0cba-4fcf-b721-30de2b908a56"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.912666 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-log-socket\") pod \"c9c365fc-0cba-4fcf-b721-30de2b908a56\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") "
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.912712 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c9c365fc-0cba-4fcf-b721-30de2b908a56-ovnkube-script-lib\") pod \"c9c365fc-0cba-4fcf-b721-30de2b908a56\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") "
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.912764 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c9c365fc-0cba-4fcf-b721-30de2b908a56-env-overrides\") pod \"c9c365fc-0cba-4fcf-b721-30de2b908a56\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") "
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.912790 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-run-ovn\") pod \"c9c365fc-0cba-4fcf-b721-30de2b908a56\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") "
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.912819 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-node-log\") pod \"c9c365fc-0cba-4fcf-b721-30de2b908a56\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") "
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.912843 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-host-run-netns\") pod \"c9c365fc-0cba-4fcf-b721-30de2b908a56\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") "
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.912885 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-systemd-units\") pod \"c9c365fc-0cba-4fcf-b721-30de2b908a56\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") "
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.912920 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-host-run-ovn-kubernetes\") pod \"c9c365fc-0cba-4fcf-b721-30de2b908a56\" (UID: \"c9c365fc-0cba-4fcf-b721-30de2b908a56\") "
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.913252 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/794ea4de-6414-40f1-a4b8-f0fa12396780-ovnkube-config\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.913285 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/794ea4de-6414-40f1-a4b8-f0fa12396780-node-log\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.913305 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/794ea4de-6414-40f1-a4b8-f0fa12396780-host-cni-bin\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.913343 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/794ea4de-6414-40f1-a4b8-f0fa12396780-systemd-units\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.913429 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/794ea4de-6414-40f1-a4b8-f0fa12396780-ovnkube-script-lib\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.913457 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/794ea4de-6414-40f1-a4b8-f0fa12396780-host-slash\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.913490 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/794ea4de-6414-40f1-a4b8-f0fa12396780-host-run-netns\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.913523 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/794ea4de-6414-40f1-a4b8-f0fa12396780-etc-openvswitch\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.913544 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/794ea4de-6414-40f1-a4b8-f0fa12396780-run-openvswitch\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.913621 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/794ea4de-6414-40f1-a4b8-f0fa12396780-host-run-ovn-kubernetes\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.913677 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/794ea4de-6414-40f1-a4b8-f0fa12396780-run-systemd\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.913702 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/794ea4de-6414-40f1-a4b8-f0fa12396780-run-ovn\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.913733 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/794ea4de-6414-40f1-a4b8-f0fa12396780-var-lib-openvswitch\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.913757 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/794ea4de-6414-40f1-a4b8-f0fa12396780-log-socket\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.913799 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mvzb\" (UniqueName: \"kubernetes.io/projected/794ea4de-6414-40f1-a4b8-f0fa12396780-kube-api-access-2mvzb\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.913835 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/794ea4de-6414-40f1-a4b8-f0fa12396780-ovn-node-metrics-cert\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.913888 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/794ea4de-6414-40f1-a4b8-f0fa12396780-env-overrides\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.913909 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/794ea4de-6414-40f1-a4b8-f0fa12396780-host-cni-netd\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.913943 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/794ea4de-6414-40f1-a4b8-f0fa12396780-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.913971 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/794ea4de-6414-40f1-a4b8-f0fa12396780-host-kubelet\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.914023 4954 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-host-cni-netd\") on node \"crc\" DevicePath \"\""
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.914038 4954 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-var-lib-openvswitch\") on node \"crc\" DevicePath \"\""
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.914053 4954 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-host-kubelet\") on node \"crc\" DevicePath \"\""
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.914065 4954 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-log-socket\") on node \"crc\" DevicePath \"\""
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.914078 4954 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-host-slash\") on node \"crc\" DevicePath \"\""
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.914090 4954 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-etc-openvswitch\") on node \"crc\" DevicePath \"\""
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.914103 4954 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c9c365fc-0cba-4fcf-b721-30de2b908a56-ovnkube-config\") on node \"crc\" DevicePath \"\""
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.914116 4954 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-run-openvswitch\") on node \"crc\" DevicePath \"\""
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.914128 4954 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-host-cni-bin\") on node \"crc\" DevicePath \"\""
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.914141 4954 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.914563 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "c9c365fc-0cba-4fcf-b721-30de2b908a56" (UID: "c9c365fc-0cba-4fcf-b721-30de2b908a56"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.914693 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "c9c365fc-0cba-4fcf-b721-30de2b908a56" (UID: "c9c365fc-0cba-4fcf-b721-30de2b908a56"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.914707 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "c9c365fc-0cba-4fcf-b721-30de2b908a56" (UID: "c9c365fc-0cba-4fcf-b721-30de2b908a56"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.914753 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "c9c365fc-0cba-4fcf-b721-30de2b908a56" (UID: "c9c365fc-0cba-4fcf-b721-30de2b908a56"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.914761 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9c365fc-0cba-4fcf-b721-30de2b908a56-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "c9c365fc-0cba-4fcf-b721-30de2b908a56" (UID: "c9c365fc-0cba-4fcf-b721-30de2b908a56"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.914862 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-node-log" (OuterVolumeSpecName: "node-log") pod "c9c365fc-0cba-4fcf-b721-30de2b908a56" (UID: "c9c365fc-0cba-4fcf-b721-30de2b908a56"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.915179 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9c365fc-0cba-4fcf-b721-30de2b908a56-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "c9c365fc-0cba-4fcf-b721-30de2b908a56" (UID: "c9c365fc-0cba-4fcf-b721-30de2b908a56"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.919398 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9c365fc-0cba-4fcf-b721-30de2b908a56-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "c9c365fc-0cba-4fcf-b721-30de2b908a56" (UID: "c9c365fc-0cba-4fcf-b721-30de2b908a56"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.920014 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9c365fc-0cba-4fcf-b721-30de2b908a56-kube-api-access-27hxv" (OuterVolumeSpecName: "kube-api-access-27hxv") pod "c9c365fc-0cba-4fcf-b721-30de2b908a56" (UID: "c9c365fc-0cba-4fcf-b721-30de2b908a56"). InnerVolumeSpecName "kube-api-access-27hxv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 27 16:51:48 crc kubenswrapper[4954]: I1127 16:51:48.937054 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "c9c365fc-0cba-4fcf-b721-30de2b908a56" (UID: "c9c365fc-0cba-4fcf-b721-30de2b908a56"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.014960 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/794ea4de-6414-40f1-a4b8-f0fa12396780-env-overrides\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.015026 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/794ea4de-6414-40f1-a4b8-f0fa12396780-host-cni-netd\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.015058 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/794ea4de-6414-40f1-a4b8-f0fa12396780-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.015092 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/794ea4de-6414-40f1-a4b8-f0fa12396780-host-kubelet\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.015201 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/794ea4de-6414-40f1-a4b8-f0fa12396780-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.015265 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/794ea4de-6414-40f1-a4b8-f0fa12396780-host-kubelet\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.015309 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/794ea4de-6414-40f1-a4b8-f0fa12396780-ovnkube-config\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.015297 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/794ea4de-6414-40f1-a4b8-f0fa12396780-host-cni-netd\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.015337 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/794ea4de-6414-40f1-a4b8-f0fa12396780-node-log\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.015390 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/794ea4de-6414-40f1-a4b8-f0fa12396780-host-cni-bin\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.015455 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/794ea4de-6414-40f1-a4b8-f0fa12396780-node-log\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.015489 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/794ea4de-6414-40f1-a4b8-f0fa12396780-host-cni-bin\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.015506 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/794ea4de-6414-40f1-a4b8-f0fa12396780-systemd-units\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.015621 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/794ea4de-6414-40f1-a4b8-f0fa12396780-systemd-units\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.015727 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/794ea4de-6414-40f1-a4b8-f0fa12396780-env-overrides\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.015783 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/794ea4de-6414-40f1-a4b8-f0fa12396780-ovnkube-script-lib\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.015804 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/794ea4de-6414-40f1-a4b8-f0fa12396780-host-slash\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.015823 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/794ea4de-6414-40f1-a4b8-f0fa12396780-host-run-netns\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.015841 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/794ea4de-6414-40f1-a4b8-f0fa12396780-etc-openvswitch\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.015859 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/794ea4de-6414-40f1-a4b8-f0fa12396780-run-openvswitch\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.015865 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/794ea4de-6414-40f1-a4b8-f0fa12396780-host-slash\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.015891 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/794ea4de-6414-40f1-a4b8-f0fa12396780-host-run-ovn-kubernetes\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.015921 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/794ea4de-6414-40f1-a4b8-f0fa12396780-run-systemd\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.015924 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/794ea4de-6414-40f1-a4b8-f0fa12396780-etc-openvswitch\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.015942 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/794ea4de-6414-40f1-a4b8-f0fa12396780-run-ovn\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.015960 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/794ea4de-6414-40f1-a4b8-f0fa12396780-run-openvswitch\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.015988 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/794ea4de-6414-40f1-a4b8-f0fa12396780-var-lib-openvswitch\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.015962 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/794ea4de-6414-40f1-a4b8-f0fa12396780-var-lib-openvswitch\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.016026 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/794ea4de-6414-40f1-a4b8-f0fa12396780-log-socket\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.016055 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mvzb\" (UniqueName: \"kubernetes.io/projected/794ea4de-6414-40f1-a4b8-f0fa12396780-kube-api-access-2mvzb\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.016087 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/794ea4de-6414-40f1-a4b8-f0fa12396780-ovn-node-metrics-cert\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.016106 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/794ea4de-6414-40f1-a4b8-f0fa12396780-log-socket\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.015997 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/794ea4de-6414-40f1-a4b8-f0fa12396780-run-systemd\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.016084 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/794ea4de-6414-40f1-a4b8-f0fa12396780-host-run-netns\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.016155 4954 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c9c365fc-0cba-4fcf-b721-30de2b908a56-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.016297 4954 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c9c365fc-0cba-4fcf-b721-30de2b908a56-env-overrides\") on node \"crc\" DevicePath \"\""
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.016328 4954 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-run-ovn\") on node \"crc\" DevicePath \"\""
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.016138 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/794ea4de-6414-40f1-a4b8-f0fa12396780-run-ovn\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.016348 4954 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-node-log\") on node \"crc\" DevicePath \"\""
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.016137 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/794ea4de-6414-40f1-a4b8-f0fa12396780-host-run-ovn-kubernetes\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.016386 4954 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-host-run-netns\") on node \"crc\" DevicePath \"\""
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.016410 4954 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-systemd-units\") on node \"crc\" DevicePath \"\""
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.016441 4954 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.016456 4954 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c9c365fc-0cba-4fcf-b721-30de2b908a56-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.016470 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27hxv\" (UniqueName: \"kubernetes.io/projected/c9c365fc-0cba-4fcf-b721-30de2b908a56-kube-api-access-27hxv\") on node \"crc\" DevicePath \"\""
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.016484 4954 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c9c365fc-0cba-4fcf-b721-30de2b908a56-run-systemd\") on node \"crc\" DevicePath \"\""
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.016875 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/794ea4de-6414-40f1-a4b8-f0fa12396780-ovnkube-config\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78"
Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.017270 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/794ea4de-6414-40f1-a4b8-f0fa12396780-ovnkube-script-lib\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") "
pod="openshift-ovn-kubernetes/ovnkube-node-grg78" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.021662 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/794ea4de-6414-40f1-a4b8-f0fa12396780-ovn-node-metrics-cert\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.035569 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5zbp_c9c365fc-0cba-4fcf-b721-30de2b908a56/ovnkube-controller/3.log" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.038718 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5zbp_c9c365fc-0cba-4fcf-b721-30de2b908a56/ovn-acl-logging/0.log" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.039344 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5zbp_c9c365fc-0cba-4fcf-b721-30de2b908a56/ovn-controller/0.log" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.040013 4954 generic.go:334] "Generic (PLEG): container finished" podID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerID="c247e7205296545100af4336d0c953a09c73efcd735a2e4e9d901ff312eb55a1" exitCode=0 Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.040075 4954 generic.go:334] "Generic (PLEG): container finished" podID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerID="ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8" exitCode=0 Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.040093 4954 generic.go:334] "Generic (PLEG): container finished" podID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerID="19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2" exitCode=0 Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.040116 4954 generic.go:334] "Generic (PLEG): container finished" podID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerID="87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5" exitCode=0 Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.040136 4954 generic.go:334] "Generic (PLEG): container finished" podID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerID="f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5" exitCode=0 Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.040152 4954 generic.go:334] "Generic (PLEG): container finished" podID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerID="625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b" exitCode=0 Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.040167 4954 generic.go:334] "Generic (PLEG): container finished" podID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerID="7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7" exitCode=143 Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.040185 4954 generic.go:334] "Generic (PLEG): container finished" podID="c9c365fc-0cba-4fcf-b721-30de2b908a56" containerID="edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3" exitCode=143 Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.040104 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" event={"ID":"c9c365fc-0cba-4fcf-b721-30de2b908a56","Type":"ContainerDied","Data":"c247e7205296545100af4336d0c953a09c73efcd735a2e4e9d901ff312eb55a1"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 
16:51:49.040327 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" event={"ID":"c9c365fc-0cba-4fcf-b721-30de2b908a56","Type":"ContainerDied","Data":"ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.040364 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" event={"ID":"c9c365fc-0cba-4fcf-b721-30de2b908a56","Type":"ContainerDied","Data":"19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.040390 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" event={"ID":"c9c365fc-0cba-4fcf-b721-30de2b908a56","Type":"ContainerDied","Data":"87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.040415 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" event={"ID":"c9c365fc-0cba-4fcf-b721-30de2b908a56","Type":"ContainerDied","Data":"f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.040441 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" event={"ID":"c9c365fc-0cba-4fcf-b721-30de2b908a56","Type":"ContainerDied","Data":"625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.040466 4954 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"81bb34f2dce67efd76368e55b902d1cded4cf016e3f638b9c5acaf3f00ca2b60"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.040488 4954 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.040505 4954 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.040519 4954 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.040534 4954 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.040552 4954 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.040570 4954 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.040596 4954 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3"} Nov 27 16:51:49 crc 
kubenswrapper[4954]: I1127 16:51:49.040649 4954 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.040667 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" event={"ID":"c9c365fc-0cba-4fcf-b721-30de2b908a56","Type":"ContainerDied","Data":"7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.040692 4954 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c247e7205296545100af4336d0c953a09c73efcd735a2e4e9d901ff312eb55a1"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.040713 4954 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"81bb34f2dce67efd76368e55b902d1cded4cf016e3f638b9c5acaf3f00ca2b60"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.040730 4954 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.040746 4954 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.040762 4954 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.040778 4954 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.040794 4954 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.040816 4954 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.040833 4954 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.040103 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.040877 4954 scope.go:117] "RemoveContainer" containerID="c247e7205296545100af4336d0c953a09c73efcd735a2e4e9d901ff312eb55a1" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.040850 4954 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.041012 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" event={"ID":"c9c365fc-0cba-4fcf-b721-30de2b908a56","Type":"ContainerDied","Data":"edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.041039 4954 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c247e7205296545100af4336d0c953a09c73efcd735a2e4e9d901ff312eb55a1"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.041053 4954 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"81bb34f2dce67efd76368e55b902d1cded4cf016e3f638b9c5acaf3f00ca2b60"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.041060 4954 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.041066 4954 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.041072 4954 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.041078 4954 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.041084 4954 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.041090 4954 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.041095 4954 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.041102 4954 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.041111 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5zbp" 
event={"ID":"c9c365fc-0cba-4fcf-b721-30de2b908a56","Type":"ContainerDied","Data":"afa426086e9bfd2dbd7ad9acc345a36a4cbd56b5b0ee0a2397298f86ce0d7d69"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.041121 4954 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c247e7205296545100af4336d0c953a09c73efcd735a2e4e9d901ff312eb55a1"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.041129 4954 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"81bb34f2dce67efd76368e55b902d1cded4cf016e3f638b9c5acaf3f00ca2b60"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.041134 4954 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.041140 4954 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.041146 4954 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.041154 4954 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.041160 4954 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.041166 4954 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.041172 4954 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.041179 4954 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.043105 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9mb96_c5bda3ef-ba2c-424a-ba4a-432053d1c40d/kube-multus/2.log" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.043868 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9mb96_c5bda3ef-ba2c-424a-ba4a-432053d1c40d/kube-multus/1.log" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.043914 4954 generic.go:334] "Generic (PLEG): container finished" podID="c5bda3ef-ba2c-424a-ba4a-432053d1c40d" containerID="34f4a3bb92c39c5db5b427259524720518191fb6e9a74d427133a9d815df637d" exitCode=2 Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.043938 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9mb96" 
event={"ID":"c5bda3ef-ba2c-424a-ba4a-432053d1c40d","Type":"ContainerDied","Data":"34f4a3bb92c39c5db5b427259524720518191fb6e9a74d427133a9d815df637d"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.043952 4954 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bcc3a6be3f2d6a2d8da09fab1320b33b7c36e0c403916e155274997bcb03c884"} Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.044558 4954 scope.go:117] "RemoveContainer" containerID="34f4a3bb92c39c5db5b427259524720518191fb6e9a74d427133a9d815df637d" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.050586 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mvzb\" (UniqueName: \"kubernetes.io/projected/794ea4de-6414-40f1-a4b8-f0fa12396780-kube-api-access-2mvzb\") pod \"ovnkube-node-grg78\" (UID: \"794ea4de-6414-40f1-a4b8-f0fa12396780\") " pod="openshift-ovn-kubernetes/ovnkube-node-grg78" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.071341 4954 scope.go:117] "RemoveContainer" containerID="81bb34f2dce67efd76368e55b902d1cded4cf016e3f638b9c5acaf3f00ca2b60" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.131744 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-d5zbp"] Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.133710 4954 scope.go:117] "RemoveContainer" containerID="ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.143758 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-d5zbp"] Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.167918 4954 scope.go:117] "RemoveContainer" containerID="19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.174916 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-grg78" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.197924 4954 scope.go:117] "RemoveContainer" containerID="87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5" Nov 27 16:51:49 crc kubenswrapper[4954]: W1127 16:51:49.215986 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod794ea4de_6414_40f1_a4b8_f0fa12396780.slice/crio-e4c89e4ee7e115578e40bc152958b6f8e9e6fea6758960049e5a01f91b8423b8 WatchSource:0}: Error finding container e4c89e4ee7e115578e40bc152958b6f8e9e6fea6758960049e5a01f91b8423b8: Status 404 returned error can't find the container with id e4c89e4ee7e115578e40bc152958b6f8e9e6fea6758960049e5a01f91b8423b8 Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.222699 4954 scope.go:117] "RemoveContainer" containerID="f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.251366 4954 scope.go:117] "RemoveContainer" containerID="625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.315454 4954 scope.go:117] "RemoveContainer" containerID="7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.360016 4954 scope.go:117] "RemoveContainer" containerID="edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.380943 4954 scope.go:117] "RemoveContainer" containerID="7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.398702 4954 scope.go:117] "RemoveContainer" containerID="c247e7205296545100af4336d0c953a09c73efcd735a2e4e9d901ff312eb55a1" Nov 27 16:51:49 crc kubenswrapper[4954]: E1127 16:51:49.399346 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c247e7205296545100af4336d0c953a09c73efcd735a2e4e9d901ff312eb55a1\": container with ID starting with c247e7205296545100af4336d0c953a09c73efcd735a2e4e9d901ff312eb55a1 not found: ID does not exist" containerID="c247e7205296545100af4336d0c953a09c73efcd735a2e4e9d901ff312eb55a1" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.399422 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c247e7205296545100af4336d0c953a09c73efcd735a2e4e9d901ff312eb55a1"} err="failed to get container status \"c247e7205296545100af4336d0c953a09c73efcd735a2e4e9d901ff312eb55a1\": rpc error: code = NotFound desc = could not find container \"c247e7205296545100af4336d0c953a09c73efcd735a2e4e9d901ff312eb55a1\": container with ID starting with c247e7205296545100af4336d0c953a09c73efcd735a2e4e9d901ff312eb55a1 not found: ID does not exist" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.399469 4954 scope.go:117] "RemoveContainer" containerID="81bb34f2dce67efd76368e55b902d1cded4cf016e3f638b9c5acaf3f00ca2b60" Nov 27 16:51:49 crc kubenswrapper[4954]: E1127 16:51:49.399999 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81bb34f2dce67efd76368e55b902d1cded4cf016e3f638b9c5acaf3f00ca2b60\": container with ID starting with 81bb34f2dce67efd76368e55b902d1cded4cf016e3f638b9c5acaf3f00ca2b60 not found: ID does not exist" 
containerID="81bb34f2dce67efd76368e55b902d1cded4cf016e3f638b9c5acaf3f00ca2b60" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.400039 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81bb34f2dce67efd76368e55b902d1cded4cf016e3f638b9c5acaf3f00ca2b60"} err="failed to get container status \"81bb34f2dce67efd76368e55b902d1cded4cf016e3f638b9c5acaf3f00ca2b60\": rpc error: code = NotFound desc = could not find container \"81bb34f2dce67efd76368e55b902d1cded4cf016e3f638b9c5acaf3f00ca2b60\": container with ID starting with 81bb34f2dce67efd76368e55b902d1cded4cf016e3f638b9c5acaf3f00ca2b60 not found: ID does not exist" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.400066 4954 scope.go:117] "RemoveContainer" containerID="ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8" Nov 27 16:51:49 crc kubenswrapper[4954]: E1127 16:51:49.400442 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8\": container with ID starting with ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8 not found: ID does not exist" containerID="ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.400496 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8"} err="failed to get container status \"ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8\": rpc error: code = NotFound desc = could not find container \"ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8\": container with ID starting with ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8 not found: ID does not exist" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.400536 4954 scope.go:117] "RemoveContainer" containerID="19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2" Nov 27 16:51:49 crc kubenswrapper[4954]: E1127 16:51:49.401043 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2\": container with ID starting with 19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2 not found: ID does not exist" containerID="19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.401116 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2"} err="failed to get container status \"19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2\": rpc error: code = NotFound desc = could not find container \"19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2\": container with ID starting with 19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2 not found: ID does not exist" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.401164 4954 scope.go:117] "RemoveContainer" containerID="87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5" Nov 27 16:51:49 crc kubenswrapper[4954]: E1127 16:51:49.401668 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5\": container with ID starting with 87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5 not found: ID does not exist" containerID="87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.401700 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5"} err="failed to get container status \"87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5\": rpc error: code = NotFound desc = could not find container \"87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5\": container with ID starting with 87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5 not found: ID does not exist" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.401718 4954 scope.go:117] "RemoveContainer" containerID="f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5" Nov 27 16:51:49 crc kubenswrapper[4954]: E1127 16:51:49.403419 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5\": container with ID starting with f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5 not found: ID does not exist" containerID="f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.403444 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5"} err="failed to get container status \"f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5\": rpc error: code = NotFound desc = could not find container \"f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5\": container with ID starting with f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5 not found: ID does not exist" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.403459 4954 scope.go:117] "RemoveContainer" containerID="625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b" Nov 27 16:51:49 crc kubenswrapper[4954]: E1127 16:51:49.403815 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b\": container with ID starting with 625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b not found: ID does not exist" containerID="625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.403853 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b"} err="failed to get container status \"625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b\": rpc error: code = NotFound desc = could not find container \"625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b\": container with ID starting with 625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b not found: ID does not exist" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.403879 4954 scope.go:117] "RemoveContainer" containerID="7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7" Nov 27 16:51:49 crc 
kubenswrapper[4954]: E1127 16:51:49.405949 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7\": container with ID starting with 7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7 not found: ID does not exist" containerID="7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.406019 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7"} err="failed to get container status \"7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7\": rpc error: code = NotFound desc = could not find container \"7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7\": container with ID starting with 7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7 not found: ID does not exist" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.406070 4954 scope.go:117] "RemoveContainer" containerID="edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3" Nov 27 16:51:49 crc kubenswrapper[4954]: E1127 16:51:49.407952 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3\": container with ID starting with edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3 not found: ID does not exist" containerID="edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.408013 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3"} err="failed to get container status \"edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3\": rpc error: code = NotFound desc = could not find container \"edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3\": container with ID starting with edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3 not found: ID does not exist" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.408057 4954 scope.go:117] "RemoveContainer" containerID="7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e" Nov 27 16:51:49 crc kubenswrapper[4954]: E1127 16:51:49.408449 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\": container with ID starting with 7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e not found: ID does not exist" containerID="7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.408497 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e"} err="failed to get container status \"7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\": rpc error: code = NotFound desc = could not find container \"7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\": container with ID starting with 7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e not found: ID does not exist" Nov 27 16:51:49 crc kubenswrapper[4954]: 
I1127 16:51:49.408525 4954 scope.go:117] "RemoveContainer" containerID="c247e7205296545100af4336d0c953a09c73efcd735a2e4e9d901ff312eb55a1" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.410134 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c247e7205296545100af4336d0c953a09c73efcd735a2e4e9d901ff312eb55a1"} err="failed to get container status \"c247e7205296545100af4336d0c953a09c73efcd735a2e4e9d901ff312eb55a1\": rpc error: code = NotFound desc = could not find container \"c247e7205296545100af4336d0c953a09c73efcd735a2e4e9d901ff312eb55a1\": container with ID starting with c247e7205296545100af4336d0c953a09c73efcd735a2e4e9d901ff312eb55a1 not found: ID does not exist" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.410163 4954 scope.go:117] "RemoveContainer" containerID="81bb34f2dce67efd76368e55b902d1cded4cf016e3f638b9c5acaf3f00ca2b60" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.410449 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81bb34f2dce67efd76368e55b902d1cded4cf016e3f638b9c5acaf3f00ca2b60"} err="failed to get container status \"81bb34f2dce67efd76368e55b902d1cded4cf016e3f638b9c5acaf3f00ca2b60\": rpc error: code = NotFound desc = could not find container \"81bb34f2dce67efd76368e55b902d1cded4cf016e3f638b9c5acaf3f00ca2b60\": container with ID starting with 81bb34f2dce67efd76368e55b902d1cded4cf016e3f638b9c5acaf3f00ca2b60 not found: ID does not exist" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.410476 4954 scope.go:117] "RemoveContainer" containerID="ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.410752 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8"} err="failed to get container status \"ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8\": rpc error: code = NotFound desc = could not find container \"ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8\": container with ID starting with ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8 not found: ID does not exist" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.410775 4954 scope.go:117] "RemoveContainer" containerID="19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.411031 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2"} err="failed to get container status \"19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2\": rpc error: code = NotFound desc = could not find container \"19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2\": container with ID starting with 19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2 not found: ID does not exist" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.411049 4954 scope.go:117] "RemoveContainer" containerID="87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.411283 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5"} err="failed to get container status 
\"87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5\": rpc error: code = NotFound desc = could not find container \"87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5\": container with ID starting with 87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5 not found: ID does not exist" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.411305 4954 scope.go:117] "RemoveContainer" containerID="f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.411553 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5"} err="failed to get container status \"f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5\": rpc error: code = NotFound desc = could not find container \"f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5\": container with ID starting with f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5 not found: ID does not exist" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.411566 4954 scope.go:117] "RemoveContainer" containerID="625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.411749 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b"} err="failed to get container status \"625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b\": rpc error: code = NotFound desc = could not find container \"625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b\": container with ID starting with 625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b not found: ID does not exist" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.411763 4954 scope.go:117] "RemoveContainer" containerID="7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.411953 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7"} err="failed to get container status \"7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7\": rpc error: code = NotFound desc = could not find container \"7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7\": container with ID starting with 7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7 not found: ID does not exist" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.411969 4954 scope.go:117] "RemoveContainer" containerID="edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.412412 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3"} err="failed to get container status \"edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3\": rpc error: code = NotFound desc = could not find container \"edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3\": container with ID starting with edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3 not found: ID does not exist" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.412436 4954 scope.go:117] "RemoveContainer" 
containerID="7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.412748 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e"} err="failed to get container status \"7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\": rpc error: code = NotFound desc = could not find container \"7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\": container with ID starting with 7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e not found: ID does not exist" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.412770 4954 scope.go:117] "RemoveContainer" containerID="c247e7205296545100af4336d0c953a09c73efcd735a2e4e9d901ff312eb55a1" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.413101 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c247e7205296545100af4336d0c953a09c73efcd735a2e4e9d901ff312eb55a1"} err="failed to get container status \"c247e7205296545100af4336d0c953a09c73efcd735a2e4e9d901ff312eb55a1\": rpc error: code = NotFound desc = could not find container \"c247e7205296545100af4336d0c953a09c73efcd735a2e4e9d901ff312eb55a1\": container with ID starting with c247e7205296545100af4336d0c953a09c73efcd735a2e4e9d901ff312eb55a1 not found: ID does not exist" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.413122 4954 scope.go:117] "RemoveContainer" containerID="81bb34f2dce67efd76368e55b902d1cded4cf016e3f638b9c5acaf3f00ca2b60" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.413417 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81bb34f2dce67efd76368e55b902d1cded4cf016e3f638b9c5acaf3f00ca2b60"} err="failed to get container status \"81bb34f2dce67efd76368e55b902d1cded4cf016e3f638b9c5acaf3f00ca2b60\": rpc error: code = NotFound desc = could not find container \"81bb34f2dce67efd76368e55b902d1cded4cf016e3f638b9c5acaf3f00ca2b60\": container with ID starting with 81bb34f2dce67efd76368e55b902d1cded4cf016e3f638b9c5acaf3f00ca2b60 not found: ID does not exist" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.413451 4954 scope.go:117] "RemoveContainer" containerID="ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.413840 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8"} err="failed to get container status \"ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8\": rpc error: code = NotFound desc = could not find container \"ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8\": container with ID starting with ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8 not found: ID does not exist" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.413886 4954 scope.go:117] "RemoveContainer" containerID="19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.414741 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2"} err="failed to get container status \"19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2\": rpc error: code = NotFound desc = could not find 
container \"19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2\": container with ID starting with 19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2 not found: ID does not exist" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.414783 4954 scope.go:117] "RemoveContainer" containerID="87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.415128 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5"} err="failed to get container status \"87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5\": rpc error: code = NotFound desc = could not find container \"87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5\": container with ID starting with 87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5 not found: ID does not exist" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.415169 4954 scope.go:117] "RemoveContainer" containerID="f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.415536 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5"} err="failed to get container status \"f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5\": rpc error: code = NotFound desc = could not find container \"f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5\": container with ID starting with f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5 not found: ID does not exist" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.415575 4954 scope.go:117] "RemoveContainer" containerID="625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.416189 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b"} err="failed to get container status \"625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b\": rpc error: code = NotFound desc = could not find container \"625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b\": container with ID starting with 625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b not found: ID does not exist" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.416227 4954 scope.go:117] "RemoveContainer" containerID="7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.417185 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7"} err="failed to get container status \"7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7\": rpc error: code = NotFound desc = could not find container \"7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7\": container with ID starting with 7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7 not found: ID does not exist" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.417230 4954 scope.go:117] "RemoveContainer" containerID="edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.418097 4954 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3"} err="failed to get container status \"edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3\": rpc error: code = NotFound desc = could not find container \"edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3\": container with ID starting with edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3 not found: ID does not exist" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.418139 4954 scope.go:117] "RemoveContainer" containerID="7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.418483 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e"} err="failed to get container status \"7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\": rpc error: code = NotFound desc = could not find container \"7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\": container with ID starting with 7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e not found: ID does not exist" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.418521 4954 scope.go:117] "RemoveContainer" containerID="c247e7205296545100af4336d0c953a09c73efcd735a2e4e9d901ff312eb55a1" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.418790 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c247e7205296545100af4336d0c953a09c73efcd735a2e4e9d901ff312eb55a1"} err="failed to get container status \"c247e7205296545100af4336d0c953a09c73efcd735a2e4e9d901ff312eb55a1\": rpc error: code = NotFound desc = could not find container \"c247e7205296545100af4336d0c953a09c73efcd735a2e4e9d901ff312eb55a1\": container with ID starting with c247e7205296545100af4336d0c953a09c73efcd735a2e4e9d901ff312eb55a1 not found: ID does not exist" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.418827 4954 scope.go:117] "RemoveContainer" containerID="81bb34f2dce67efd76368e55b902d1cded4cf016e3f638b9c5acaf3f00ca2b60" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.419080 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81bb34f2dce67efd76368e55b902d1cded4cf016e3f638b9c5acaf3f00ca2b60"} err="failed to get container status \"81bb34f2dce67efd76368e55b902d1cded4cf016e3f638b9c5acaf3f00ca2b60\": rpc error: code = NotFound desc = could not find container \"81bb34f2dce67efd76368e55b902d1cded4cf016e3f638b9c5acaf3f00ca2b60\": container with ID starting with 81bb34f2dce67efd76368e55b902d1cded4cf016e3f638b9c5acaf3f00ca2b60 not found: ID does not exist" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.419117 4954 scope.go:117] "RemoveContainer" containerID="ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.419343 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8"} err="failed to get container status \"ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8\": rpc error: code = NotFound desc = could not find container \"ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8\": container with ID starting with 
ffb814f23f93f625afae8c1e1ae42910e8b49b8318ca6ad89dcda5405b0aa4d8 not found: ID does not exist" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.419376 4954 scope.go:117] "RemoveContainer" containerID="19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.419632 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2"} err="failed to get container status \"19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2\": rpc error: code = NotFound desc = could not find container \"19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2\": container with ID starting with 19c2059add2d8eb7f8ceb70f39ed61fc41ab94e45726c245bdb33539b9c0bad2 not found: ID does not exist" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.419666 4954 scope.go:117] "RemoveContainer" containerID="87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.419940 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5"} err="failed to get container status \"87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5\": rpc error: code = NotFound desc = could not find container \"87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5\": container with ID starting with 87d2a186cc438dac69fb50bc1c179bfc59289d29ce7874c3d54923a1922a5af5 not found: ID does not exist" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.419966 4954 scope.go:117] "RemoveContainer" containerID="f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.420198 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5"} err="failed to get container status \"f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5\": rpc error: code = NotFound desc = could not find container \"f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5\": container with ID starting with f01e7ab54a9f700ed214fad0501d4540e6b8ef5a22fae4f383e1fec2a79625b5 not found: ID does not exist" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.420236 4954 scope.go:117] "RemoveContainer" containerID="625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.420588 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b"} err="failed to get container status \"625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b\": rpc error: code = NotFound desc = could not find container \"625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b\": container with ID starting with 625fc1591b995ba0dcccbe1c584ac7eea3f60569f50dcb7ff429c9e7284c6b0b not found: ID does not exist" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.420638 4954 scope.go:117] "RemoveContainer" containerID="7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.421449 4954 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7"} err="failed to get container status \"7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7\": rpc error: code = NotFound desc = could not find container \"7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7\": container with ID starting with 7f88e138975daac19088256b0a10dbfe7e32dac4c055bca05385283ab8fb06b7 not found: ID does not exist" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.421506 4954 scope.go:117] "RemoveContainer" containerID="edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.421892 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3"} err="failed to get container status \"edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3\": rpc error: code = NotFound desc = could not find container \"edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3\": container with ID starting with edb42354e3064944e92d938e1834d92094faa5c36a7e35aef5761228dba17ce3 not found: ID does not exist" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.421933 4954 scope.go:117] "RemoveContainer" containerID="7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.422191 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e"} err="failed to get container status \"7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\": rpc error: code = NotFound desc = could not find container \"7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e\": container with ID starting with 7487b56c170f17879145b50feb45f41b7489a0de1c92f918c46035d2aa4e827e not found: ID does not exist" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.422224 4954 scope.go:117] "RemoveContainer" containerID="c247e7205296545100af4336d0c953a09c73efcd735a2e4e9d901ff312eb55a1" Nov 27 16:51:49 crc kubenswrapper[4954]: I1127 16:51:49.422470 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c247e7205296545100af4336d0c953a09c73efcd735a2e4e9d901ff312eb55a1"} err="failed to get container status \"c247e7205296545100af4336d0c953a09c73efcd735a2e4e9d901ff312eb55a1\": rpc error: code = NotFound desc = could not find container \"c247e7205296545100af4336d0c953a09c73efcd735a2e4e9d901ff312eb55a1\": container with ID starting with c247e7205296545100af4336d0c953a09c73efcd735a2e4e9d901ff312eb55a1 not found: ID does not exist" Nov 27 16:51:50 crc kubenswrapper[4954]: I1127 16:51:50.057763 4954 generic.go:334] "Generic (PLEG): container finished" podID="794ea4de-6414-40f1-a4b8-f0fa12396780" containerID="a64a2e43ad30b22fed735018ab0bb3efd15255a00d3be5478851516b8be58a9f" exitCode=0 Nov 27 16:51:50 crc kubenswrapper[4954]: I1127 16:51:50.057844 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-grg78" event={"ID":"794ea4de-6414-40f1-a4b8-f0fa12396780","Type":"ContainerDied","Data":"a64a2e43ad30b22fed735018ab0bb3efd15255a00d3be5478851516b8be58a9f"} Nov 27 16:51:50 crc kubenswrapper[4954]: I1127 16:51:50.058671 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-grg78" 
event={"ID":"794ea4de-6414-40f1-a4b8-f0fa12396780","Type":"ContainerStarted","Data":"e4c89e4ee7e115578e40bc152958b6f8e9e6fea6758960049e5a01f91b8423b8"} Nov 27 16:51:50 crc kubenswrapper[4954]: I1127 16:51:50.063205 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9mb96_c5bda3ef-ba2c-424a-ba4a-432053d1c40d/kube-multus/2.log" Nov 27 16:51:50 crc kubenswrapper[4954]: I1127 16:51:50.063894 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9mb96_c5bda3ef-ba2c-424a-ba4a-432053d1c40d/kube-multus/1.log" Nov 27 16:51:50 crc kubenswrapper[4954]: I1127 16:51:50.063957 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9mb96" event={"ID":"c5bda3ef-ba2c-424a-ba4a-432053d1c40d","Type":"ContainerStarted","Data":"cd83992f37e88f0273740b8883906b6438916317bd146ea872e75f452375d585"} Nov 27 16:51:50 crc kubenswrapper[4954]: I1127 16:51:50.670743 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9c365fc-0cba-4fcf-b721-30de2b908a56" path="/var/lib/kubelet/pods/c9c365fc-0cba-4fcf-b721-30de2b908a56/volumes" Nov 27 16:51:51 crc kubenswrapper[4954]: I1127 16:51:51.077243 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-grg78" event={"ID":"794ea4de-6414-40f1-a4b8-f0fa12396780","Type":"ContainerStarted","Data":"d1ed2fa3da3edba91dcc820b92448813b0fd2a64bf8b8760c685e9a9296a51e0"} Nov 27 16:51:51 crc kubenswrapper[4954]: I1127 16:51:51.077794 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-grg78" event={"ID":"794ea4de-6414-40f1-a4b8-f0fa12396780","Type":"ContainerStarted","Data":"ccce74baf5425fec38f538b07a67ab66fd77a52c04c9a20a573d8c3080336ff8"} Nov 27 16:51:51 crc kubenswrapper[4954]: I1127 16:51:51.077823 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-grg78" event={"ID":"794ea4de-6414-40f1-a4b8-f0fa12396780","Type":"ContainerStarted","Data":"60068480b1a24113c7b1bc75038b7e5c20e99cb322dc3f4f3c599b7bbb148022"} Nov 27 16:51:51 crc kubenswrapper[4954]: I1127 16:51:51.077844 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-grg78" event={"ID":"794ea4de-6414-40f1-a4b8-f0fa12396780","Type":"ContainerStarted","Data":"2addfb97b4d95abfdb113d2e1089c88855d184ed9b4c5181514473ae3ff19f01"} Nov 27 16:51:51 crc kubenswrapper[4954]: I1127 16:51:51.077865 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-grg78" event={"ID":"794ea4de-6414-40f1-a4b8-f0fa12396780","Type":"ContainerStarted","Data":"e217d762b1d9b8d48746c92d0bae976f02535c599d7ce3a7c280aec2096bbff4"} Nov 27 16:51:51 crc kubenswrapper[4954]: I1127 16:51:51.077883 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-grg78" event={"ID":"794ea4de-6414-40f1-a4b8-f0fa12396780","Type":"ContainerStarted","Data":"caa4a9e59758e9b6298a9540ca6e6cef1503031d06303f8a9dd2d357577b7f82"} Nov 27 16:51:53 crc kubenswrapper[4954]: I1127 16:51:53.688260 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:51:53 crc kubenswrapper[4954]: I1127 16:51:53.688351 4954 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 16:51:53 crc kubenswrapper[4954]: I1127 16:51:53.688420 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-699qq" Nov 27 16:51:53 crc kubenswrapper[4954]: I1127 16:51:53.689144 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f253421b54ffaa5b8245af0010b5935a685f03c65cd9227baccbf0b03f627cdd"} pod="openshift-machine-config-operator/machine-config-daemon-699qq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 27 16:51:53 crc kubenswrapper[4954]: I1127 16:51:53.689218 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" containerID="cri-o://f253421b54ffaa5b8245af0010b5935a685f03c65cd9227baccbf0b03f627cdd" gracePeriod=600 Nov 27 16:51:54 crc kubenswrapper[4954]: I1127 16:51:54.108036 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-grg78" event={"ID":"794ea4de-6414-40f1-a4b8-f0fa12396780","Type":"ContainerStarted","Data":"11ca2be0a5681dfff60d6b6d2a184324ae5de22a160508531ecdaca6b87284e5"} Nov 27 16:51:54 crc kubenswrapper[4954]: I1127 16:51:54.114451 4954 generic.go:334] "Generic (PLEG): container finished" podID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerID="f253421b54ffaa5b8245af0010b5935a685f03c65cd9227baccbf0b03f627cdd" exitCode=0 Nov 27 16:51:54 crc kubenswrapper[4954]: I1127 16:51:54.114507 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" event={"ID":"33a80574-7c60-4f19-985b-3ee313cb7bcd","Type":"ContainerDied","Data":"f253421b54ffaa5b8245af0010b5935a685f03c65cd9227baccbf0b03f627cdd"} Nov 27 16:51:54 crc kubenswrapper[4954]: I1127 16:51:54.114537 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" event={"ID":"33a80574-7c60-4f19-985b-3ee313cb7bcd","Type":"ContainerStarted","Data":"6a54903c8c633a0f68f9dab4e62025f22307496e9e210ed0a72c63ab1c8cd13b"} Nov 27 16:51:54 crc kubenswrapper[4954]: I1127 16:51:54.114566 4954 scope.go:117] "RemoveContainer" containerID="81c634e87acc9a6979f7829ea3fe2fc04e26098031020602e4626c4eb40e2aad" Nov 27 16:51:56 crc kubenswrapper[4954]: I1127 16:51:56.138699 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-grg78" event={"ID":"794ea4de-6414-40f1-a4b8-f0fa12396780","Type":"ContainerStarted","Data":"0c6a873bc10e25b826282d7e0e7cfe038b1fecaac8a5cb1a5591a1fc54e3fa58"} Nov 27 16:51:56 crc kubenswrapper[4954]: I1127 16:51:56.139617 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-grg78" Nov 27 16:51:56 crc kubenswrapper[4954]: I1127 16:51:56.139636 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-grg78" Nov 27 16:51:56 crc kubenswrapper[4954]: I1127 16:51:56.139647 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-ovn-kubernetes/ovnkube-node-grg78" Nov 27 16:51:56 crc kubenswrapper[4954]: I1127 16:51:56.178422 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-grg78" Nov 27 16:51:56 crc kubenswrapper[4954]: I1127 16:51:56.179399 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-grg78" Nov 27 16:51:56 crc kubenswrapper[4954]: I1127 16:51:56.191822 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-grg78" podStartSLOduration=8.191790751 podStartE2EDuration="8.191790751s" podCreationTimestamp="2025-11-27 16:51:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:51:56.1836816 +0000 UTC m=+828.201121900" watchObservedRunningTime="2025-11-27 16:51:56.191790751 +0000 UTC m=+828.209231081" Nov 27 16:52:09 crc kubenswrapper[4954]: I1127 16:52:09.142937 4954 scope.go:117] "RemoveContainer" containerID="bcc3a6be3f2d6a2d8da09fab1320b33b7c36e0c403916e155274997bcb03c884" Nov 27 16:52:09 crc kubenswrapper[4954]: I1127 16:52:09.289282 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9mb96_c5bda3ef-ba2c-424a-ba4a-432053d1c40d/kube-multus/2.log" Nov 27 16:52:19 crc kubenswrapper[4954]: I1127 16:52:19.212860 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-grg78" Nov 27 16:52:29 crc kubenswrapper[4954]: I1127 16:52:29.317030 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwh9gp"] Nov 27 16:52:29 crc kubenswrapper[4954]: I1127 16:52:29.318989 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwh9gp" Nov 27 16:52:29 crc kubenswrapper[4954]: I1127 16:52:29.321344 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 27 16:52:29 crc kubenswrapper[4954]: I1127 16:52:29.335100 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwh9gp"] Nov 27 16:52:29 crc kubenswrapper[4954]: I1127 16:52:29.463724 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bc86b0e3-7ca2-40a1-b559-e74733db90f0-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwh9gp\" (UID: \"bc86b0e3-7ca2-40a1-b559-e74733db90f0\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwh9gp" Nov 27 16:52:29 crc kubenswrapper[4954]: I1127 16:52:29.463781 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bc86b0e3-7ca2-40a1-b559-e74733db90f0-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwh9gp\" (UID: \"bc86b0e3-7ca2-40a1-b559-e74733db90f0\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwh9gp" Nov 27 16:52:29 crc kubenswrapper[4954]: I1127 16:52:29.463845 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prn8q\" (UniqueName: \"kubernetes.io/projected/bc86b0e3-7ca2-40a1-b559-e74733db90f0-kube-api-access-prn8q\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwh9gp\" (UID: \"bc86b0e3-7ca2-40a1-b559-e74733db90f0\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwh9gp" Nov 27 16:52:29 crc kubenswrapper[4954]: I1127 16:52:29.565203 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bc86b0e3-7ca2-40a1-b559-e74733db90f0-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwh9gp\" (UID: \"bc86b0e3-7ca2-40a1-b559-e74733db90f0\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwh9gp" Nov 27 16:52:29 crc kubenswrapper[4954]: I1127 16:52:29.565275 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bc86b0e3-7ca2-40a1-b559-e74733db90f0-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwh9gp\" (UID: \"bc86b0e3-7ca2-40a1-b559-e74733db90f0\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwh9gp" Nov 27 16:52:29 crc kubenswrapper[4954]: I1127 16:52:29.565345 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prn8q\" (UniqueName: \"kubernetes.io/projected/bc86b0e3-7ca2-40a1-b559-e74733db90f0-kube-api-access-prn8q\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwh9gp\" (UID: \"bc86b0e3-7ca2-40a1-b559-e74733db90f0\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwh9gp" Nov 27 16:52:29 crc kubenswrapper[4954]: I1127 16:52:29.566669 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/bc86b0e3-7ca2-40a1-b559-e74733db90f0-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwh9gp\" (UID: \"bc86b0e3-7ca2-40a1-b559-e74733db90f0\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwh9gp" Nov 27 16:52:29 crc kubenswrapper[4954]: I1127 16:52:29.566746 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bc86b0e3-7ca2-40a1-b559-e74733db90f0-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwh9gp\" (UID: \"bc86b0e3-7ca2-40a1-b559-e74733db90f0\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwh9gp" Nov 27 16:52:29 crc kubenswrapper[4954]: I1127 16:52:29.591506 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prn8q\" (UniqueName: \"kubernetes.io/projected/bc86b0e3-7ca2-40a1-b559-e74733db90f0-kube-api-access-prn8q\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwh9gp\" (UID: \"bc86b0e3-7ca2-40a1-b559-e74733db90f0\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwh9gp" Nov 27 16:52:29 crc kubenswrapper[4954]: I1127 16:52:29.636727 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwh9gp" Nov 27 16:52:29 crc kubenswrapper[4954]: I1127 16:52:29.911500 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwh9gp"] Nov 27 16:52:30 crc kubenswrapper[4954]: I1127 16:52:30.390492 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8lmnl"] Nov 27 16:52:30 crc kubenswrapper[4954]: I1127 16:52:30.393755 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8lmnl" Nov 27 16:52:30 crc kubenswrapper[4954]: I1127 16:52:30.405423 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8lmnl"] Nov 27 16:52:30 crc kubenswrapper[4954]: I1127 16:52:30.472294 4954 generic.go:334] "Generic (PLEG): container finished" podID="bc86b0e3-7ca2-40a1-b559-e74733db90f0" containerID="81672a6957914fde4e5a86a6023306b0e48b99797754fa3db8b6cc99667a2cb9" exitCode=0 Nov 27 16:52:30 crc kubenswrapper[4954]: I1127 16:52:30.472350 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwh9gp" event={"ID":"bc86b0e3-7ca2-40a1-b559-e74733db90f0","Type":"ContainerDied","Data":"81672a6957914fde4e5a86a6023306b0e48b99797754fa3db8b6cc99667a2cb9"} Nov 27 16:52:30 crc kubenswrapper[4954]: I1127 16:52:30.472382 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwh9gp" event={"ID":"bc86b0e3-7ca2-40a1-b559-e74733db90f0","Type":"ContainerStarted","Data":"4d886282301366c529b8a023f8cba60c3e9d7dc2d1b284f4e4cd37655b06f222"} Nov 27 16:52:30 crc kubenswrapper[4954]: I1127 16:52:30.500196 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d3df697-c583-4edb-b446-47180f1c212b-catalog-content\") pod \"redhat-operators-8lmnl\" (UID: \"0d3df697-c583-4edb-b446-47180f1c212b\") " pod="openshift-marketplace/redhat-operators-8lmnl" Nov 27 16:52:30 crc kubenswrapper[4954]: I1127 16:52:30.500277 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d3df697-c583-4edb-b446-47180f1c212b-utilities\") pod \"redhat-operators-8lmnl\" (UID: \"0d3df697-c583-4edb-b446-47180f1c212b\") " pod="openshift-marketplace/redhat-operators-8lmnl" Nov 27 16:52:30 crc kubenswrapper[4954]: I1127 16:52:30.500318 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wjv9\" (UniqueName: \"kubernetes.io/projected/0d3df697-c583-4edb-b446-47180f1c212b-kube-api-access-4wjv9\") pod \"redhat-operators-8lmnl\" (UID: \"0d3df697-c583-4edb-b446-47180f1c212b\") " pod="openshift-marketplace/redhat-operators-8lmnl" Nov 27 16:52:30 crc kubenswrapper[4954]: I1127 16:52:30.601980 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d3df697-c583-4edb-b446-47180f1c212b-catalog-content\") pod \"redhat-operators-8lmnl\" (UID: \"0d3df697-c583-4edb-b446-47180f1c212b\") " pod="openshift-marketplace/redhat-operators-8lmnl" Nov 27 16:52:30 crc kubenswrapper[4954]: I1127 16:52:30.602099 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d3df697-c583-4edb-b446-47180f1c212b-utilities\") pod \"redhat-operators-8lmnl\" (UID: \"0d3df697-c583-4edb-b446-47180f1c212b\") " pod="openshift-marketplace/redhat-operators-8lmnl" Nov 27 16:52:30 crc kubenswrapper[4954]: I1127 16:52:30.602181 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wjv9\" (UniqueName: \"kubernetes.io/projected/0d3df697-c583-4edb-b446-47180f1c212b-kube-api-access-4wjv9\") pod \"redhat-operators-8lmnl\" (UID: 
\"0d3df697-c583-4edb-b446-47180f1c212b\") " pod="openshift-marketplace/redhat-operators-8lmnl" Nov 27 16:52:30 crc kubenswrapper[4954]: I1127 16:52:30.602777 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d3df697-c583-4edb-b446-47180f1c212b-catalog-content\") pod \"redhat-operators-8lmnl\" (UID: \"0d3df697-c583-4edb-b446-47180f1c212b\") " pod="openshift-marketplace/redhat-operators-8lmnl" Nov 27 16:52:30 crc kubenswrapper[4954]: I1127 16:52:30.602845 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d3df697-c583-4edb-b446-47180f1c212b-utilities\") pod \"redhat-operators-8lmnl\" (UID: \"0d3df697-c583-4edb-b446-47180f1c212b\") " pod="openshift-marketplace/redhat-operators-8lmnl" Nov 27 16:52:30 crc kubenswrapper[4954]: I1127 16:52:30.642565 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wjv9\" (UniqueName: \"kubernetes.io/projected/0d3df697-c583-4edb-b446-47180f1c212b-kube-api-access-4wjv9\") pod \"redhat-operators-8lmnl\" (UID: \"0d3df697-c583-4edb-b446-47180f1c212b\") " pod="openshift-marketplace/redhat-operators-8lmnl" Nov 27 16:52:30 crc kubenswrapper[4954]: I1127 16:52:30.791991 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8lmnl" Nov 27 16:52:31 crc kubenswrapper[4954]: I1127 16:52:31.072027 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8lmnl"] Nov 27 16:52:31 crc kubenswrapper[4954]: I1127 16:52:31.483291 4954 generic.go:334] "Generic (PLEG): container finished" podID="0d3df697-c583-4edb-b446-47180f1c212b" containerID="cc04ef32b2c654e8a10b274c2925b5742def5ef3a82dcb8bc2d1519c92c119cf" exitCode=0 Nov 27 16:52:31 crc kubenswrapper[4954]: I1127 16:52:31.483408 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8lmnl" event={"ID":"0d3df697-c583-4edb-b446-47180f1c212b","Type":"ContainerDied","Data":"cc04ef32b2c654e8a10b274c2925b5742def5ef3a82dcb8bc2d1519c92c119cf"} Nov 27 16:52:31 crc kubenswrapper[4954]: I1127 16:52:31.483843 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8lmnl" event={"ID":"0d3df697-c583-4edb-b446-47180f1c212b","Type":"ContainerStarted","Data":"7e49a445cc6d80b32a674f7d338fc229edc02b374a30bb9e8bbc40e60c0333a6"} Nov 27 16:52:32 crc kubenswrapper[4954]: I1127 16:52:32.496526 4954 generic.go:334] "Generic (PLEG): container finished" podID="bc86b0e3-7ca2-40a1-b559-e74733db90f0" containerID="f9c3b66d776336462855d34ae093a13f150a65e88c7a135e3064487e83b9349e" exitCode=0 Nov 27 16:52:32 crc kubenswrapper[4954]: I1127 16:52:32.496700 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwh9gp" event={"ID":"bc86b0e3-7ca2-40a1-b559-e74733db90f0","Type":"ContainerDied","Data":"f9c3b66d776336462855d34ae093a13f150a65e88c7a135e3064487e83b9349e"} Nov 27 16:52:33 crc kubenswrapper[4954]: I1127 16:52:33.513917 4954 generic.go:334] "Generic (PLEG): container finished" podID="bc86b0e3-7ca2-40a1-b559-e74733db90f0" containerID="b81f4d1d83cd26a6b543e3012f786fa1de355a20bfef42401f2ea7b195909e4a" exitCode=0 Nov 27 16:52:33 crc kubenswrapper[4954]: I1127 16:52:33.514018 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwh9gp" event={"ID":"bc86b0e3-7ca2-40a1-b559-e74733db90f0","Type":"ContainerDied","Data":"b81f4d1d83cd26a6b543e3012f786fa1de355a20bfef42401f2ea7b195909e4a"} Nov 27 16:52:33 crc kubenswrapper[4954]: I1127 16:52:33.517626 4954 generic.go:334] "Generic (PLEG): container finished" podID="0d3df697-c583-4edb-b446-47180f1c212b" containerID="b91e90c850c36da0ac594f7777d65702f3581e645bbe49f252a4a450a76822a4" exitCode=0 Nov 27 16:52:33 crc kubenswrapper[4954]: I1127 16:52:33.517664 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8lmnl" event={"ID":"0d3df697-c583-4edb-b446-47180f1c212b","Type":"ContainerDied","Data":"b91e90c850c36da0ac594f7777d65702f3581e645bbe49f252a4a450a76822a4"} Nov 27 16:52:34 crc kubenswrapper[4954]: I1127 16:52:34.530525 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8lmnl" event={"ID":"0d3df697-c583-4edb-b446-47180f1c212b","Type":"ContainerStarted","Data":"46cf5964e1f42921c762f77fb7c8547515b877a2f11846199759a75b06dec77b"} Nov 27 16:52:34 crc kubenswrapper[4954]: I1127 16:52:34.569890 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8lmnl" podStartSLOduration=2.06353642 podStartE2EDuration="4.569860258s" podCreationTimestamp="2025-11-27 16:52:30 +0000 UTC" firstStartedPulling="2025-11-27 16:52:31.486108712 +0000 UTC m=+863.503549022" lastFinishedPulling="2025-11-27 16:52:33.99243256 +0000 UTC m=+866.009872860" observedRunningTime="2025-11-27 16:52:34.561414667 +0000 UTC m=+866.578854997" watchObservedRunningTime="2025-11-27 16:52:34.569860258 +0000 UTC m=+866.587300598" Nov 27 16:52:34 crc kubenswrapper[4954]: I1127 16:52:34.832454 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwh9gp" Nov 27 16:52:34 crc kubenswrapper[4954]: I1127 16:52:34.869023 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prn8q\" (UniqueName: \"kubernetes.io/projected/bc86b0e3-7ca2-40a1-b559-e74733db90f0-kube-api-access-prn8q\") pod \"bc86b0e3-7ca2-40a1-b559-e74733db90f0\" (UID: \"bc86b0e3-7ca2-40a1-b559-e74733db90f0\") " Nov 27 16:52:34 crc kubenswrapper[4954]: I1127 16:52:34.869189 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bc86b0e3-7ca2-40a1-b559-e74733db90f0-util\") pod \"bc86b0e3-7ca2-40a1-b559-e74733db90f0\" (UID: \"bc86b0e3-7ca2-40a1-b559-e74733db90f0\") " Nov 27 16:52:34 crc kubenswrapper[4954]: I1127 16:52:34.869217 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bc86b0e3-7ca2-40a1-b559-e74733db90f0-bundle\") pod \"bc86b0e3-7ca2-40a1-b559-e74733db90f0\" (UID: \"bc86b0e3-7ca2-40a1-b559-e74733db90f0\") " Nov 27 16:52:34 crc kubenswrapper[4954]: I1127 16:52:34.871179 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc86b0e3-7ca2-40a1-b559-e74733db90f0-bundle" (OuterVolumeSpecName: "bundle") pod "bc86b0e3-7ca2-40a1-b559-e74733db90f0" (UID: "bc86b0e3-7ca2-40a1-b559-e74733db90f0"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:52:34 crc kubenswrapper[4954]: I1127 16:52:34.879184 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc86b0e3-7ca2-40a1-b559-e74733db90f0-kube-api-access-prn8q" (OuterVolumeSpecName: "kube-api-access-prn8q") pod "bc86b0e3-7ca2-40a1-b559-e74733db90f0" (UID: "bc86b0e3-7ca2-40a1-b559-e74733db90f0"). InnerVolumeSpecName "kube-api-access-prn8q". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:52:34 crc kubenswrapper[4954]: I1127 16:52:34.886458 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc86b0e3-7ca2-40a1-b559-e74733db90f0-util" (OuterVolumeSpecName: "util") pod "bc86b0e3-7ca2-40a1-b559-e74733db90f0" (UID: "bc86b0e3-7ca2-40a1-b559-e74733db90f0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:52:34 crc kubenswrapper[4954]: I1127 16:52:34.971255 4954 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bc86b0e3-7ca2-40a1-b559-e74733db90f0-util\") on node \"crc\" DevicePath \"\"" Nov 27 16:52:34 crc kubenswrapper[4954]: I1127 16:52:34.971294 4954 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bc86b0e3-7ca2-40a1-b559-e74733db90f0-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:52:34 crc kubenswrapper[4954]: I1127 16:52:34.971304 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prn8q\" (UniqueName: \"kubernetes.io/projected/bc86b0e3-7ca2-40a1-b559-e74733db90f0-kube-api-access-prn8q\") on node \"crc\" DevicePath \"\"" Nov 27 16:52:35 crc kubenswrapper[4954]: I1127 16:52:35.540778 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwh9gp" Nov 27 16:52:35 crc kubenswrapper[4954]: I1127 16:52:35.540792 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwh9gp" event={"ID":"bc86b0e3-7ca2-40a1-b559-e74733db90f0","Type":"ContainerDied","Data":"4d886282301366c529b8a023f8cba60c3e9d7dc2d1b284f4e4cd37655b06f222"} Nov 27 16:52:35 crc kubenswrapper[4954]: I1127 16:52:35.541954 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d886282301366c529b8a023f8cba60c3e9d7dc2d1b284f4e4cd37655b06f222" Nov 27 16:52:37 crc kubenswrapper[4954]: I1127 16:52:37.900508 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-qx7sx"] Nov 27 16:52:37 crc kubenswrapper[4954]: E1127 16:52:37.901249 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc86b0e3-7ca2-40a1-b559-e74733db90f0" containerName="pull" Nov 27 16:52:37 crc kubenswrapper[4954]: I1127 16:52:37.901264 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc86b0e3-7ca2-40a1-b559-e74733db90f0" containerName="pull" Nov 27 16:52:37 crc kubenswrapper[4954]: E1127 16:52:37.901277 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc86b0e3-7ca2-40a1-b559-e74733db90f0" containerName="extract" Nov 27 16:52:37 crc kubenswrapper[4954]: I1127 16:52:37.901283 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc86b0e3-7ca2-40a1-b559-e74733db90f0" containerName="extract" Nov 27 16:52:37 crc kubenswrapper[4954]: E1127 16:52:37.901300 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc86b0e3-7ca2-40a1-b559-e74733db90f0" containerName="util" Nov 27 16:52:37 crc kubenswrapper[4954]: I1127 16:52:37.901306 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc86b0e3-7ca2-40a1-b559-e74733db90f0" containerName="util" Nov 27 16:52:37 crc kubenswrapper[4954]: I1127 16:52:37.901449 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc86b0e3-7ca2-40a1-b559-e74733db90f0" containerName="extract" Nov 27 16:52:37 crc kubenswrapper[4954]: I1127 16:52:37.901933 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-qx7sx" Nov 27 16:52:37 crc kubenswrapper[4954]: I1127 16:52:37.904507 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Nov 27 16:52:37 crc kubenswrapper[4954]: I1127 16:52:37.905022 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Nov 27 16:52:37 crc kubenswrapper[4954]: I1127 16:52:37.905038 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-84c7k" Nov 27 16:52:37 crc kubenswrapper[4954]: I1127 16:52:37.922810 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-qx7sx"] Nov 27 16:52:38 crc kubenswrapper[4954]: I1127 16:52:38.282853 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx8bl\" (UniqueName: \"kubernetes.io/projected/742e2266-3aa1-4c59-958e-8200fea0b45c-kube-api-access-kx8bl\") pod \"nmstate-operator-5b5b58f5c8-qx7sx\" (UID: \"742e2266-3aa1-4c59-958e-8200fea0b45c\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-qx7sx" Nov 27 16:52:38 crc kubenswrapper[4954]: I1127 16:52:38.384393 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx8bl\" (UniqueName: \"kubernetes.io/projected/742e2266-3aa1-4c59-958e-8200fea0b45c-kube-api-access-kx8bl\") pod \"nmstate-operator-5b5b58f5c8-qx7sx\" (UID: \"742e2266-3aa1-4c59-958e-8200fea0b45c\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-qx7sx" Nov 27 16:52:38 crc kubenswrapper[4954]: I1127 16:52:38.428424 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx8bl\" (UniqueName: \"kubernetes.io/projected/742e2266-3aa1-4c59-958e-8200fea0b45c-kube-api-access-kx8bl\") pod \"nmstate-operator-5b5b58f5c8-qx7sx\" (UID: \"742e2266-3aa1-4c59-958e-8200fea0b45c\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-qx7sx" Nov 27 16:52:38 crc kubenswrapper[4954]: I1127 16:52:38.519771 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-qx7sx" Nov 27 16:52:38 crc kubenswrapper[4954]: I1127 16:52:38.784109 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-qx7sx"] Nov 27 16:52:39 crc kubenswrapper[4954]: I1127 16:52:39.577690 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-qx7sx" event={"ID":"742e2266-3aa1-4c59-958e-8200fea0b45c","Type":"ContainerStarted","Data":"bc8396138273b99f34f055212fb8665471ae21d4e17ecd03890123e0092ff500"} Nov 27 16:52:40 crc kubenswrapper[4954]: I1127 16:52:40.793285 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8lmnl" Nov 27 16:52:40 crc kubenswrapper[4954]: I1127 16:52:40.793938 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8lmnl" Nov 27 16:52:41 crc kubenswrapper[4954]: I1127 16:52:41.607967 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-qx7sx" event={"ID":"742e2266-3aa1-4c59-958e-8200fea0b45c","Type":"ContainerStarted","Data":"c9817df99b795b50843b55de9c857033cd3deee32ba6437cac36201ca994e78c"} Nov 27 16:52:41 crc kubenswrapper[4954]: I1127 16:52:41.649628 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-qx7sx" podStartSLOduration=2.160383157 podStartE2EDuration="4.649599987s" podCreationTimestamp="2025-11-27 16:52:37 +0000 UTC" firstStartedPulling="2025-11-27 16:52:38.794567538 +0000 UTC m=+870.812007838" lastFinishedPulling="2025-11-27 16:52:41.283784368 +0000 UTC m=+873.301224668" observedRunningTime="2025-11-27 16:52:41.642961028 +0000 UTC m=+873.660401368" watchObservedRunningTime="2025-11-27 16:52:41.649599987 +0000 UTC m=+873.667040297" Nov 27 16:52:41 crc kubenswrapper[4954]: I1127 16:52:41.847474 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8lmnl" podUID="0d3df697-c583-4edb-b446-47180f1c212b" containerName="registry-server" probeResult="failure" output=< Nov 27 16:52:41 crc kubenswrapper[4954]: timeout: failed to connect service ":50051" within 1s Nov 27 16:52:41 crc kubenswrapper[4954]: > Nov 27 16:52:42 crc kubenswrapper[4954]: I1127 16:52:42.709013 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-bkn4s"] Nov 27 16:52:42 crc kubenswrapper[4954]: I1127 16:52:42.710276 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-bkn4s" Nov 27 16:52:42 crc kubenswrapper[4954]: I1127 16:52:42.712377 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-cvv2m" Nov 27 16:52:42 crc kubenswrapper[4954]: I1127 16:52:42.745298 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-89lrn"] Nov 27 16:52:42 crc kubenswrapper[4954]: I1127 16:52:42.751489 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-89lrn" Nov 27 16:52:42 crc kubenswrapper[4954]: I1127 16:52:42.755799 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Nov 27 16:52:42 crc kubenswrapper[4954]: I1127 16:52:42.758534 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-m4dwz"] Nov 27 16:52:42 crc kubenswrapper[4954]: I1127 16:52:42.759485 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-m4dwz" Nov 27 16:52:42 crc kubenswrapper[4954]: I1127 16:52:42.779294 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-89lrn"] Nov 27 16:52:42 crc kubenswrapper[4954]: I1127 16:52:42.790375 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-bkn4s"] Nov 27 16:52:42 crc kubenswrapper[4954]: I1127 16:52:42.847777 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dbvr\" (UniqueName: \"kubernetes.io/projected/e9b96f60-bef6-430b-8f44-d5e602d140ee-kube-api-access-7dbvr\") pod \"nmstate-metrics-7f946cbc9-bkn4s\" (UID: \"e9b96f60-bef6-430b-8f44-d5e602d140ee\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-bkn4s" Nov 27 16:52:42 crc kubenswrapper[4954]: I1127 16:52:42.889092 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-g5fhs"] Nov 27 16:52:42 crc kubenswrapper[4954]: I1127 16:52:42.889899 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-g5fhs" Nov 27 16:52:42 crc kubenswrapper[4954]: I1127 16:52:42.891827 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Nov 27 16:52:42 crc kubenswrapper[4954]: I1127 16:52:42.892060 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-b44vv" Nov 27 16:52:42 crc kubenswrapper[4954]: I1127 16:52:42.892733 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Nov 27 16:52:42 crc kubenswrapper[4954]: I1127 16:52:42.911659 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-g5fhs"] Nov 27 16:52:42 crc kubenswrapper[4954]: I1127 16:52:42.948933 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/80ecd4a6-6bf2-4533-ab69-a5a12b747d81-ovs-socket\") pod \"nmstate-handler-m4dwz\" (UID: \"80ecd4a6-6bf2-4533-ab69-a5a12b747d81\") " pod="openshift-nmstate/nmstate-handler-m4dwz" Nov 27 16:52:42 crc kubenswrapper[4954]: I1127 16:52:42.949002 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/80ecd4a6-6bf2-4533-ab69-a5a12b747d81-dbus-socket\") pod \"nmstate-handler-m4dwz\" (UID: \"80ecd4a6-6bf2-4533-ab69-a5a12b747d81\") " pod="openshift-nmstate/nmstate-handler-m4dwz" Nov 27 16:52:42 crc kubenswrapper[4954]: I1127 16:52:42.949029 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hm94\" (UniqueName: \"kubernetes.io/projected/391ad61e-fdf4-41bf-b3eb-a8950896debb-kube-api-access-5hm94\") 
pod \"nmstate-webhook-5f6d4c5ccb-89lrn\" (UID: \"391ad61e-fdf4-41bf-b3eb-a8950896debb\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-89lrn" Nov 27 16:52:42 crc kubenswrapper[4954]: I1127 16:52:42.949053 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/391ad61e-fdf4-41bf-b3eb-a8950896debb-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-89lrn\" (UID: \"391ad61e-fdf4-41bf-b3eb-a8950896debb\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-89lrn" Nov 27 16:52:42 crc kubenswrapper[4954]: I1127 16:52:42.949109 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dbvr\" (UniqueName: \"kubernetes.io/projected/e9b96f60-bef6-430b-8f44-d5e602d140ee-kube-api-access-7dbvr\") pod \"nmstate-metrics-7f946cbc9-bkn4s\" (UID: \"e9b96f60-bef6-430b-8f44-d5e602d140ee\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-bkn4s" Nov 27 16:52:42 crc kubenswrapper[4954]: I1127 16:52:42.949309 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/80ecd4a6-6bf2-4533-ab69-a5a12b747d81-nmstate-lock\") pod \"nmstate-handler-m4dwz\" (UID: \"80ecd4a6-6bf2-4533-ab69-a5a12b747d81\") " pod="openshift-nmstate/nmstate-handler-m4dwz" Nov 27 16:52:42 crc kubenswrapper[4954]: I1127 16:52:42.949427 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvtkt\" (UniqueName: \"kubernetes.io/projected/80ecd4a6-6bf2-4533-ab69-a5a12b747d81-kube-api-access-zvtkt\") pod \"nmstate-handler-m4dwz\" (UID: \"80ecd4a6-6bf2-4533-ab69-a5a12b747d81\") " pod="openshift-nmstate/nmstate-handler-m4dwz" Nov 27 16:52:42 crc kubenswrapper[4954]: I1127 16:52:42.979144 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dbvr\" (UniqueName: \"kubernetes.io/projected/e9b96f60-bef6-430b-8f44-d5e602d140ee-kube-api-access-7dbvr\") pod \"nmstate-metrics-7f946cbc9-bkn4s\" (UID: \"e9b96f60-bef6-430b-8f44-d5e602d140ee\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-bkn4s" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.038652 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-bkn4s" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.050525 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhfg4\" (UniqueName: \"kubernetes.io/projected/88437c38-051a-4331-bfd9-1b5356e88818-kube-api-access-zhfg4\") pod \"nmstate-console-plugin-7fbb5f6569-g5fhs\" (UID: \"88437c38-051a-4331-bfd9-1b5356e88818\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-g5fhs" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.050611 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/88437c38-051a-4331-bfd9-1b5356e88818-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-g5fhs\" (UID: \"88437c38-051a-4331-bfd9-1b5356e88818\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-g5fhs" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.050680 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/80ecd4a6-6bf2-4533-ab69-a5a12b747d81-nmstate-lock\") pod \"nmstate-handler-m4dwz\" (UID: \"80ecd4a6-6bf2-4533-ab69-a5a12b747d81\") " pod="openshift-nmstate/nmstate-handler-m4dwz" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.050841 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/80ecd4a6-6bf2-4533-ab69-a5a12b747d81-nmstate-lock\") pod \"nmstate-handler-m4dwz\" (UID: \"80ecd4a6-6bf2-4533-ab69-a5a12b747d81\") " pod="openshift-nmstate/nmstate-handler-m4dwz" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.050945 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvtkt\" (UniqueName: \"kubernetes.io/projected/80ecd4a6-6bf2-4533-ab69-a5a12b747d81-kube-api-access-zvtkt\") pod \"nmstate-handler-m4dwz\" (UID: \"80ecd4a6-6bf2-4533-ab69-a5a12b747d81\") " pod="openshift-nmstate/nmstate-handler-m4dwz" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.051005 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/88437c38-051a-4331-bfd9-1b5356e88818-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-g5fhs\" (UID: \"88437c38-051a-4331-bfd9-1b5356e88818\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-g5fhs" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.051157 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/80ecd4a6-6bf2-4533-ab69-a5a12b747d81-ovs-socket\") pod \"nmstate-handler-m4dwz\" (UID: \"80ecd4a6-6bf2-4533-ab69-a5a12b747d81\") " pod="openshift-nmstate/nmstate-handler-m4dwz" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.051238 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/80ecd4a6-6bf2-4533-ab69-a5a12b747d81-ovs-socket\") pod \"nmstate-handler-m4dwz\" (UID: \"80ecd4a6-6bf2-4533-ab69-a5a12b747d81\") " pod="openshift-nmstate/nmstate-handler-m4dwz" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.051301 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/80ecd4a6-6bf2-4533-ab69-a5a12b747d81-dbus-socket\") pod 
\"nmstate-handler-m4dwz\" (UID: \"80ecd4a6-6bf2-4533-ab69-a5a12b747d81\") " pod="openshift-nmstate/nmstate-handler-m4dwz" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.051357 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hm94\" (UniqueName: \"kubernetes.io/projected/391ad61e-fdf4-41bf-b3eb-a8950896debb-kube-api-access-5hm94\") pod \"nmstate-webhook-5f6d4c5ccb-89lrn\" (UID: \"391ad61e-fdf4-41bf-b3eb-a8950896debb\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-89lrn" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.051404 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/391ad61e-fdf4-41bf-b3eb-a8950896debb-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-89lrn\" (UID: \"391ad61e-fdf4-41bf-b3eb-a8950896debb\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-89lrn" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.051730 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/80ecd4a6-6bf2-4533-ab69-a5a12b747d81-dbus-socket\") pod \"nmstate-handler-m4dwz\" (UID: \"80ecd4a6-6bf2-4533-ab69-a5a12b747d81\") " pod="openshift-nmstate/nmstate-handler-m4dwz" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.056961 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/391ad61e-fdf4-41bf-b3eb-a8950896debb-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-89lrn\" (UID: \"391ad61e-fdf4-41bf-b3eb-a8950896debb\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-89lrn" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.076180 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvtkt\" (UniqueName: \"kubernetes.io/projected/80ecd4a6-6bf2-4533-ab69-a5a12b747d81-kube-api-access-zvtkt\") pod \"nmstate-handler-m4dwz\" (UID: \"80ecd4a6-6bf2-4533-ab69-a5a12b747d81\") " pod="openshift-nmstate/nmstate-handler-m4dwz" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.096425 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hm94\" (UniqueName: \"kubernetes.io/projected/391ad61e-fdf4-41bf-b3eb-a8950896debb-kube-api-access-5hm94\") pod \"nmstate-webhook-5f6d4c5ccb-89lrn\" (UID: \"391ad61e-fdf4-41bf-b3eb-a8950896debb\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-89lrn" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.096891 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-m4dwz" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.099064 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7c589d8dc4-s99r2"] Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.100669 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7c589d8dc4-s99r2" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.122228 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7c589d8dc4-s99r2"] Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.152555 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhfg4\" (UniqueName: \"kubernetes.io/projected/88437c38-051a-4331-bfd9-1b5356e88818-kube-api-access-zhfg4\") pod \"nmstate-console-plugin-7fbb5f6569-g5fhs\" (UID: \"88437c38-051a-4331-bfd9-1b5356e88818\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-g5fhs" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.152739 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/88437c38-051a-4331-bfd9-1b5356e88818-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-g5fhs\" (UID: \"88437c38-051a-4331-bfd9-1b5356e88818\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-g5fhs" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.153189 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/88437c38-051a-4331-bfd9-1b5356e88818-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-g5fhs\" (UID: \"88437c38-051a-4331-bfd9-1b5356e88818\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-g5fhs" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.154011 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/88437c38-051a-4331-bfd9-1b5356e88818-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-g5fhs\" (UID: \"88437c38-051a-4331-bfd9-1b5356e88818\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-g5fhs" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.157677 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/88437c38-051a-4331-bfd9-1b5356e88818-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-g5fhs\" (UID: \"88437c38-051a-4331-bfd9-1b5356e88818\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-g5fhs" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.174808 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhfg4\" (UniqueName: \"kubernetes.io/projected/88437c38-051a-4331-bfd9-1b5356e88818-kube-api-access-zhfg4\") pod \"nmstate-console-plugin-7fbb5f6569-g5fhs\" (UID: \"88437c38-051a-4331-bfd9-1b5356e88818\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-g5fhs" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.203218 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-g5fhs" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.256388 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb4befa9-9398-4e32-835e-3e4d5b363d5a-trusted-ca-bundle\") pod \"console-7c589d8dc4-s99r2\" (UID: \"eb4befa9-9398-4e32-835e-3e4d5b363d5a\") " pod="openshift-console/console-7c589d8dc4-s99r2" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.256467 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eb4befa9-9398-4e32-835e-3e4d5b363d5a-console-oauth-config\") pod \"console-7c589d8dc4-s99r2\" (UID: \"eb4befa9-9398-4e32-835e-3e4d5b363d5a\") " pod="openshift-console/console-7c589d8dc4-s99r2" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.256570 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eb4befa9-9398-4e32-835e-3e4d5b363d5a-service-ca\") pod \"console-7c589d8dc4-s99r2\" (UID: \"eb4befa9-9398-4e32-835e-3e4d5b363d5a\") " pod="openshift-console/console-7c589d8dc4-s99r2" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.256628 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eb4befa9-9398-4e32-835e-3e4d5b363d5a-oauth-serving-cert\") pod \"console-7c589d8dc4-s99r2\" (UID: \"eb4befa9-9398-4e32-835e-3e4d5b363d5a\") " pod="openshift-console/console-7c589d8dc4-s99r2" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.257730 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwrmn\" (UniqueName: \"kubernetes.io/projected/eb4befa9-9398-4e32-835e-3e4d5b363d5a-kube-api-access-hwrmn\") pod \"console-7c589d8dc4-s99r2\" (UID: \"eb4befa9-9398-4e32-835e-3e4d5b363d5a\") " pod="openshift-console/console-7c589d8dc4-s99r2" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.257831 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eb4befa9-9398-4e32-835e-3e4d5b363d5a-console-serving-cert\") pod \"console-7c589d8dc4-s99r2\" (UID: \"eb4befa9-9398-4e32-835e-3e4d5b363d5a\") " pod="openshift-console/console-7c589d8dc4-s99r2" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.257970 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eb4befa9-9398-4e32-835e-3e4d5b363d5a-console-config\") pod \"console-7c589d8dc4-s99r2\" (UID: \"eb4befa9-9398-4e32-835e-3e4d5b363d5a\") " pod="openshift-console/console-7c589d8dc4-s99r2" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.317322 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-bkn4s"] Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.360273 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eb4befa9-9398-4e32-835e-3e4d5b363d5a-console-serving-cert\") pod \"console-7c589d8dc4-s99r2\" (UID: \"eb4befa9-9398-4e32-835e-3e4d5b363d5a\") " 
pod="openshift-console/console-7c589d8dc4-s99r2" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.360362 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eb4befa9-9398-4e32-835e-3e4d5b363d5a-console-config\") pod \"console-7c589d8dc4-s99r2\" (UID: \"eb4befa9-9398-4e32-835e-3e4d5b363d5a\") " pod="openshift-console/console-7c589d8dc4-s99r2" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.360402 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb4befa9-9398-4e32-835e-3e4d5b363d5a-trusted-ca-bundle\") pod \"console-7c589d8dc4-s99r2\" (UID: \"eb4befa9-9398-4e32-835e-3e4d5b363d5a\") " pod="openshift-console/console-7c589d8dc4-s99r2" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.360433 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eb4befa9-9398-4e32-835e-3e4d5b363d5a-console-oauth-config\") pod \"console-7c589d8dc4-s99r2\" (UID: \"eb4befa9-9398-4e32-835e-3e4d5b363d5a\") " pod="openshift-console/console-7c589d8dc4-s99r2" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.360470 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eb4befa9-9398-4e32-835e-3e4d5b363d5a-service-ca\") pod \"console-7c589d8dc4-s99r2\" (UID: \"eb4befa9-9398-4e32-835e-3e4d5b363d5a\") " pod="openshift-console/console-7c589d8dc4-s99r2" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.361716 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eb4befa9-9398-4e32-835e-3e4d5b363d5a-oauth-serving-cert\") pod \"console-7c589d8dc4-s99r2\" (UID: \"eb4befa9-9398-4e32-835e-3e4d5b363d5a\") " pod="openshift-console/console-7c589d8dc4-s99r2" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.361743 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwrmn\" (UniqueName: \"kubernetes.io/projected/eb4befa9-9398-4e32-835e-3e4d5b363d5a-kube-api-access-hwrmn\") pod \"console-7c589d8dc4-s99r2\" (UID: \"eb4befa9-9398-4e32-835e-3e4d5b363d5a\") " pod="openshift-console/console-7c589d8dc4-s99r2" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.361534 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eb4befa9-9398-4e32-835e-3e4d5b363d5a-console-config\") pod \"console-7c589d8dc4-s99r2\" (UID: \"eb4befa9-9398-4e32-835e-3e4d5b363d5a\") " pod="openshift-console/console-7c589d8dc4-s99r2" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.362317 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eb4befa9-9398-4e32-835e-3e4d5b363d5a-oauth-serving-cert\") pod \"console-7c589d8dc4-s99r2\" (UID: \"eb4befa9-9398-4e32-835e-3e4d5b363d5a\") " pod="openshift-console/console-7c589d8dc4-s99r2" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.363057 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eb4befa9-9398-4e32-835e-3e4d5b363d5a-service-ca\") pod \"console-7c589d8dc4-s99r2\" (UID: \"eb4befa9-9398-4e32-835e-3e4d5b363d5a\") " pod="openshift-console/console-7c589d8dc4-s99r2" 
Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.364076 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb4befa9-9398-4e32-835e-3e4d5b363d5a-trusted-ca-bundle\") pod \"console-7c589d8dc4-s99r2\" (UID: \"eb4befa9-9398-4e32-835e-3e4d5b363d5a\") " pod="openshift-console/console-7c589d8dc4-s99r2" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.365256 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eb4befa9-9398-4e32-835e-3e4d5b363d5a-console-serving-cert\") pod \"console-7c589d8dc4-s99r2\" (UID: \"eb4befa9-9398-4e32-835e-3e4d5b363d5a\") " pod="openshift-console/console-7c589d8dc4-s99r2" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.365552 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eb4befa9-9398-4e32-835e-3e4d5b363d5a-console-oauth-config\") pod \"console-7c589d8dc4-s99r2\" (UID: \"eb4befa9-9398-4e32-835e-3e4d5b363d5a\") " pod="openshift-console/console-7c589d8dc4-s99r2" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.380306 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwrmn\" (UniqueName: \"kubernetes.io/projected/eb4befa9-9398-4e32-835e-3e4d5b363d5a-kube-api-access-hwrmn\") pod \"console-7c589d8dc4-s99r2\" (UID: \"eb4befa9-9398-4e32-835e-3e4d5b363d5a\") " pod="openshift-console/console-7c589d8dc4-s99r2" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.381345 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-89lrn" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.440727 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7c589d8dc4-s99r2" Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.483197 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-g5fhs"] Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.615891 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-89lrn"] Nov 27 16:52:43 crc kubenswrapper[4954]: W1127 16:52:43.621631 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod391ad61e_fdf4_41bf_b3eb_a8950896debb.slice/crio-6282b150d725afff059cae1b9bb4f87ec2442fc5c418a5631f7b0d8a37287ce1 WatchSource:0}: Error finding container 6282b150d725afff059cae1b9bb4f87ec2442fc5c418a5631f7b0d8a37287ce1: Status 404 returned error can't find the container with id 6282b150d725afff059cae1b9bb4f87ec2442fc5c418a5631f7b0d8a37287ce1 Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.623502 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-g5fhs" event={"ID":"88437c38-051a-4331-bfd9-1b5356e88818","Type":"ContainerStarted","Data":"a9d8067839a4d8fc8bc25f4f383470ca085ec91e39e879eab74c646b362eaf44"} Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.625991 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-bkn4s" event={"ID":"e9b96f60-bef6-430b-8f44-d5e602d140ee","Type":"ContainerStarted","Data":"51eb12690eacb08311490df38a3713258b4ffa9785f4fc9f0824662735ad30b0"} Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.628069 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-m4dwz" event={"ID":"80ecd4a6-6bf2-4533-ab69-a5a12b747d81","Type":"ContainerStarted","Data":"808110b849e52f2b3528df988bc027043ae0c5dee347cf9a72fd541216c47792"} Nov 27 16:52:43 crc kubenswrapper[4954]: I1127 16:52:43.736153 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7c589d8dc4-s99r2"] Nov 27 16:52:43 crc kubenswrapper[4954]: W1127 16:52:43.740750 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb4befa9_9398_4e32_835e_3e4d5b363d5a.slice/crio-c038c73cfab11a70fda98d42a90526e20bd7e8734a1a54257885c750a576ef37 WatchSource:0}: Error finding container c038c73cfab11a70fda98d42a90526e20bd7e8734a1a54257885c750a576ef37: Status 404 returned error can't find the container with id c038c73cfab11a70fda98d42a90526e20bd7e8734a1a54257885c750a576ef37 Nov 27 16:52:44 crc kubenswrapper[4954]: I1127 16:52:44.639146 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c589d8dc4-s99r2" event={"ID":"eb4befa9-9398-4e32-835e-3e4d5b363d5a","Type":"ContainerStarted","Data":"b3034070603d0a817247ca7a2e6c9bba93d05f0968df81244f171f6f5729361d"} Nov 27 16:52:44 crc kubenswrapper[4954]: I1127 16:52:44.639729 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c589d8dc4-s99r2" event={"ID":"eb4befa9-9398-4e32-835e-3e4d5b363d5a","Type":"ContainerStarted","Data":"c038c73cfab11a70fda98d42a90526e20bd7e8734a1a54257885c750a576ef37"} Nov 27 16:52:44 crc kubenswrapper[4954]: I1127 16:52:44.641649 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-89lrn" 
event={"ID":"391ad61e-fdf4-41bf-b3eb-a8950896debb","Type":"ContainerStarted","Data":"6282b150d725afff059cae1b9bb4f87ec2442fc5c418a5631f7b0d8a37287ce1"} Nov 27 16:52:44 crc kubenswrapper[4954]: I1127 16:52:44.672700 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7c589d8dc4-s99r2" podStartSLOduration=1.672671703 podStartE2EDuration="1.672671703s" podCreationTimestamp="2025-11-27 16:52:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:52:44.670802809 +0000 UTC m=+876.688243149" watchObservedRunningTime="2025-11-27 16:52:44.672671703 +0000 UTC m=+876.690112043" Nov 27 16:52:47 crc kubenswrapper[4954]: I1127 16:52:47.676673 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-m4dwz" event={"ID":"80ecd4a6-6bf2-4533-ab69-a5a12b747d81","Type":"ContainerStarted","Data":"ba29f07f065603c6543cffefc12939c8eec07bb90efdc9a68eb560ef47b8e526"} Nov 27 16:52:47 crc kubenswrapper[4954]: I1127 16:52:47.677619 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-m4dwz" Nov 27 16:52:47 crc kubenswrapper[4954]: I1127 16:52:47.681933 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-89lrn" event={"ID":"391ad61e-fdf4-41bf-b3eb-a8950896debb","Type":"ContainerStarted","Data":"4671f94884d71de51df165a07716765b37086e6f08ec5d84faca3d3405749b84"} Nov 27 16:52:47 crc kubenswrapper[4954]: I1127 16:52:47.682230 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-89lrn" Nov 27 16:52:47 crc kubenswrapper[4954]: I1127 16:52:47.685409 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-g5fhs" event={"ID":"88437c38-051a-4331-bfd9-1b5356e88818","Type":"ContainerStarted","Data":"076690b39a873644f84dacc9de777d071c9876d4ce7659e827b57572f095f481"} Nov 27 16:52:47 crc kubenswrapper[4954]: I1127 16:52:47.691751 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-bkn4s" event={"ID":"e9b96f60-bef6-430b-8f44-d5e602d140ee","Type":"ContainerStarted","Data":"450e114b5b0606ecbaf87281001d8151d017e963ae9da62b5694e695207af4db"} Nov 27 16:52:47 crc kubenswrapper[4954]: I1127 16:52:47.712179 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-m4dwz" podStartSLOduration=2.165366202 podStartE2EDuration="5.712142644s" podCreationTimestamp="2025-11-27 16:52:42 +0000 UTC" firstStartedPulling="2025-11-27 16:52:43.153519993 +0000 UTC m=+875.170960293" lastFinishedPulling="2025-11-27 16:52:46.700296395 +0000 UTC m=+878.717736735" observedRunningTime="2025-11-27 16:52:47.701169376 +0000 UTC m=+879.718609676" watchObservedRunningTime="2025-11-27 16:52:47.712142644 +0000 UTC m=+879.729582974" Nov 27 16:52:47 crc kubenswrapper[4954]: I1127 16:52:47.734415 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-g5fhs" podStartSLOduration=2.590345598 podStartE2EDuration="5.73438923s" podCreationTimestamp="2025-11-27 16:52:42 +0000 UTC" firstStartedPulling="2025-11-27 16:52:43.509485383 +0000 UTC m=+875.526925683" lastFinishedPulling="2025-11-27 16:52:46.653528975 +0000 UTC m=+878.670969315" observedRunningTime="2025-11-27 16:52:47.724542771 +0000 UTC 
m=+879.741983111" watchObservedRunningTime="2025-11-27 16:52:47.73438923 +0000 UTC m=+879.751829530" Nov 27 16:52:47 crc kubenswrapper[4954]: I1127 16:52:47.760130 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-89lrn" podStartSLOduration=2.720025154 podStartE2EDuration="5.760106566s" podCreationTimestamp="2025-11-27 16:52:42 +0000 UTC" firstStartedPulling="2025-11-27 16:52:43.624296567 +0000 UTC m=+875.641736867" lastFinishedPulling="2025-11-27 16:52:46.664377979 +0000 UTC m=+878.681818279" observedRunningTime="2025-11-27 16:52:47.747945668 +0000 UTC m=+879.765386028" watchObservedRunningTime="2025-11-27 16:52:47.760106566 +0000 UTC m=+879.777546866" Nov 27 16:52:49 crc kubenswrapper[4954]: I1127 16:52:49.712013 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-bkn4s" event={"ID":"e9b96f60-bef6-430b-8f44-d5e602d140ee","Type":"ContainerStarted","Data":"0b5b861a3ae6378a3c4e676fbf12cf54d8b2d37a9b036ad5bbfe3e1ee67780cc"} Nov 27 16:52:49 crc kubenswrapper[4954]: I1127 16:52:49.751607 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-bkn4s" podStartSLOduration=1.557287447 podStartE2EDuration="7.751550584s" podCreationTimestamp="2025-11-27 16:52:42 +0000 UTC" firstStartedPulling="2025-11-27 16:52:43.32676625 +0000 UTC m=+875.344206550" lastFinishedPulling="2025-11-27 16:52:49.521029377 +0000 UTC m=+881.538469687" observedRunningTime="2025-11-27 16:52:49.734879848 +0000 UTC m=+881.752320228" watchObservedRunningTime="2025-11-27 16:52:49.751550584 +0000 UTC m=+881.768990924" Nov 27 16:52:50 crc kubenswrapper[4954]: I1127 16:52:50.877762 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8lmnl" Nov 27 16:52:50 crc kubenswrapper[4954]: I1127 16:52:50.945556 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8lmnl" Nov 27 16:52:51 crc kubenswrapper[4954]: I1127 16:52:51.127507 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8lmnl"] Nov 27 16:52:52 crc kubenswrapper[4954]: I1127 16:52:52.737432 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8lmnl" podUID="0d3df697-c583-4edb-b446-47180f1c212b" containerName="registry-server" containerID="cri-o://46cf5964e1f42921c762f77fb7c8547515b877a2f11846199759a75b06dec77b" gracePeriod=2 Nov 27 16:52:53 crc kubenswrapper[4954]: I1127 16:52:53.137653 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-m4dwz" Nov 27 16:52:53 crc kubenswrapper[4954]: I1127 16:52:53.219273 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8lmnl" Nov 27 16:52:53 crc kubenswrapper[4954]: I1127 16:52:53.366407 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wjv9\" (UniqueName: \"kubernetes.io/projected/0d3df697-c583-4edb-b446-47180f1c212b-kube-api-access-4wjv9\") pod \"0d3df697-c583-4edb-b446-47180f1c212b\" (UID: \"0d3df697-c583-4edb-b446-47180f1c212b\") " Nov 27 16:52:53 crc kubenswrapper[4954]: I1127 16:52:53.366481 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d3df697-c583-4edb-b446-47180f1c212b-utilities\") pod \"0d3df697-c583-4edb-b446-47180f1c212b\" (UID: \"0d3df697-c583-4edb-b446-47180f1c212b\") " Nov 27 16:52:53 crc kubenswrapper[4954]: I1127 16:52:53.366510 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d3df697-c583-4edb-b446-47180f1c212b-catalog-content\") pod \"0d3df697-c583-4edb-b446-47180f1c212b\" (UID: \"0d3df697-c583-4edb-b446-47180f1c212b\") " Nov 27 16:52:53 crc kubenswrapper[4954]: I1127 16:52:53.368049 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d3df697-c583-4edb-b446-47180f1c212b-utilities" (OuterVolumeSpecName: "utilities") pod "0d3df697-c583-4edb-b446-47180f1c212b" (UID: "0d3df697-c583-4edb-b446-47180f1c212b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:52:53 crc kubenswrapper[4954]: I1127 16:52:53.372372 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d3df697-c583-4edb-b446-47180f1c212b-kube-api-access-4wjv9" (OuterVolumeSpecName: "kube-api-access-4wjv9") pod "0d3df697-c583-4edb-b446-47180f1c212b" (UID: "0d3df697-c583-4edb-b446-47180f1c212b"). InnerVolumeSpecName "kube-api-access-4wjv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:52:53 crc kubenswrapper[4954]: I1127 16:52:53.441414 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7c589d8dc4-s99r2" Nov 27 16:52:53 crc kubenswrapper[4954]: I1127 16:52:53.441477 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7c589d8dc4-s99r2" Nov 27 16:52:53 crc kubenswrapper[4954]: I1127 16:52:53.446673 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7c589d8dc4-s99r2" Nov 27 16:52:53 crc kubenswrapper[4954]: I1127 16:52:53.461528 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d3df697-c583-4edb-b446-47180f1c212b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0d3df697-c583-4edb-b446-47180f1c212b" (UID: "0d3df697-c583-4edb-b446-47180f1c212b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:52:53 crc kubenswrapper[4954]: I1127 16:52:53.468261 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wjv9\" (UniqueName: \"kubernetes.io/projected/0d3df697-c583-4edb-b446-47180f1c212b-kube-api-access-4wjv9\") on node \"crc\" DevicePath \"\"" Nov 27 16:52:53 crc kubenswrapper[4954]: I1127 16:52:53.468295 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d3df697-c583-4edb-b446-47180f1c212b-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 16:52:53 crc kubenswrapper[4954]: I1127 16:52:53.468306 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d3df697-c583-4edb-b446-47180f1c212b-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 16:52:53 crc kubenswrapper[4954]: I1127 16:52:53.748349 4954 generic.go:334] "Generic (PLEG): container finished" podID="0d3df697-c583-4edb-b446-47180f1c212b" containerID="46cf5964e1f42921c762f77fb7c8547515b877a2f11846199759a75b06dec77b" exitCode=0 Nov 27 16:52:53 crc kubenswrapper[4954]: I1127 16:52:53.748467 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8lmnl" event={"ID":"0d3df697-c583-4edb-b446-47180f1c212b","Type":"ContainerDied","Data":"46cf5964e1f42921c762f77fb7c8547515b877a2f11846199759a75b06dec77b"} Nov 27 16:52:53 crc kubenswrapper[4954]: I1127 16:52:53.748490 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8lmnl" Nov 27 16:52:53 crc kubenswrapper[4954]: I1127 16:52:53.748545 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8lmnl" event={"ID":"0d3df697-c583-4edb-b446-47180f1c212b","Type":"ContainerDied","Data":"7e49a445cc6d80b32a674f7d338fc229edc02b374a30bb9e8bbc40e60c0333a6"} Nov 27 16:52:53 crc kubenswrapper[4954]: I1127 16:52:53.748613 4954 scope.go:117] "RemoveContainer" containerID="46cf5964e1f42921c762f77fb7c8547515b877a2f11846199759a75b06dec77b" Nov 27 16:52:53 crc kubenswrapper[4954]: I1127 16:52:53.753407 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7c589d8dc4-s99r2" Nov 27 16:52:53 crc kubenswrapper[4954]: I1127 16:52:53.776823 4954 scope.go:117] "RemoveContainer" containerID="b91e90c850c36da0ac594f7777d65702f3581e645bbe49f252a4a450a76822a4" Nov 27 16:52:53 crc kubenswrapper[4954]: I1127 16:52:53.832519 4954 scope.go:117] "RemoveContainer" containerID="cc04ef32b2c654e8a10b274c2925b5742def5ef3a82dcb8bc2d1519c92c119cf" Nov 27 16:52:53 crc kubenswrapper[4954]: I1127 16:52:53.843360 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8lmnl"] Nov 27 16:52:53 crc kubenswrapper[4954]: I1127 16:52:53.851121 4954 scope.go:117] "RemoveContainer" containerID="46cf5964e1f42921c762f77fb7c8547515b877a2f11846199759a75b06dec77b" Nov 27 16:52:53 crc kubenswrapper[4954]: E1127 16:52:53.851594 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46cf5964e1f42921c762f77fb7c8547515b877a2f11846199759a75b06dec77b\": container with ID starting with 46cf5964e1f42921c762f77fb7c8547515b877a2f11846199759a75b06dec77b not found: ID does not exist" containerID="46cf5964e1f42921c762f77fb7c8547515b877a2f11846199759a75b06dec77b" Nov 27 16:52:53 crc kubenswrapper[4954]: I1127 16:52:53.851651 
4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46cf5964e1f42921c762f77fb7c8547515b877a2f11846199759a75b06dec77b"} err="failed to get container status \"46cf5964e1f42921c762f77fb7c8547515b877a2f11846199759a75b06dec77b\": rpc error: code = NotFound desc = could not find container \"46cf5964e1f42921c762f77fb7c8547515b877a2f11846199759a75b06dec77b\": container with ID starting with 46cf5964e1f42921c762f77fb7c8547515b877a2f11846199759a75b06dec77b not found: ID does not exist" Nov 27 16:52:53 crc kubenswrapper[4954]: I1127 16:52:53.851684 4954 scope.go:117] "RemoveContainer" containerID="b91e90c850c36da0ac594f7777d65702f3581e645bbe49f252a4a450a76822a4" Nov 27 16:52:53 crc kubenswrapper[4954]: E1127 16:52:53.851942 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b91e90c850c36da0ac594f7777d65702f3581e645bbe49f252a4a450a76822a4\": container with ID starting with b91e90c850c36da0ac594f7777d65702f3581e645bbe49f252a4a450a76822a4 not found: ID does not exist" containerID="b91e90c850c36da0ac594f7777d65702f3581e645bbe49f252a4a450a76822a4" Nov 27 16:52:53 crc kubenswrapper[4954]: I1127 16:52:53.851972 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b91e90c850c36da0ac594f7777d65702f3581e645bbe49f252a4a450a76822a4"} err="failed to get container status \"b91e90c850c36da0ac594f7777d65702f3581e645bbe49f252a4a450a76822a4\": rpc error: code = NotFound desc = could not find container \"b91e90c850c36da0ac594f7777d65702f3581e645bbe49f252a4a450a76822a4\": container with ID starting with b91e90c850c36da0ac594f7777d65702f3581e645bbe49f252a4a450a76822a4 not found: ID does not exist" Nov 27 16:52:53 crc kubenswrapper[4954]: I1127 16:52:53.851990 4954 scope.go:117] "RemoveContainer" containerID="cc04ef32b2c654e8a10b274c2925b5742def5ef3a82dcb8bc2d1519c92c119cf" Nov 27 16:52:53 crc kubenswrapper[4954]: E1127 16:52:53.852365 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc04ef32b2c654e8a10b274c2925b5742def5ef3a82dcb8bc2d1519c92c119cf\": container with ID starting with cc04ef32b2c654e8a10b274c2925b5742def5ef3a82dcb8bc2d1519c92c119cf not found: ID does not exist" containerID="cc04ef32b2c654e8a10b274c2925b5742def5ef3a82dcb8bc2d1519c92c119cf" Nov 27 16:52:53 crc kubenswrapper[4954]: I1127 16:52:53.852420 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc04ef32b2c654e8a10b274c2925b5742def5ef3a82dcb8bc2d1519c92c119cf"} err="failed to get container status \"cc04ef32b2c654e8a10b274c2925b5742def5ef3a82dcb8bc2d1519c92c119cf\": rpc error: code = NotFound desc = could not find container \"cc04ef32b2c654e8a10b274c2925b5742def5ef3a82dcb8bc2d1519c92c119cf\": container with ID starting with cc04ef32b2c654e8a10b274c2925b5742def5ef3a82dcb8bc2d1519c92c119cf not found: ID does not exist" Nov 27 16:52:53 crc kubenswrapper[4954]: I1127 16:52:53.857391 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8lmnl"] Nov 27 16:52:53 crc kubenswrapper[4954]: I1127 16:52:53.861504 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-s8cm2"] Nov 27 16:52:54 crc kubenswrapper[4954]: I1127 16:52:54.678933 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d3df697-c583-4edb-b446-47180f1c212b" 
path="/var/lib/kubelet/pods/0d3df697-c583-4edb-b446-47180f1c212b/volumes" Nov 27 16:53:03 crc kubenswrapper[4954]: I1127 16:53:03.392049 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-89lrn" Nov 27 16:53:16 crc kubenswrapper[4954]: I1127 16:53:16.535826 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qtpvj"] Nov 27 16:53:16 crc kubenswrapper[4954]: E1127 16:53:16.536539 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d3df697-c583-4edb-b446-47180f1c212b" containerName="extract-utilities" Nov 27 16:53:16 crc kubenswrapper[4954]: I1127 16:53:16.536552 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d3df697-c583-4edb-b446-47180f1c212b" containerName="extract-utilities" Nov 27 16:53:16 crc kubenswrapper[4954]: E1127 16:53:16.536569 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d3df697-c583-4edb-b446-47180f1c212b" containerName="registry-server" Nov 27 16:53:16 crc kubenswrapper[4954]: I1127 16:53:16.536588 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d3df697-c583-4edb-b446-47180f1c212b" containerName="registry-server" Nov 27 16:53:16 crc kubenswrapper[4954]: E1127 16:53:16.536602 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d3df697-c583-4edb-b446-47180f1c212b" containerName="extract-content" Nov 27 16:53:16 crc kubenswrapper[4954]: I1127 16:53:16.536608 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d3df697-c583-4edb-b446-47180f1c212b" containerName="extract-content" Nov 27 16:53:16 crc kubenswrapper[4954]: I1127 16:53:16.536715 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d3df697-c583-4edb-b446-47180f1c212b" containerName="registry-server" Nov 27 16:53:16 crc kubenswrapper[4954]: I1127 16:53:16.537507 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qtpvj" Nov 27 16:53:16 crc kubenswrapper[4954]: I1127 16:53:16.539932 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 27 16:53:16 crc kubenswrapper[4954]: I1127 16:53:16.550973 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qtpvj"] Nov 27 16:53:16 crc kubenswrapper[4954]: I1127 16:53:16.643362 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/76711bd9-a588-4492-9d26-0d80376444db-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qtpvj\" (UID: \"76711bd9-a588-4492-9d26-0d80376444db\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qtpvj" Nov 27 16:53:16 crc kubenswrapper[4954]: I1127 16:53:16.643418 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg6fz\" (UniqueName: \"kubernetes.io/projected/76711bd9-a588-4492-9d26-0d80376444db-kube-api-access-wg6fz\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qtpvj\" (UID: \"76711bd9-a588-4492-9d26-0d80376444db\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qtpvj" Nov 27 16:53:16 crc kubenswrapper[4954]: I1127 16:53:16.643507 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/76711bd9-a588-4492-9d26-0d80376444db-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qtpvj\" (UID: \"76711bd9-a588-4492-9d26-0d80376444db\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qtpvj" Nov 27 16:53:16 crc kubenswrapper[4954]: I1127 16:53:16.745280 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/76711bd9-a588-4492-9d26-0d80376444db-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qtpvj\" (UID: \"76711bd9-a588-4492-9d26-0d80376444db\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qtpvj" Nov 27 16:53:16 crc kubenswrapper[4954]: I1127 16:53:16.745388 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/76711bd9-a588-4492-9d26-0d80376444db-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qtpvj\" (UID: \"76711bd9-a588-4492-9d26-0d80376444db\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qtpvj" Nov 27 16:53:16 crc kubenswrapper[4954]: I1127 16:53:16.745418 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg6fz\" (UniqueName: \"kubernetes.io/projected/76711bd9-a588-4492-9d26-0d80376444db-kube-api-access-wg6fz\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qtpvj\" (UID: \"76711bd9-a588-4492-9d26-0d80376444db\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qtpvj" Nov 27 16:53:16 crc kubenswrapper[4954]: I1127 16:53:16.746161 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/76711bd9-a588-4492-9d26-0d80376444db-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qtpvj\" (UID: \"76711bd9-a588-4492-9d26-0d80376444db\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qtpvj" Nov 27 16:53:16 crc kubenswrapper[4954]: I1127 16:53:16.746226 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/76711bd9-a588-4492-9d26-0d80376444db-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qtpvj\" (UID: \"76711bd9-a588-4492-9d26-0d80376444db\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qtpvj" Nov 27 16:53:16 crc kubenswrapper[4954]: I1127 16:53:16.767445 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg6fz\" (UniqueName: \"kubernetes.io/projected/76711bd9-a588-4492-9d26-0d80376444db-kube-api-access-wg6fz\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qtpvj\" (UID: \"76711bd9-a588-4492-9d26-0d80376444db\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qtpvj" Nov 27 16:53:16 crc kubenswrapper[4954]: I1127 16:53:16.855695 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qtpvj" Nov 27 16:53:17 crc kubenswrapper[4954]: I1127 16:53:17.136498 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qtpvj"] Nov 27 16:53:17 crc kubenswrapper[4954]: I1127 16:53:17.941985 4954 generic.go:334] "Generic (PLEG): container finished" podID="76711bd9-a588-4492-9d26-0d80376444db" containerID="6f2cefe8d1bb13fffcd79f2cde78f68042055e7c562075778f5c290f20911197" exitCode=0 Nov 27 16:53:17 crc kubenswrapper[4954]: I1127 16:53:17.942206 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qtpvj" event={"ID":"76711bd9-a588-4492-9d26-0d80376444db","Type":"ContainerDied","Data":"6f2cefe8d1bb13fffcd79f2cde78f68042055e7c562075778f5c290f20911197"} Nov 27 16:53:17 crc kubenswrapper[4954]: I1127 16:53:17.942719 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qtpvj" event={"ID":"76711bd9-a588-4492-9d26-0d80376444db","Type":"ContainerStarted","Data":"1ae451fc1c9164089afae86902976ff4007a3382bffc127e0bccbc303d395a67"} Nov 27 16:53:18 crc kubenswrapper[4954]: I1127 16:53:18.931918 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-s8cm2" podUID="7a3c2a78-4ced-43d5-a3b7-25637f36d2fc" containerName="console" containerID="cri-o://08231d0ba782fcfb8bc3cf9c5180fb9b6e01d8ca06b7c651482abbafc4f67f27" gracePeriod=15 Nov 27 16:53:19 crc kubenswrapper[4954]: I1127 16:53:19.383707 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-s8cm2_7a3c2a78-4ced-43d5-a3b7-25637f36d2fc/console/0.log" Nov 27 16:53:19 crc kubenswrapper[4954]: I1127 16:53:19.384050 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-s8cm2" Nov 27 16:53:19 crc kubenswrapper[4954]: I1127 16:53:19.493700 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7a3c2a78-4ced-43d5-a3b7-25637f36d2fc-console-oauth-config\") pod \"7a3c2a78-4ced-43d5-a3b7-25637f36d2fc\" (UID: \"7a3c2a78-4ced-43d5-a3b7-25637f36d2fc\") " Nov 27 16:53:19 crc kubenswrapper[4954]: I1127 16:53:19.493782 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7a3c2a78-4ced-43d5-a3b7-25637f36d2fc-console-config\") pod \"7a3c2a78-4ced-43d5-a3b7-25637f36d2fc\" (UID: \"7a3c2a78-4ced-43d5-a3b7-25637f36d2fc\") " Nov 27 16:53:19 crc kubenswrapper[4954]: I1127 16:53:19.495025 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7a3c2a78-4ced-43d5-a3b7-25637f36d2fc-service-ca\") pod \"7a3c2a78-4ced-43d5-a3b7-25637f36d2fc\" (UID: \"7a3c2a78-4ced-43d5-a3b7-25637f36d2fc\") " Nov 27 16:53:19 crc kubenswrapper[4954]: I1127 16:53:19.495642 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99ggq\" (UniqueName: \"kubernetes.io/projected/7a3c2a78-4ced-43d5-a3b7-25637f36d2fc-kube-api-access-99ggq\") pod \"7a3c2a78-4ced-43d5-a3b7-25637f36d2fc\" (UID: \"7a3c2a78-4ced-43d5-a3b7-25637f36d2fc\") " Nov 27 16:53:19 crc kubenswrapper[4954]: I1127 16:53:19.496278 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7a3c2a78-4ced-43d5-a3b7-25637f36d2fc-console-serving-cert\") pod \"7a3c2a78-4ced-43d5-a3b7-25637f36d2fc\" (UID: \"7a3c2a78-4ced-43d5-a3b7-25637f36d2fc\") " Nov 27 16:53:19 crc kubenswrapper[4954]: I1127 16:53:19.496442 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7a3c2a78-4ced-43d5-a3b7-25637f36d2fc-oauth-serving-cert\") pod \"7a3c2a78-4ced-43d5-a3b7-25637f36d2fc\" (UID: \"7a3c2a78-4ced-43d5-a3b7-25637f36d2fc\") " Nov 27 16:53:19 crc kubenswrapper[4954]: I1127 16:53:19.494941 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a3c2a78-4ced-43d5-a3b7-25637f36d2fc-console-config" (OuterVolumeSpecName: "console-config") pod "7a3c2a78-4ced-43d5-a3b7-25637f36d2fc" (UID: "7a3c2a78-4ced-43d5-a3b7-25637f36d2fc"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:53:19 crc kubenswrapper[4954]: I1127 16:53:19.495512 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a3c2a78-4ced-43d5-a3b7-25637f36d2fc-service-ca" (OuterVolumeSpecName: "service-ca") pod "7a3c2a78-4ced-43d5-a3b7-25637f36d2fc" (UID: "7a3c2a78-4ced-43d5-a3b7-25637f36d2fc"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:53:19 crc kubenswrapper[4954]: I1127 16:53:19.496649 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a3c2a78-4ced-43d5-a3b7-25637f36d2fc-trusted-ca-bundle\") pod \"7a3c2a78-4ced-43d5-a3b7-25637f36d2fc\" (UID: \"7a3c2a78-4ced-43d5-a3b7-25637f36d2fc\") " Nov 27 16:53:19 crc kubenswrapper[4954]: I1127 16:53:19.497100 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a3c2a78-4ced-43d5-a3b7-25637f36d2fc-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "7a3c2a78-4ced-43d5-a3b7-25637f36d2fc" (UID: "7a3c2a78-4ced-43d5-a3b7-25637f36d2fc"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:53:19 crc kubenswrapper[4954]: I1127 16:53:19.497289 4954 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7a3c2a78-4ced-43d5-a3b7-25637f36d2fc-console-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:53:19 crc kubenswrapper[4954]: I1127 16:53:19.497344 4954 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7a3c2a78-4ced-43d5-a3b7-25637f36d2fc-service-ca\") on node \"crc\" DevicePath \"\"" Nov 27 16:53:19 crc kubenswrapper[4954]: I1127 16:53:19.497358 4954 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7a3c2a78-4ced-43d5-a3b7-25637f36d2fc-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:53:19 crc kubenswrapper[4954]: I1127 16:53:19.497840 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a3c2a78-4ced-43d5-a3b7-25637f36d2fc-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "7a3c2a78-4ced-43d5-a3b7-25637f36d2fc" (UID: "7a3c2a78-4ced-43d5-a3b7-25637f36d2fc"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:53:19 crc kubenswrapper[4954]: I1127 16:53:19.501895 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a3c2a78-4ced-43d5-a3b7-25637f36d2fc-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "7a3c2a78-4ced-43d5-a3b7-25637f36d2fc" (UID: "7a3c2a78-4ced-43d5-a3b7-25637f36d2fc"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:53:19 crc kubenswrapper[4954]: I1127 16:53:19.502174 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a3c2a78-4ced-43d5-a3b7-25637f36d2fc-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "7a3c2a78-4ced-43d5-a3b7-25637f36d2fc" (UID: "7a3c2a78-4ced-43d5-a3b7-25637f36d2fc"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:53:19 crc kubenswrapper[4954]: I1127 16:53:19.513714 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a3c2a78-4ced-43d5-a3b7-25637f36d2fc-kube-api-access-99ggq" (OuterVolumeSpecName: "kube-api-access-99ggq") pod "7a3c2a78-4ced-43d5-a3b7-25637f36d2fc" (UID: "7a3c2a78-4ced-43d5-a3b7-25637f36d2fc"). InnerVolumeSpecName "kube-api-access-99ggq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:53:19 crc kubenswrapper[4954]: I1127 16:53:19.598810 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99ggq\" (UniqueName: \"kubernetes.io/projected/7a3c2a78-4ced-43d5-a3b7-25637f36d2fc-kube-api-access-99ggq\") on node \"crc\" DevicePath \"\"" Nov 27 16:53:19 crc kubenswrapper[4954]: I1127 16:53:19.598854 4954 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7a3c2a78-4ced-43d5-a3b7-25637f36d2fc-console-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:53:19 crc kubenswrapper[4954]: I1127 16:53:19.598863 4954 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a3c2a78-4ced-43d5-a3b7-25637f36d2fc-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:53:19 crc kubenswrapper[4954]: I1127 16:53:19.598871 4954 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7a3c2a78-4ced-43d5-a3b7-25637f36d2fc-console-oauth-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:53:19 crc kubenswrapper[4954]: I1127 16:53:19.960104 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-s8cm2_7a3c2a78-4ced-43d5-a3b7-25637f36d2fc/console/0.log" Nov 27 16:53:19 crc kubenswrapper[4954]: I1127 16:53:19.960168 4954 generic.go:334] "Generic (PLEG): container finished" podID="7a3c2a78-4ced-43d5-a3b7-25637f36d2fc" containerID="08231d0ba782fcfb8bc3cf9c5180fb9b6e01d8ca06b7c651482abbafc4f67f27" exitCode=2 Nov 27 16:53:19 crc kubenswrapper[4954]: I1127 16:53:19.960202 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-s8cm2" event={"ID":"7a3c2a78-4ced-43d5-a3b7-25637f36d2fc","Type":"ContainerDied","Data":"08231d0ba782fcfb8bc3cf9c5180fb9b6e01d8ca06b7c651482abbafc4f67f27"} Nov 27 16:53:19 crc kubenswrapper[4954]: I1127 16:53:19.960234 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-s8cm2" event={"ID":"7a3c2a78-4ced-43d5-a3b7-25637f36d2fc","Type":"ContainerDied","Data":"058d075931949c57176ad3d1cd47955163a6467499a7b25bf5dd0cbcbb7e056f"} Nov 27 16:53:19 crc kubenswrapper[4954]: I1127 16:53:19.960250 4954 scope.go:117] "RemoveContainer" containerID="08231d0ba782fcfb8bc3cf9c5180fb9b6e01d8ca06b7c651482abbafc4f67f27" Nov 27 16:53:19 crc kubenswrapper[4954]: I1127 16:53:19.960288 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-s8cm2" Nov 27 16:53:20 crc kubenswrapper[4954]: I1127 16:53:20.008063 4954 scope.go:117] "RemoveContainer" containerID="08231d0ba782fcfb8bc3cf9c5180fb9b6e01d8ca06b7c651482abbafc4f67f27" Nov 27 16:53:20 crc kubenswrapper[4954]: I1127 16:53:20.008706 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-s8cm2"] Nov 27 16:53:20 crc kubenswrapper[4954]: E1127 16:53:20.009116 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08231d0ba782fcfb8bc3cf9c5180fb9b6e01d8ca06b7c651482abbafc4f67f27\": container with ID starting with 08231d0ba782fcfb8bc3cf9c5180fb9b6e01d8ca06b7c651482abbafc4f67f27 not found: ID does not exist" containerID="08231d0ba782fcfb8bc3cf9c5180fb9b6e01d8ca06b7c651482abbafc4f67f27" Nov 27 16:53:20 crc kubenswrapper[4954]: I1127 16:53:20.009189 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08231d0ba782fcfb8bc3cf9c5180fb9b6e01d8ca06b7c651482abbafc4f67f27"} err="failed to get container status \"08231d0ba782fcfb8bc3cf9c5180fb9b6e01d8ca06b7c651482abbafc4f67f27\": rpc error: code = NotFound desc = could not find container \"08231d0ba782fcfb8bc3cf9c5180fb9b6e01d8ca06b7c651482abbafc4f67f27\": container with ID starting with 08231d0ba782fcfb8bc3cf9c5180fb9b6e01d8ca06b7c651482abbafc4f67f27 not found: ID does not exist" Nov 27 16:53:20 crc kubenswrapper[4954]: I1127 16:53:20.016107 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-s8cm2"] Nov 27 16:53:20 crc kubenswrapper[4954]: I1127 16:53:20.676158 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a3c2a78-4ced-43d5-a3b7-25637f36d2fc" path="/var/lib/kubelet/pods/7a3c2a78-4ced-43d5-a3b7-25637f36d2fc/volumes" Nov 27 16:53:20 crc kubenswrapper[4954]: I1127 16:53:20.970193 4954 generic.go:334] "Generic (PLEG): container finished" podID="76711bd9-a588-4492-9d26-0d80376444db" containerID="5246b3464e363ad369e3cf2891e50dc27658d2401b7dbc4d004bccda23a36ee3" exitCode=0 Nov 27 16:53:20 crc kubenswrapper[4954]: I1127 16:53:20.970250 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qtpvj" event={"ID":"76711bd9-a588-4492-9d26-0d80376444db","Type":"ContainerDied","Data":"5246b3464e363ad369e3cf2891e50dc27658d2401b7dbc4d004bccda23a36ee3"} Nov 27 16:53:21 crc kubenswrapper[4954]: I1127 16:53:21.981507 4954 generic.go:334] "Generic (PLEG): container finished" podID="76711bd9-a588-4492-9d26-0d80376444db" containerID="d83dcfe4e47f4faaaba419ff6d652266f5079d5301d21d7cb8a6b5199d700bc9" exitCode=0 Nov 27 16:53:21 crc kubenswrapper[4954]: I1127 16:53:21.981561 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qtpvj" event={"ID":"76711bd9-a588-4492-9d26-0d80376444db","Type":"ContainerDied","Data":"d83dcfe4e47f4faaaba419ff6d652266f5079d5301d21d7cb8a6b5199d700bc9"} Nov 27 16:53:22 crc kubenswrapper[4954]: I1127 16:53:22.024431 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jqqwd"] Nov 27 16:53:22 crc kubenswrapper[4954]: E1127 16:53:22.025421 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a3c2a78-4ced-43d5-a3b7-25637f36d2fc" containerName="console" Nov 27 16:53:22 crc kubenswrapper[4954]: I1127 16:53:22.025458 
4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a3c2a78-4ced-43d5-a3b7-25637f36d2fc" containerName="console" Nov 27 16:53:22 crc kubenswrapper[4954]: I1127 16:53:22.025680 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a3c2a78-4ced-43d5-a3b7-25637f36d2fc" containerName="console" Nov 27 16:53:22 crc kubenswrapper[4954]: I1127 16:53:22.027124 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jqqwd" Nov 27 16:53:22 crc kubenswrapper[4954]: I1127 16:53:22.035707 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jqqwd"] Nov 27 16:53:22 crc kubenswrapper[4954]: I1127 16:53:22.036572 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p9cc\" (UniqueName: \"kubernetes.io/projected/bd0710b7-3b77-4f5b-b372-5e89efa2c33f-kube-api-access-4p9cc\") pod \"certified-operators-jqqwd\" (UID: \"bd0710b7-3b77-4f5b-b372-5e89efa2c33f\") " pod="openshift-marketplace/certified-operators-jqqwd" Nov 27 16:53:22 crc kubenswrapper[4954]: I1127 16:53:22.036808 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd0710b7-3b77-4f5b-b372-5e89efa2c33f-utilities\") pod \"certified-operators-jqqwd\" (UID: \"bd0710b7-3b77-4f5b-b372-5e89efa2c33f\") " pod="openshift-marketplace/certified-operators-jqqwd" Nov 27 16:53:22 crc kubenswrapper[4954]: I1127 16:53:22.037041 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd0710b7-3b77-4f5b-b372-5e89efa2c33f-catalog-content\") pod \"certified-operators-jqqwd\" (UID: \"bd0710b7-3b77-4f5b-b372-5e89efa2c33f\") " pod="openshift-marketplace/certified-operators-jqqwd" Nov 27 16:53:22 crc kubenswrapper[4954]: I1127 16:53:22.138062 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p9cc\" (UniqueName: \"kubernetes.io/projected/bd0710b7-3b77-4f5b-b372-5e89efa2c33f-kube-api-access-4p9cc\") pod \"certified-operators-jqqwd\" (UID: \"bd0710b7-3b77-4f5b-b372-5e89efa2c33f\") " pod="openshift-marketplace/certified-operators-jqqwd" Nov 27 16:53:22 crc kubenswrapper[4954]: I1127 16:53:22.138155 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd0710b7-3b77-4f5b-b372-5e89efa2c33f-utilities\") pod \"certified-operators-jqqwd\" (UID: \"bd0710b7-3b77-4f5b-b372-5e89efa2c33f\") " pod="openshift-marketplace/certified-operators-jqqwd" Nov 27 16:53:22 crc kubenswrapper[4954]: I1127 16:53:22.138232 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd0710b7-3b77-4f5b-b372-5e89efa2c33f-catalog-content\") pod \"certified-operators-jqqwd\" (UID: \"bd0710b7-3b77-4f5b-b372-5e89efa2c33f\") " pod="openshift-marketplace/certified-operators-jqqwd" Nov 27 16:53:22 crc kubenswrapper[4954]: I1127 16:53:22.138837 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd0710b7-3b77-4f5b-b372-5e89efa2c33f-catalog-content\") pod \"certified-operators-jqqwd\" (UID: \"bd0710b7-3b77-4f5b-b372-5e89efa2c33f\") " pod="openshift-marketplace/certified-operators-jqqwd" Nov 27 16:53:22 crc kubenswrapper[4954]: 
I1127 16:53:22.138901 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd0710b7-3b77-4f5b-b372-5e89efa2c33f-utilities\") pod \"certified-operators-jqqwd\" (UID: \"bd0710b7-3b77-4f5b-b372-5e89efa2c33f\") " pod="openshift-marketplace/certified-operators-jqqwd" Nov 27 16:53:22 crc kubenswrapper[4954]: I1127 16:53:22.156765 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p9cc\" (UniqueName: \"kubernetes.io/projected/bd0710b7-3b77-4f5b-b372-5e89efa2c33f-kube-api-access-4p9cc\") pod \"certified-operators-jqqwd\" (UID: \"bd0710b7-3b77-4f5b-b372-5e89efa2c33f\") " pod="openshift-marketplace/certified-operators-jqqwd" Nov 27 16:53:22 crc kubenswrapper[4954]: I1127 16:53:22.390934 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jqqwd" Nov 27 16:53:22 crc kubenswrapper[4954]: I1127 16:53:22.625983 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jqqwd"] Nov 27 16:53:22 crc kubenswrapper[4954]: I1127 16:53:22.989375 4954 generic.go:334] "Generic (PLEG): container finished" podID="bd0710b7-3b77-4f5b-b372-5e89efa2c33f" containerID="a6af9f042014f39e8912a0e4fea010e8f38a61abc734182c3befaa6398b2edc6" exitCode=0 Nov 27 16:53:22 crc kubenswrapper[4954]: I1127 16:53:22.989473 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jqqwd" event={"ID":"bd0710b7-3b77-4f5b-b372-5e89efa2c33f","Type":"ContainerDied","Data":"a6af9f042014f39e8912a0e4fea010e8f38a61abc734182c3befaa6398b2edc6"} Nov 27 16:53:22 crc kubenswrapper[4954]: I1127 16:53:22.989523 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jqqwd" event={"ID":"bd0710b7-3b77-4f5b-b372-5e89efa2c33f","Type":"ContainerStarted","Data":"a52a5089f5aebdcc9d2d3a76f4c60be3217714cdb15099ddff2d7e4da3087f59"} Nov 27 16:53:23 crc kubenswrapper[4954]: I1127 16:53:23.294427 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qtpvj" Nov 27 16:53:23 crc kubenswrapper[4954]: I1127 16:53:23.457452 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wg6fz\" (UniqueName: \"kubernetes.io/projected/76711bd9-a588-4492-9d26-0d80376444db-kube-api-access-wg6fz\") pod \"76711bd9-a588-4492-9d26-0d80376444db\" (UID: \"76711bd9-a588-4492-9d26-0d80376444db\") " Nov 27 16:53:23 crc kubenswrapper[4954]: I1127 16:53:23.457718 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/76711bd9-a588-4492-9d26-0d80376444db-util\") pod \"76711bd9-a588-4492-9d26-0d80376444db\" (UID: \"76711bd9-a588-4492-9d26-0d80376444db\") " Nov 27 16:53:23 crc kubenswrapper[4954]: I1127 16:53:23.457800 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/76711bd9-a588-4492-9d26-0d80376444db-bundle\") pod \"76711bd9-a588-4492-9d26-0d80376444db\" (UID: \"76711bd9-a588-4492-9d26-0d80376444db\") " Nov 27 16:53:23 crc kubenswrapper[4954]: I1127 16:53:23.458686 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76711bd9-a588-4492-9d26-0d80376444db-bundle" (OuterVolumeSpecName: "bundle") pod "76711bd9-a588-4492-9d26-0d80376444db" (UID: "76711bd9-a588-4492-9d26-0d80376444db"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:53:23 crc kubenswrapper[4954]: I1127 16:53:23.467758 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76711bd9-a588-4492-9d26-0d80376444db-util" (OuterVolumeSpecName: "util") pod "76711bd9-a588-4492-9d26-0d80376444db" (UID: "76711bd9-a588-4492-9d26-0d80376444db"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:53:23 crc kubenswrapper[4954]: I1127 16:53:23.468438 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76711bd9-a588-4492-9d26-0d80376444db-kube-api-access-wg6fz" (OuterVolumeSpecName: "kube-api-access-wg6fz") pod "76711bd9-a588-4492-9d26-0d80376444db" (UID: "76711bd9-a588-4492-9d26-0d80376444db"). InnerVolumeSpecName "kube-api-access-wg6fz". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 27 16:53:23 crc kubenswrapper[4954]: I1127 16:53:23.559154 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wg6fz\" (UniqueName: \"kubernetes.io/projected/76711bd9-a588-4492-9d26-0d80376444db-kube-api-access-wg6fz\") on node \"crc\" DevicePath \"\""
Nov 27 16:53:23 crc kubenswrapper[4954]: I1127 16:53:23.559194 4954 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/76711bd9-a588-4492-9d26-0d80376444db-util\") on node \"crc\" DevicePath \"\""
Nov 27 16:53:23 crc kubenswrapper[4954]: I1127 16:53:23.559206 4954 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/76711bd9-a588-4492-9d26-0d80376444db-bundle\") on node \"crc\" DevicePath \"\""
Nov 27 16:53:23 crc kubenswrapper[4954]: I1127 16:53:23.998766 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jqqwd" event={"ID":"bd0710b7-3b77-4f5b-b372-5e89efa2c33f","Type":"ContainerStarted","Data":"7b7213c712bc80b40a922a2a144627524267cffba5f082cf5aae3775838505d4"}
Nov 27 16:53:24 crc kubenswrapper[4954]: I1127 16:53:24.003722 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qtpvj" event={"ID":"76711bd9-a588-4492-9d26-0d80376444db","Type":"ContainerDied","Data":"1ae451fc1c9164089afae86902976ff4007a3382bffc127e0bccbc303d395a67"}
Nov 27 16:53:24 crc kubenswrapper[4954]: I1127 16:53:24.003757 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ae451fc1c9164089afae86902976ff4007a3382bffc127e0bccbc303d395a67"
Nov 27 16:53:24 crc kubenswrapper[4954]: I1127 16:53:24.004039 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qtpvj"
Nov 27 16:53:25 crc kubenswrapper[4954]: I1127 16:53:25.013842 4954 generic.go:334] "Generic (PLEG): container finished" podID="bd0710b7-3b77-4f5b-b372-5e89efa2c33f" containerID="7b7213c712bc80b40a922a2a144627524267cffba5f082cf5aae3775838505d4" exitCode=0
Nov 27 16:53:25 crc kubenswrapper[4954]: I1127 16:53:25.013926 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jqqwd" event={"ID":"bd0710b7-3b77-4f5b-b372-5e89efa2c33f","Type":"ContainerDied","Data":"7b7213c712bc80b40a922a2a144627524267cffba5f082cf5aae3775838505d4"}
Nov 27 16:53:26 crc kubenswrapper[4954]: I1127 16:53:26.025330 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jqqwd" event={"ID":"bd0710b7-3b77-4f5b-b372-5e89efa2c33f","Type":"ContainerStarted","Data":"a69399b8d0324b7e32c02c7fcfb1fefa5de5806e2241b3e530b1566d29a7b1dc"}
Nov 27 16:53:26 crc kubenswrapper[4954]: I1127 16:53:26.051104 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jqqwd" podStartSLOduration=2.528008379 podStartE2EDuration="5.051080432s" podCreationTimestamp="2025-11-27 16:53:21 +0000 UTC" firstStartedPulling="2025-11-27 16:53:22.992428821 +0000 UTC m=+915.009869121" lastFinishedPulling="2025-11-27 16:53:25.515500834 +0000 UTC m=+917.532941174" observedRunningTime="2025-11-27 16:53:26.049839242 +0000 UTC m=+918.067279552" watchObservedRunningTime="2025-11-27 16:53:26.051080432 +0000 UTC m=+918.068520742"
Nov 27 16:53:28 crc kubenswrapper[4954]: I1127 16:53:28.021089 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7gd68"]
Nov 27 16:53:28 crc kubenswrapper[4954]: E1127 16:53:28.021660 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76711bd9-a588-4492-9d26-0d80376444db" containerName="extract"
Nov 27 16:53:28 crc kubenswrapper[4954]: I1127 16:53:28.021672 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="76711bd9-a588-4492-9d26-0d80376444db" containerName="extract"
Nov 27 16:53:28 crc kubenswrapper[4954]: E1127 16:53:28.021681 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76711bd9-a588-4492-9d26-0d80376444db" containerName="pull"
Nov 27 16:53:28 crc kubenswrapper[4954]: I1127 16:53:28.021687 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="76711bd9-a588-4492-9d26-0d80376444db" containerName="pull"
Nov 27 16:53:28 crc kubenswrapper[4954]: E1127 16:53:28.021700 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76711bd9-a588-4492-9d26-0d80376444db" containerName="util"
Nov 27 16:53:28 crc kubenswrapper[4954]: I1127 16:53:28.021706 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="76711bd9-a588-4492-9d26-0d80376444db" containerName="util"
Nov 27 16:53:28 crc kubenswrapper[4954]: I1127 16:53:28.021809 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="76711bd9-a588-4492-9d26-0d80376444db" containerName="extract"
Nov 27 16:53:28 crc kubenswrapper[4954]: I1127 16:53:28.022572 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7gd68"
Nov 27 16:53:28 crc kubenswrapper[4954]: I1127 16:53:28.035054 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7gd68"]
Nov 27 16:53:28 crc kubenswrapper[4954]: I1127 16:53:28.126937 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43590cab-d1f3-44fe-af1a-5c2fc2c82c78-utilities\") pod \"redhat-marketplace-7gd68\" (UID: \"43590cab-d1f3-44fe-af1a-5c2fc2c82c78\") " pod="openshift-marketplace/redhat-marketplace-7gd68"
Nov 27 16:53:28 crc kubenswrapper[4954]: I1127 16:53:28.127005 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bz78\" (UniqueName: \"kubernetes.io/projected/43590cab-d1f3-44fe-af1a-5c2fc2c82c78-kube-api-access-6bz78\") pod \"redhat-marketplace-7gd68\" (UID: \"43590cab-d1f3-44fe-af1a-5c2fc2c82c78\") " pod="openshift-marketplace/redhat-marketplace-7gd68"
Nov 27 16:53:28 crc kubenswrapper[4954]: I1127 16:53:28.127166 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43590cab-d1f3-44fe-af1a-5c2fc2c82c78-catalog-content\") pod \"redhat-marketplace-7gd68\" (UID: \"43590cab-d1f3-44fe-af1a-5c2fc2c82c78\") " pod="openshift-marketplace/redhat-marketplace-7gd68"
Nov 27 16:53:28 crc kubenswrapper[4954]: I1127 16:53:28.228613 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43590cab-d1f3-44fe-af1a-5c2fc2c82c78-utilities\") pod \"redhat-marketplace-7gd68\" (UID: \"43590cab-d1f3-44fe-af1a-5c2fc2c82c78\") " pod="openshift-marketplace/redhat-marketplace-7gd68"
Nov 27 16:53:28 crc kubenswrapper[4954]: I1127 16:53:28.228683 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bz78\" (UniqueName: \"kubernetes.io/projected/43590cab-d1f3-44fe-af1a-5c2fc2c82c78-kube-api-access-6bz78\") pod \"redhat-marketplace-7gd68\" (UID: \"43590cab-d1f3-44fe-af1a-5c2fc2c82c78\") " pod="openshift-marketplace/redhat-marketplace-7gd68"
Nov 27 16:53:28 crc kubenswrapper[4954]: I1127 16:53:28.228728 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43590cab-d1f3-44fe-af1a-5c2fc2c82c78-catalog-content\") pod \"redhat-marketplace-7gd68\" (UID: \"43590cab-d1f3-44fe-af1a-5c2fc2c82c78\") " pod="openshift-marketplace/redhat-marketplace-7gd68"
Nov 27 16:53:28 crc kubenswrapper[4954]: I1127 16:53:28.229371 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43590cab-d1f3-44fe-af1a-5c2fc2c82c78-catalog-content\") pod \"redhat-marketplace-7gd68\" (UID: \"43590cab-d1f3-44fe-af1a-5c2fc2c82c78\") " pod="openshift-marketplace/redhat-marketplace-7gd68"
Nov 27 16:53:28 crc kubenswrapper[4954]: I1127 16:53:28.229507 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43590cab-d1f3-44fe-af1a-5c2fc2c82c78-utilities\") pod \"redhat-marketplace-7gd68\" (UID: \"43590cab-d1f3-44fe-af1a-5c2fc2c82c78\") " pod="openshift-marketplace/redhat-marketplace-7gd68"
Nov 27 16:53:28 crc kubenswrapper[4954]: I1127 16:53:28.259066 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bz78\" (UniqueName: \"kubernetes.io/projected/43590cab-d1f3-44fe-af1a-5c2fc2c82c78-kube-api-access-6bz78\") pod \"redhat-marketplace-7gd68\" (UID: \"43590cab-d1f3-44fe-af1a-5c2fc2c82c78\") " pod="openshift-marketplace/redhat-marketplace-7gd68"
Nov 27 16:53:28 crc kubenswrapper[4954]: I1127 16:53:28.339227 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7gd68"
Nov 27 16:53:28 crc kubenswrapper[4954]: I1127 16:53:28.773282 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7gd68"]
Nov 27 16:53:29 crc kubenswrapper[4954]: I1127 16:53:29.048170 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7gd68" event={"ID":"43590cab-d1f3-44fe-af1a-5c2fc2c82c78","Type":"ContainerStarted","Data":"96acb81e27e271f8e00d996f1c275d45297bd47582ec9a9232bc9404f1a30f14"}
Nov 27 16:53:29 crc kubenswrapper[4954]: I1127 16:53:29.048515 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7gd68" event={"ID":"43590cab-d1f3-44fe-af1a-5c2fc2c82c78","Type":"ContainerStarted","Data":"09dab455e35cfcd42091bc8ab70f4289551e3b4ac7df7a30733506246b9775bc"}
Nov 27 16:53:30 crc kubenswrapper[4954]: I1127 16:53:30.055053 4954 generic.go:334] "Generic (PLEG): container finished" podID="43590cab-d1f3-44fe-af1a-5c2fc2c82c78" containerID="96acb81e27e271f8e00d996f1c275d45297bd47582ec9a9232bc9404f1a30f14" exitCode=0
Nov 27 16:53:30 crc kubenswrapper[4954]: I1127 16:53:30.055097 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7gd68" event={"ID":"43590cab-d1f3-44fe-af1a-5c2fc2c82c78","Type":"ContainerDied","Data":"96acb81e27e271f8e00d996f1c275d45297bd47582ec9a9232bc9404f1a30f14"}
Nov 27 16:53:31 crc kubenswrapper[4954]: I1127 16:53:31.062399 4954 generic.go:334] "Generic (PLEG): container finished" podID="43590cab-d1f3-44fe-af1a-5c2fc2c82c78" containerID="322bb1b739eec1e0f4a8aab7b6993b77bf30f9efffa4b82ada1b8a3cb862a636" exitCode=0
Nov 27 16:53:31 crc kubenswrapper[4954]: I1127 16:53:31.062501 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7gd68" event={"ID":"43590cab-d1f3-44fe-af1a-5c2fc2c82c78","Type":"ContainerDied","Data":"322bb1b739eec1e0f4a8aab7b6993b77bf30f9efffa4b82ada1b8a3cb862a636"}
Nov 27 16:53:32 crc kubenswrapper[4954]: I1127 16:53:32.071019 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7gd68" event={"ID":"43590cab-d1f3-44fe-af1a-5c2fc2c82c78","Type":"ContainerStarted","Data":"8a7183284fb158234ee922da7cc4a3e27668c1b6067c421bd6ba023e2bdbbd53"}
Nov 27 16:53:32 crc kubenswrapper[4954]: I1127 16:53:32.091411 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7gd68" podStartSLOduration=3.458417593 podStartE2EDuration="5.091393999s" podCreationTimestamp="2025-11-27 16:53:27 +0000 UTC" firstStartedPulling="2025-11-27 16:53:30.05645434 +0000 UTC m=+922.073894640" lastFinishedPulling="2025-11-27 16:53:31.689430726 +0000 UTC m=+923.706871046" observedRunningTime="2025-11-27 16:53:32.090087197 +0000 UTC m=+924.107527497" watchObservedRunningTime="2025-11-27 16:53:32.091393999 +0000 UTC m=+924.108834299"
Nov 27 16:53:32 crc kubenswrapper[4954]: I1127 16:53:32.391443 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jqqwd"
Nov 27 16:53:32 crc kubenswrapper[4954]: I1127 16:53:32.391507 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jqqwd"
Nov 27 16:53:32 crc kubenswrapper[4954]: I1127 16:53:32.446942 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jqqwd"
Nov 27 16:53:33 crc kubenswrapper[4954]: I1127 16:53:33.066864 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5d9ff7464f-4f4jv"]
Nov 27 16:53:33 crc kubenswrapper[4954]: I1127 16:53:33.067683 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5d9ff7464f-4f4jv"
Nov 27 16:53:33 crc kubenswrapper[4954]: I1127 16:53:33.070093 4954 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Nov 27 16:53:33 crc kubenswrapper[4954]: I1127 16:53:33.070453 4954 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Nov 27 16:53:33 crc kubenswrapper[4954]: I1127 16:53:33.072175 4954 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-lfb46"
Nov 27 16:53:33 crc kubenswrapper[4954]: I1127 16:53:33.072282 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Nov 27 16:53:33 crc kubenswrapper[4954]: I1127 16:53:33.072325 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Nov 27 16:53:33 crc kubenswrapper[4954]: I1127 16:53:33.102946 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5d9ff7464f-4f4jv"]
Nov 27 16:53:33 crc kubenswrapper[4954]: I1127 16:53:33.134590 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jqqwd"
Nov 27 16:53:33 crc kubenswrapper[4954]: I1127 16:53:33.193901 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/52ee24fe-968b-440d-8884-5772e253c8b4-webhook-cert\") pod \"metallb-operator-controller-manager-5d9ff7464f-4f4jv\" (UID: \"52ee24fe-968b-440d-8884-5772e253c8b4\") " pod="metallb-system/metallb-operator-controller-manager-5d9ff7464f-4f4jv"
Nov 27 16:53:33 crc kubenswrapper[4954]: I1127 16:53:33.193985 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/52ee24fe-968b-440d-8884-5772e253c8b4-apiservice-cert\") pod \"metallb-operator-controller-manager-5d9ff7464f-4f4jv\" (UID: \"52ee24fe-968b-440d-8884-5772e253c8b4\") " pod="metallb-system/metallb-operator-controller-manager-5d9ff7464f-4f4jv"
Nov 27 16:53:33 crc kubenswrapper[4954]: I1127 16:53:33.194018 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg2lj\" (UniqueName: \"kubernetes.io/projected/52ee24fe-968b-440d-8884-5772e253c8b4-kube-api-access-rg2lj\") pod \"metallb-operator-controller-manager-5d9ff7464f-4f4jv\" (UID: \"52ee24fe-968b-440d-8884-5772e253c8b4\") " pod="metallb-system/metallb-operator-controller-manager-5d9ff7464f-4f4jv"
Nov 27 16:53:33 crc kubenswrapper[4954]: I1127 16:53:33.296468 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/52ee24fe-968b-440d-8884-5772e253c8b4-apiservice-cert\") pod \"metallb-operator-controller-manager-5d9ff7464f-4f4jv\" (UID: \"52ee24fe-968b-440d-8884-5772e253c8b4\") " pod="metallb-system/metallb-operator-controller-manager-5d9ff7464f-4f4jv"
Nov 27 16:53:33 crc kubenswrapper[4954]: I1127 16:53:33.296551 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rg2lj\" (UniqueName: \"kubernetes.io/projected/52ee24fe-968b-440d-8884-5772e253c8b4-kube-api-access-rg2lj\") pod \"metallb-operator-controller-manager-5d9ff7464f-4f4jv\" (UID: \"52ee24fe-968b-440d-8884-5772e253c8b4\") " pod="metallb-system/metallb-operator-controller-manager-5d9ff7464f-4f4jv"
Nov 27 16:53:33 crc kubenswrapper[4954]: I1127 16:53:33.296630 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/52ee24fe-968b-440d-8884-5772e253c8b4-webhook-cert\") pod \"metallb-operator-controller-manager-5d9ff7464f-4f4jv\" (UID: \"52ee24fe-968b-440d-8884-5772e253c8b4\") " pod="metallb-system/metallb-operator-controller-manager-5d9ff7464f-4f4jv"
Nov 27 16:53:33 crc kubenswrapper[4954]: I1127 16:53:33.302710 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/52ee24fe-968b-440d-8884-5772e253c8b4-apiservice-cert\") pod \"metallb-operator-controller-manager-5d9ff7464f-4f4jv\" (UID: \"52ee24fe-968b-440d-8884-5772e253c8b4\") " pod="metallb-system/metallb-operator-controller-manager-5d9ff7464f-4f4jv"
Nov 27 16:53:33 crc kubenswrapper[4954]: I1127 16:53:33.313944 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/52ee24fe-968b-440d-8884-5772e253c8b4-webhook-cert\") pod \"metallb-operator-controller-manager-5d9ff7464f-4f4jv\" (UID: \"52ee24fe-968b-440d-8884-5772e253c8b4\") " pod="metallb-system/metallb-operator-controller-manager-5d9ff7464f-4f4jv"
Nov 27 16:53:33 crc kubenswrapper[4954]: I1127 16:53:33.322320 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg2lj\" (UniqueName: \"kubernetes.io/projected/52ee24fe-968b-440d-8884-5772e253c8b4-kube-api-access-rg2lj\") pod \"metallb-operator-controller-manager-5d9ff7464f-4f4jv\" (UID: \"52ee24fe-968b-440d-8884-5772e253c8b4\") " pod="metallb-system/metallb-operator-controller-manager-5d9ff7464f-4f4jv"
Nov 27 16:53:33 crc kubenswrapper[4954]: I1127 16:53:33.388586 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5d9ff7464f-4f4jv"
Nov 27 16:53:33 crc kubenswrapper[4954]: I1127 16:53:33.400377 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-6c64f4dc9b-sq87v"]
Nov 27 16:53:33 crc kubenswrapper[4954]: I1127 16:53:33.401801 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6c64f4dc9b-sq87v"
Nov 27 16:53:33 crc kubenswrapper[4954]: I1127 16:53:33.405470 4954 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Nov 27 16:53:33 crc kubenswrapper[4954]: I1127 16:53:33.408706 4954 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Nov 27 16:53:33 crc kubenswrapper[4954]: I1127 16:53:33.408913 4954 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-pn6wf"
Nov 27 16:53:33 crc kubenswrapper[4954]: I1127 16:53:33.426648 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6c64f4dc9b-sq87v"]
Nov 27 16:53:33 crc kubenswrapper[4954]: I1127 16:53:33.503297 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkm9l\" (UniqueName: \"kubernetes.io/projected/0b1812ac-de14-42bf-acbf-d6a68650bb93-kube-api-access-zkm9l\") pod \"metallb-operator-webhook-server-6c64f4dc9b-sq87v\" (UID: \"0b1812ac-de14-42bf-acbf-d6a68650bb93\") " pod="metallb-system/metallb-operator-webhook-server-6c64f4dc9b-sq87v"
Nov 27 16:53:33 crc kubenswrapper[4954]: I1127 16:53:33.503355 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0b1812ac-de14-42bf-acbf-d6a68650bb93-webhook-cert\") pod \"metallb-operator-webhook-server-6c64f4dc9b-sq87v\" (UID: \"0b1812ac-de14-42bf-acbf-d6a68650bb93\") " pod="metallb-system/metallb-operator-webhook-server-6c64f4dc9b-sq87v"
Nov 27 16:53:33 crc kubenswrapper[4954]: I1127 16:53:33.503403 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0b1812ac-de14-42bf-acbf-d6a68650bb93-apiservice-cert\") pod \"metallb-operator-webhook-server-6c64f4dc9b-sq87v\" (UID: \"0b1812ac-de14-42bf-acbf-d6a68650bb93\") " pod="metallb-system/metallb-operator-webhook-server-6c64f4dc9b-sq87v"
Nov 27 16:53:33 crc kubenswrapper[4954]: I1127 16:53:33.604552 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkm9l\" (UniqueName: \"kubernetes.io/projected/0b1812ac-de14-42bf-acbf-d6a68650bb93-kube-api-access-zkm9l\") pod \"metallb-operator-webhook-server-6c64f4dc9b-sq87v\" (UID: \"0b1812ac-de14-42bf-acbf-d6a68650bb93\") " pod="metallb-system/metallb-operator-webhook-server-6c64f4dc9b-sq87v"
Nov 27 16:53:33 crc kubenswrapper[4954]: I1127 16:53:33.604612 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0b1812ac-de14-42bf-acbf-d6a68650bb93-webhook-cert\") pod \"metallb-operator-webhook-server-6c64f4dc9b-sq87v\" (UID: \"0b1812ac-de14-42bf-acbf-d6a68650bb93\") " pod="metallb-system/metallb-operator-webhook-server-6c64f4dc9b-sq87v"
Nov 27 16:53:33 crc kubenswrapper[4954]: I1127 16:53:33.604660 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0b1812ac-de14-42bf-acbf-d6a68650bb93-apiservice-cert\") pod \"metallb-operator-webhook-server-6c64f4dc9b-sq87v\" (UID: \"0b1812ac-de14-42bf-acbf-d6a68650bb93\") " pod="metallb-system/metallb-operator-webhook-server-6c64f4dc9b-sq87v"
Nov 27 16:53:33 crc kubenswrapper[4954]: I1127 16:53:33.613511 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0b1812ac-de14-42bf-acbf-d6a68650bb93-webhook-cert\") pod \"metallb-operator-webhook-server-6c64f4dc9b-sq87v\" (UID: \"0b1812ac-de14-42bf-acbf-d6a68650bb93\") " pod="metallb-system/metallb-operator-webhook-server-6c64f4dc9b-sq87v"
Nov 27 16:53:33 crc kubenswrapper[4954]: I1127 16:53:33.613782 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0b1812ac-de14-42bf-acbf-d6a68650bb93-apiservice-cert\") pod \"metallb-operator-webhook-server-6c64f4dc9b-sq87v\" (UID: \"0b1812ac-de14-42bf-acbf-d6a68650bb93\") " pod="metallb-system/metallb-operator-webhook-server-6c64f4dc9b-sq87v"
Nov 27 16:53:33 crc kubenswrapper[4954]: I1127 16:53:33.652397 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkm9l\" (UniqueName: \"kubernetes.io/projected/0b1812ac-de14-42bf-acbf-d6a68650bb93-kube-api-access-zkm9l\") pod \"metallb-operator-webhook-server-6c64f4dc9b-sq87v\" (UID: \"0b1812ac-de14-42bf-acbf-d6a68650bb93\") " pod="metallb-system/metallb-operator-webhook-server-6c64f4dc9b-sq87v"
Nov 27 16:53:33 crc kubenswrapper[4954]: I1127 16:53:33.727185 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5d9ff7464f-4f4jv"]
Nov 27 16:53:33 crc kubenswrapper[4954]: W1127 16:53:33.733769 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52ee24fe_968b_440d_8884_5772e253c8b4.slice/crio-0aa272bd145d6418d799eca6e38eeb29af1666078fea595faf821d31ce49181d WatchSource:0}: Error finding container 0aa272bd145d6418d799eca6e38eeb29af1666078fea595faf821d31ce49181d: Status 404 returned error can't find the container with id 0aa272bd145d6418d799eca6e38eeb29af1666078fea595faf821d31ce49181d
Nov 27 16:53:33 crc kubenswrapper[4954]: I1127 16:53:33.743996 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6c64f4dc9b-sq87v"
Nov 27 16:53:33 crc kubenswrapper[4954]: I1127 16:53:33.968725 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6c64f4dc9b-sq87v"]
Nov 27 16:53:33 crc kubenswrapper[4954]: W1127 16:53:33.974201 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b1812ac_de14_42bf_acbf_d6a68650bb93.slice/crio-eb0ce8af432ee8ef582fd9824fc6c75beec7cd8a84b26961b50b5599297d319c WatchSource:0}: Error finding container eb0ce8af432ee8ef582fd9824fc6c75beec7cd8a84b26961b50b5599297d319c: Status 404 returned error can't find the container with id eb0ce8af432ee8ef582fd9824fc6c75beec7cd8a84b26961b50b5599297d319c
Nov 27 16:53:34 crc kubenswrapper[4954]: I1127 16:53:34.104411 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6c64f4dc9b-sq87v" event={"ID":"0b1812ac-de14-42bf-acbf-d6a68650bb93","Type":"ContainerStarted","Data":"eb0ce8af432ee8ef582fd9824fc6c75beec7cd8a84b26961b50b5599297d319c"}
Nov 27 16:53:34 crc kubenswrapper[4954]: I1127 16:53:34.109546 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5d9ff7464f-4f4jv" event={"ID":"52ee24fe-968b-440d-8884-5772e253c8b4","Type":"ContainerStarted","Data":"0aa272bd145d6418d799eca6e38eeb29af1666078fea595faf821d31ce49181d"}
Nov 27 16:53:36 crc kubenswrapper[4954]: I1127 16:53:36.010193 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jqqwd"]
Nov 27 16:53:36 crc kubenswrapper[4954]: I1127 16:53:36.012974 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jqqwd" podUID="bd0710b7-3b77-4f5b-b372-5e89efa2c33f" containerName="registry-server" containerID="cri-o://a69399b8d0324b7e32c02c7fcfb1fefa5de5806e2241b3e530b1566d29a7b1dc" gracePeriod=2
Nov 27 16:53:36 crc kubenswrapper[4954]: I1127 16:53:36.616821 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jqqwd"
Nov 27 16:53:36 crc kubenswrapper[4954]: I1127 16:53:36.657341 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd0710b7-3b77-4f5b-b372-5e89efa2c33f-utilities\") pod \"bd0710b7-3b77-4f5b-b372-5e89efa2c33f\" (UID: \"bd0710b7-3b77-4f5b-b372-5e89efa2c33f\") "
Nov 27 16:53:36 crc kubenswrapper[4954]: I1127 16:53:36.657402 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd0710b7-3b77-4f5b-b372-5e89efa2c33f-catalog-content\") pod \"bd0710b7-3b77-4f5b-b372-5e89efa2c33f\" (UID: \"bd0710b7-3b77-4f5b-b372-5e89efa2c33f\") "
Nov 27 16:53:36 crc kubenswrapper[4954]: I1127 16:53:36.657486 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4p9cc\" (UniqueName: \"kubernetes.io/projected/bd0710b7-3b77-4f5b-b372-5e89efa2c33f-kube-api-access-4p9cc\") pod \"bd0710b7-3b77-4f5b-b372-5e89efa2c33f\" (UID: \"bd0710b7-3b77-4f5b-b372-5e89efa2c33f\") "
Nov 27 16:53:36 crc kubenswrapper[4954]: I1127 16:53:36.658196 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd0710b7-3b77-4f5b-b372-5e89efa2c33f-utilities" (OuterVolumeSpecName: "utilities") pod "bd0710b7-3b77-4f5b-b372-5e89efa2c33f" (UID: "bd0710b7-3b77-4f5b-b372-5e89efa2c33f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 27 16:53:36 crc kubenswrapper[4954]: I1127 16:53:36.667162 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd0710b7-3b77-4f5b-b372-5e89efa2c33f-kube-api-access-4p9cc" (OuterVolumeSpecName: "kube-api-access-4p9cc") pod "bd0710b7-3b77-4f5b-b372-5e89efa2c33f" (UID: "bd0710b7-3b77-4f5b-b372-5e89efa2c33f"). InnerVolumeSpecName "kube-api-access-4p9cc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 27 16:53:36 crc kubenswrapper[4954]: I1127 16:53:36.730093 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd0710b7-3b77-4f5b-b372-5e89efa2c33f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bd0710b7-3b77-4f5b-b372-5e89efa2c33f" (UID: "bd0710b7-3b77-4f5b-b372-5e89efa2c33f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 27 16:53:36 crc kubenswrapper[4954]: I1127 16:53:36.759981 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd0710b7-3b77-4f5b-b372-5e89efa2c33f-utilities\") on node \"crc\" DevicePath \"\""
Nov 27 16:53:36 crc kubenswrapper[4954]: I1127 16:53:36.760467 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd0710b7-3b77-4f5b-b372-5e89efa2c33f-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 27 16:53:36 crc kubenswrapper[4954]: I1127 16:53:36.760480 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4p9cc\" (UniqueName: \"kubernetes.io/projected/bd0710b7-3b77-4f5b-b372-5e89efa2c33f-kube-api-access-4p9cc\") on node \"crc\" DevicePath \"\""
Nov 27 16:53:37 crc kubenswrapper[4954]: I1127 16:53:37.130297 4954 generic.go:334] "Generic (PLEG): container finished" podID="bd0710b7-3b77-4f5b-b372-5e89efa2c33f" containerID="a69399b8d0324b7e32c02c7fcfb1fefa5de5806e2241b3e530b1566d29a7b1dc" exitCode=0
Nov 27 16:53:37 crc kubenswrapper[4954]: I1127 16:53:37.130341 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jqqwd" event={"ID":"bd0710b7-3b77-4f5b-b372-5e89efa2c33f","Type":"ContainerDied","Data":"a69399b8d0324b7e32c02c7fcfb1fefa5de5806e2241b3e530b1566d29a7b1dc"}
Nov 27 16:53:37 crc kubenswrapper[4954]: I1127 16:53:37.130372 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jqqwd" event={"ID":"bd0710b7-3b77-4f5b-b372-5e89efa2c33f","Type":"ContainerDied","Data":"a52a5089f5aebdcc9d2d3a76f4c60be3217714cdb15099ddff2d7e4da3087f59"}
Nov 27 16:53:37 crc kubenswrapper[4954]: I1127 16:53:37.130392 4954 scope.go:117] "RemoveContainer" containerID="a69399b8d0324b7e32c02c7fcfb1fefa5de5806e2241b3e530b1566d29a7b1dc"
Nov 27 16:53:37 crc kubenswrapper[4954]: I1127 16:53:37.130501 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jqqwd"
Nov 27 16:53:37 crc kubenswrapper[4954]: I1127 16:53:37.164237 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jqqwd"]
Nov 27 16:53:37 crc kubenswrapper[4954]: I1127 16:53:37.165489 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jqqwd"]
Nov 27 16:53:39 crc kubenswrapper[4954]: I1127 16:53:39.056993 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd0710b7-3b77-4f5b-b372-5e89efa2c33f" path="/var/lib/kubelet/pods/bd0710b7-3b77-4f5b-b372-5e89efa2c33f/volumes"
Nov 27 16:53:39 crc kubenswrapper[4954]: I1127 16:53:39.060357 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7gd68"
Nov 27 16:53:39 crc kubenswrapper[4954]: I1127 16:53:39.060434 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7gd68"
Nov 27 16:53:39 crc kubenswrapper[4954]: I1127 16:53:39.145074 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7gd68"
Nov 27 16:53:39 crc kubenswrapper[4954]: I1127 16:53:39.214274 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7gd68"
Nov 27 16:53:40 crc kubenswrapper[4954]: I1127 16:53:40.288324 4954 scope.go:117] "RemoveContainer" containerID="7b7213c712bc80b40a922a2a144627524267cffba5f082cf5aae3775838505d4"
Nov 27 16:53:40 crc kubenswrapper[4954]: I1127 16:53:40.315839 4954 scope.go:117] "RemoveContainer" containerID="a6af9f042014f39e8912a0e4fea010e8f38a61abc734182c3befaa6398b2edc6"
Nov 27 16:53:40 crc kubenswrapper[4954]: I1127 16:53:40.340427 4954 scope.go:117] "RemoveContainer" containerID="a69399b8d0324b7e32c02c7fcfb1fefa5de5806e2241b3e530b1566d29a7b1dc"
Nov 27 16:53:40 crc kubenswrapper[4954]: E1127 16:53:40.341666 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a69399b8d0324b7e32c02c7fcfb1fefa5de5806e2241b3e530b1566d29a7b1dc\": container with ID starting with a69399b8d0324b7e32c02c7fcfb1fefa5de5806e2241b3e530b1566d29a7b1dc not found: ID does not exist" containerID="a69399b8d0324b7e32c02c7fcfb1fefa5de5806e2241b3e530b1566d29a7b1dc"
Nov 27 16:53:40 crc kubenswrapper[4954]: I1127 16:53:40.341726 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a69399b8d0324b7e32c02c7fcfb1fefa5de5806e2241b3e530b1566d29a7b1dc"} err="failed to get container status \"a69399b8d0324b7e32c02c7fcfb1fefa5de5806e2241b3e530b1566d29a7b1dc\": rpc error: code = NotFound desc = could not find container \"a69399b8d0324b7e32c02c7fcfb1fefa5de5806e2241b3e530b1566d29a7b1dc\": container with ID starting with a69399b8d0324b7e32c02c7fcfb1fefa5de5806e2241b3e530b1566d29a7b1dc not found: ID does not exist"
Nov 27 16:53:40 crc kubenswrapper[4954]: I1127 16:53:40.341764 4954 scope.go:117] "RemoveContainer" containerID="7b7213c712bc80b40a922a2a144627524267cffba5f082cf5aae3775838505d4"
Nov 27 16:53:40 crc kubenswrapper[4954]: E1127 16:53:40.346930 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b7213c712bc80b40a922a2a144627524267cffba5f082cf5aae3775838505d4\": container with ID starting with 7b7213c712bc80b40a922a2a144627524267cffba5f082cf5aae3775838505d4 not found: ID does not exist" containerID="7b7213c712bc80b40a922a2a144627524267cffba5f082cf5aae3775838505d4"
Nov 27 16:53:40 crc kubenswrapper[4954]: I1127 16:53:40.346965 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b7213c712bc80b40a922a2a144627524267cffba5f082cf5aae3775838505d4"} err="failed to get container status \"7b7213c712bc80b40a922a2a144627524267cffba5f082cf5aae3775838505d4\": rpc error: code = NotFound desc = could not find container \"7b7213c712bc80b40a922a2a144627524267cffba5f082cf5aae3775838505d4\": container with ID starting with 7b7213c712bc80b40a922a2a144627524267cffba5f082cf5aae3775838505d4 not found: ID does not exist"
Nov 27 16:53:40 crc kubenswrapper[4954]: I1127 16:53:40.346990 4954 scope.go:117] "RemoveContainer" containerID="a6af9f042014f39e8912a0e4fea010e8f38a61abc734182c3befaa6398b2edc6"
Nov 27 16:53:40 crc kubenswrapper[4954]: E1127 16:53:40.347653 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6af9f042014f39e8912a0e4fea010e8f38a61abc734182c3befaa6398b2edc6\": container with ID starting with a6af9f042014f39e8912a0e4fea010e8f38a61abc734182c3befaa6398b2edc6 not found: ID does not exist" containerID="a6af9f042014f39e8912a0e4fea010e8f38a61abc734182c3befaa6398b2edc6"
Nov 27 16:53:40 crc kubenswrapper[4954]: I1127 16:53:40.347729 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6af9f042014f39e8912a0e4fea010e8f38a61abc734182c3befaa6398b2edc6"} err="failed to get container status \"a6af9f042014f39e8912a0e4fea010e8f38a61abc734182c3befaa6398b2edc6\": rpc error: code = NotFound desc = could not find container \"a6af9f042014f39e8912a0e4fea010e8f38a61abc734182c3befaa6398b2edc6\": container with ID starting with a6af9f042014f39e8912a0e4fea010e8f38a61abc734182c3befaa6398b2edc6 not found: ID does not exist"
Nov 27 16:53:41 crc kubenswrapper[4954]: I1127 16:53:41.167319 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6c64f4dc9b-sq87v" event={"ID":"0b1812ac-de14-42bf-acbf-d6a68650bb93","Type":"ContainerStarted","Data":"c7836a1c6f9af561c5bbb84be1c57f90291974ce6e885a4386b24ce95b93ae8d"}
Nov 27 16:53:41 crc kubenswrapper[4954]: I1127 16:53:41.167803 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-6c64f4dc9b-sq87v"
Nov 27 16:53:41 crc kubenswrapper[4954]: I1127 16:53:41.169696 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5d9ff7464f-4f4jv" event={"ID":"52ee24fe-968b-440d-8884-5772e253c8b4","Type":"ContainerStarted","Data":"d0f402652a247027bb2db191c9c6597cac6cba780896bee55f97ac94e36dbdc2"}
Nov 27 16:53:41 crc kubenswrapper[4954]: I1127 16:53:41.169862 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5d9ff7464f-4f4jv"
Nov 27 16:53:41 crc kubenswrapper[4954]: I1127 16:53:41.210097 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-6c64f4dc9b-sq87v" podStartSLOduration=1.847311685 podStartE2EDuration="8.210076668s" podCreationTimestamp="2025-11-27 16:53:33 +0000 UTC" firstStartedPulling="2025-11-27 16:53:33.978162189 +0000 UTC m=+925.995602509" lastFinishedPulling="2025-11-27 16:53:40.340927192 +0000 UTC m=+932.358367492" observedRunningTime="2025-11-27 16:53:41.209766911 +0000 UTC m=+933.227207211" watchObservedRunningTime="2025-11-27 16:53:41.210076668 +0000 UTC m=+933.227516968"
Nov 27 16:53:41 crc kubenswrapper[4954]: I1127 16:53:41.246479 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5d9ff7464f-4f4jv" podStartSLOduration=1.660049981 podStartE2EDuration="8.246460644s" podCreationTimestamp="2025-11-27 16:53:33 +0000 UTC" firstStartedPulling="2025-11-27 16:53:33.736608853 +0000 UTC m=+925.754049153" lastFinishedPulling="2025-11-27 16:53:40.323019506 +0000 UTC m=+932.340459816" observedRunningTime="2025-11-27 16:53:41.244148058 +0000 UTC m=+933.261588378" watchObservedRunningTime="2025-11-27 16:53:41.246460644 +0000 UTC m=+933.263900944"
Nov 27 16:53:42 crc kubenswrapper[4954]: I1127 16:53:42.609315 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7gd68"]
Nov 27 16:53:42 crc kubenswrapper[4954]: I1127 16:53:42.609565 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7gd68" podUID="43590cab-d1f3-44fe-af1a-5c2fc2c82c78" containerName="registry-server" containerID="cri-o://8a7183284fb158234ee922da7cc4a3e27668c1b6067c421bd6ba023e2bdbbd53" gracePeriod=2
Nov 27 16:53:42 crc kubenswrapper[4954]: I1127 16:53:42.998567 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7gd68"
Nov 27 16:53:43 crc kubenswrapper[4954]: I1127 16:53:43.091413 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43590cab-d1f3-44fe-af1a-5c2fc2c82c78-utilities\") pod \"43590cab-d1f3-44fe-af1a-5c2fc2c82c78\" (UID: \"43590cab-d1f3-44fe-af1a-5c2fc2c82c78\") "
Nov 27 16:53:43 crc kubenswrapper[4954]: I1127 16:53:43.091641 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43590cab-d1f3-44fe-af1a-5c2fc2c82c78-catalog-content\") pod \"43590cab-d1f3-44fe-af1a-5c2fc2c82c78\" (UID: \"43590cab-d1f3-44fe-af1a-5c2fc2c82c78\") "
Nov 27 16:53:43 crc kubenswrapper[4954]: I1127 16:53:43.091687 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bz78\" (UniqueName: \"kubernetes.io/projected/43590cab-d1f3-44fe-af1a-5c2fc2c82c78-kube-api-access-6bz78\") pod \"43590cab-d1f3-44fe-af1a-5c2fc2c82c78\" (UID: \"43590cab-d1f3-44fe-af1a-5c2fc2c82c78\") "
Nov 27 16:53:43 crc kubenswrapper[4954]: I1127 16:53:43.092504 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43590cab-d1f3-44fe-af1a-5c2fc2c82c78-utilities" (OuterVolumeSpecName: "utilities") pod "43590cab-d1f3-44fe-af1a-5c2fc2c82c78" (UID: "43590cab-d1f3-44fe-af1a-5c2fc2c82c78"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 27 16:53:43 crc kubenswrapper[4954]: I1127 16:53:43.099794 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43590cab-d1f3-44fe-af1a-5c2fc2c82c78-kube-api-access-6bz78" (OuterVolumeSpecName: "kube-api-access-6bz78") pod "43590cab-d1f3-44fe-af1a-5c2fc2c82c78" (UID: "43590cab-d1f3-44fe-af1a-5c2fc2c82c78"). InnerVolumeSpecName "kube-api-access-6bz78". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 27 16:53:43 crc kubenswrapper[4954]: I1127 16:53:43.112208 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43590cab-d1f3-44fe-af1a-5c2fc2c82c78-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "43590cab-d1f3-44fe-af1a-5c2fc2c82c78" (UID: "43590cab-d1f3-44fe-af1a-5c2fc2c82c78"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 27 16:53:43 crc kubenswrapper[4954]: I1127 16:53:43.193290 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43590cab-d1f3-44fe-af1a-5c2fc2c82c78-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 27 16:53:43 crc kubenswrapper[4954]: I1127 16:53:43.193680 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bz78\" (UniqueName: \"kubernetes.io/projected/43590cab-d1f3-44fe-af1a-5c2fc2c82c78-kube-api-access-6bz78\") on node \"crc\" DevicePath \"\""
Nov 27 16:53:43 crc kubenswrapper[4954]: I1127 16:53:43.193694 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43590cab-d1f3-44fe-af1a-5c2fc2c82c78-utilities\") on node \"crc\" DevicePath \"\""
Nov 27 16:53:43 crc kubenswrapper[4954]: I1127 16:53:43.201945 4954 generic.go:334] "Generic (PLEG): container finished" podID="43590cab-d1f3-44fe-af1a-5c2fc2c82c78" containerID="8a7183284fb158234ee922da7cc4a3e27668c1b6067c421bd6ba023e2bdbbd53" exitCode=0
Nov 27 16:53:43 crc kubenswrapper[4954]: I1127 16:53:43.202016 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7gd68"
Nov 27 16:53:43 crc kubenswrapper[4954]: I1127 16:53:43.202029 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7gd68" event={"ID":"43590cab-d1f3-44fe-af1a-5c2fc2c82c78","Type":"ContainerDied","Data":"8a7183284fb158234ee922da7cc4a3e27668c1b6067c421bd6ba023e2bdbbd53"}
Nov 27 16:53:43 crc kubenswrapper[4954]: I1127 16:53:43.202532 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7gd68" event={"ID":"43590cab-d1f3-44fe-af1a-5c2fc2c82c78","Type":"ContainerDied","Data":"09dab455e35cfcd42091bc8ab70f4289551e3b4ac7df7a30733506246b9775bc"}
Nov 27 16:53:43 crc kubenswrapper[4954]: I1127 16:53:43.202557 4954 scope.go:117] "RemoveContainer" containerID="8a7183284fb158234ee922da7cc4a3e27668c1b6067c421bd6ba023e2bdbbd53"
Nov 27 16:53:43 crc kubenswrapper[4954]: I1127 16:53:43.221482 4954 scope.go:117] "RemoveContainer" containerID="322bb1b739eec1e0f4a8aab7b6993b77bf30f9efffa4b82ada1b8a3cb862a636"
Nov 27 16:53:43 crc kubenswrapper[4954]: I1127 16:53:43.241604 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7gd68"]
Nov 27 16:53:43 crc kubenswrapper[4954]: I1127 16:53:43.247942 4954 scope.go:117] "RemoveContainer" containerID="96acb81e27e271f8e00d996f1c275d45297bd47582ec9a9232bc9404f1a30f14"
Nov 27 16:53:43 crc kubenswrapper[4954]: I1127 16:53:43.251224 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7gd68"]
Nov 27 16:53:43 crc kubenswrapper[4954]: I1127 16:53:43.271452 4954 scope.go:117] "RemoveContainer" containerID="8a7183284fb158234ee922da7cc4a3e27668c1b6067c421bd6ba023e2bdbbd53"
Nov 27 16:53:43 crc kubenswrapper[4954]: E1127 16:53:43.272196 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a7183284fb158234ee922da7cc4a3e27668c1b6067c421bd6ba023e2bdbbd53\": container with ID starting with 8a7183284fb158234ee922da7cc4a3e27668c1b6067c421bd6ba023e2bdbbd53 not found: ID does not exist" containerID="8a7183284fb158234ee922da7cc4a3e27668c1b6067c421bd6ba023e2bdbbd53"
Nov 27 16:53:43 crc kubenswrapper[4954]: I1127 16:53:43.272261 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a7183284fb158234ee922da7cc4a3e27668c1b6067c421bd6ba023e2bdbbd53"} err="failed to get container status \"8a7183284fb158234ee922da7cc4a3e27668c1b6067c421bd6ba023e2bdbbd53\": rpc error: code = NotFound desc = could not find container \"8a7183284fb158234ee922da7cc4a3e27668c1b6067c421bd6ba023e2bdbbd53\": container with ID starting with 8a7183284fb158234ee922da7cc4a3e27668c1b6067c421bd6ba023e2bdbbd53 not found: ID does not exist"
Nov 27 16:53:43 crc kubenswrapper[4954]: I1127 16:53:43.272299 4954 scope.go:117] "RemoveContainer" containerID="322bb1b739eec1e0f4a8aab7b6993b77bf30f9efffa4b82ada1b8a3cb862a636"
Nov 27 16:53:43 crc kubenswrapper[4954]: E1127 16:53:43.272937 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"322bb1b739eec1e0f4a8aab7b6993b77bf30f9efffa4b82ada1b8a3cb862a636\": container with ID starting with 322bb1b739eec1e0f4a8aab7b6993b77bf30f9efffa4b82ada1b8a3cb862a636 not found: ID does not exist" containerID="322bb1b739eec1e0f4a8aab7b6993b77bf30f9efffa4b82ada1b8a3cb862a636"
Nov 27 16:53:43 crc kubenswrapper[4954]: I1127 16:53:43.272987 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"322bb1b739eec1e0f4a8aab7b6993b77bf30f9efffa4b82ada1b8a3cb862a636"} err="failed to get container status \"322bb1b739eec1e0f4a8aab7b6993b77bf30f9efffa4b82ada1b8a3cb862a636\": rpc error: code = NotFound desc = could not find container \"322bb1b739eec1e0f4a8aab7b6993b77bf30f9efffa4b82ada1b8a3cb862a636\": container with ID starting with 322bb1b739eec1e0f4a8aab7b6993b77bf30f9efffa4b82ada1b8a3cb862a636 not found: ID does not exist"
Nov 27 16:53:43 crc kubenswrapper[4954]: I1127 16:53:43.273024 4954 scope.go:117] "RemoveContainer" containerID="96acb81e27e271f8e00d996f1c275d45297bd47582ec9a9232bc9404f1a30f14"
Nov 27 16:53:43 crc kubenswrapper[4954]: E1127 16:53:43.273711 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96acb81e27e271f8e00d996f1c275d45297bd47582ec9a9232bc9404f1a30f14\": container with ID starting with 96acb81e27e271f8e00d996f1c275d45297bd47582ec9a9232bc9404f1a30f14 not found: ID does not exist" containerID="96acb81e27e271f8e00d996f1c275d45297bd47582ec9a9232bc9404f1a30f14"
Nov 27 16:53:43 crc kubenswrapper[4954]: I1127 16:53:43.273817 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96acb81e27e271f8e00d996f1c275d45297bd47582ec9a9232bc9404f1a30f14"} err="failed to get container status \"96acb81e27e271f8e00d996f1c275d45297bd47582ec9a9232bc9404f1a30f14\": rpc error: code = NotFound desc = could not find container \"96acb81e27e271f8e00d996f1c275d45297bd47582ec9a9232bc9404f1a30f14\": container with ID starting with 96acb81e27e271f8e00d996f1c275d45297bd47582ec9a9232bc9404f1a30f14 not found: ID does not exist"
Nov 27 16:53:44 crc kubenswrapper[4954]: I1127 16:53:44.670565 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43590cab-d1f3-44fe-af1a-5c2fc2c82c78" path="/var/lib/kubelet/pods/43590cab-d1f3-44fe-af1a-5c2fc2c82c78/volumes"
Nov 27 16:53:44 crc kubenswrapper[4954]: I1127 16:53:44.816149 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2w494"]
Nov 27 16:53:44 crc kubenswrapper[4954]: E1127 16:53:44.816434 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd0710b7-3b77-4f5b-b372-5e89efa2c33f" containerName="extract-utilities"
Nov 27 16:53:44 crc kubenswrapper[4954]: I1127 16:53:44.816448 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd0710b7-3b77-4f5b-b372-5e89efa2c33f" containerName="extract-utilities"
Nov 27 16:53:44 crc kubenswrapper[4954]: E1127 16:53:44.816461 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd0710b7-3b77-4f5b-b372-5e89efa2c33f" containerName="extract-content"
Nov 27 16:53:44 crc kubenswrapper[4954]: I1127 16:53:44.816468 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd0710b7-3b77-4f5b-b372-5e89efa2c33f" containerName="extract-content"
Nov 27 16:53:44 crc kubenswrapper[4954]: E1127 16:53:44.816478 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43590cab-d1f3-44fe-af1a-5c2fc2c82c78" containerName="extract-content"
Nov 27 16:53:44 crc kubenswrapper[4954]: I1127 16:53:44.816485 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="43590cab-d1f3-44fe-af1a-5c2fc2c82c78" containerName="extract-content"
Nov 27 16:53:44 crc kubenswrapper[4954]: E1127 16:53:44.816493 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43590cab-d1f3-44fe-af1a-5c2fc2c82c78" containerName="extract-utilities"
Nov 27 16:53:44 crc kubenswrapper[4954]: I1127 16:53:44.816502 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="43590cab-d1f3-44fe-af1a-5c2fc2c82c78" containerName="extract-utilities"
Nov 27 16:53:44 crc kubenswrapper[4954]: E1127 16:53:44.816517 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43590cab-d1f3-44fe-af1a-5c2fc2c82c78" containerName="registry-server"
Nov 27 16:53:44 crc kubenswrapper[4954]: I1127 16:53:44.816523 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="43590cab-d1f3-44fe-af1a-5c2fc2c82c78" containerName="registry-server"
Nov 27 16:53:44 crc kubenswrapper[4954]: E1127 16:53:44.816538 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd0710b7-3b77-4f5b-b372-5e89efa2c33f" containerName="registry-server"
Nov 27 16:53:44 crc kubenswrapper[4954]: I1127 16:53:44.816544 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd0710b7-3b77-4f5b-b372-5e89efa2c33f" containerName="registry-server"
Nov 27 16:53:44 crc kubenswrapper[4954]: I1127 16:53:44.816683 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd0710b7-3b77-4f5b-b372-5e89efa2c33f" containerName="registry-server"
Nov 27 16:53:44 crc kubenswrapper[4954]: I1127 16:53:44.816694 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="43590cab-d1f3-44fe-af1a-5c2fc2c82c78" containerName="registry-server"
Nov 27 16:53:44 crc kubenswrapper[4954]: I1127 16:53:44.817632 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2w494"
Nov 27 16:53:44 crc kubenswrapper[4954]: I1127 16:53:44.832664 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2w494"]
Nov 27 16:53:44 crc kubenswrapper[4954]: I1127 16:53:44.921982 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05d33bd4-1653-484d-883c-efe96e3bf60d-catalog-content\") pod \"community-operators-2w494\" (UID: \"05d33bd4-1653-484d-883c-efe96e3bf60d\") " pod="openshift-marketplace/community-operators-2w494"
Nov 27 16:53:44 crc kubenswrapper[4954]: I1127 16:53:44.922057 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05d33bd4-1653-484d-883c-efe96e3bf60d-utilities\") pod \"community-operators-2w494\" (UID: \"05d33bd4-1653-484d-883c-efe96e3bf60d\") " pod="openshift-marketplace/community-operators-2w494"
Nov 27 16:53:44 crc kubenswrapper[4954]: I1127 16:53:44.922084 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rddvm\" (UniqueName: \"kubernetes.io/projected/05d33bd4-1653-484d-883c-efe96e3bf60d-kube-api-access-rddvm\") pod \"community-operators-2w494\" (UID: \"05d33bd4-1653-484d-883c-efe96e3bf60d\") " pod="openshift-marketplace/community-operators-2w494"
Nov 27 16:53:45 crc kubenswrapper[4954]: I1127 16:53:45.023454 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05d33bd4-1653-484d-883c-efe96e3bf60d-catalog-content\") pod \"community-operators-2w494\" (UID: \"05d33bd4-1653-484d-883c-efe96e3bf60d\") " pod="openshift-marketplace/community-operators-2w494"
Nov 27 16:53:45 crc kubenswrapper[4954]: I1127 16:53:45.023870 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05d33bd4-1653-484d-883c-efe96e3bf60d-utilities\") pod \"community-operators-2w494\" (UID: \"05d33bd4-1653-484d-883c-efe96e3bf60d\") " pod="openshift-marketplace/community-operators-2w494"
Nov 27 16:53:45 crc kubenswrapper[4954]: I1127 16:53:45.024073 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rddvm\" (UniqueName: \"kubernetes.io/projected/05d33bd4-1653-484d-883c-efe96e3bf60d-kube-api-access-rddvm\") pod \"community-operators-2w494\" (UID: \"05d33bd4-1653-484d-883c-efe96e3bf60d\") " pod="openshift-marketplace/community-operators-2w494"
Nov 27 16:53:45 crc kubenswrapper[4954]: I1127 16:53:45.024258 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05d33bd4-1653-484d-883c-efe96e3bf60d-catalog-content\") pod \"community-operators-2w494\" (UID: \"05d33bd4-1653-484d-883c-efe96e3bf60d\") " pod="openshift-marketplace/community-operators-2w494"
Nov 27 16:53:45 crc kubenswrapper[4954]: I1127 16:53:45.024861 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05d33bd4-1653-484d-883c-efe96e3bf60d-utilities\") pod \"community-operators-2w494\" (UID: \"05d33bd4-1653-484d-883c-efe96e3bf60d\") " pod="openshift-marketplace/community-operators-2w494"
Nov 27 16:53:45 crc kubenswrapper[4954]: I1127 16:53:45.044350 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rddvm\" (UniqueName: \"kubernetes.io/projected/05d33bd4-1653-484d-883c-efe96e3bf60d-kube-api-access-rddvm\") pod \"community-operators-2w494\" (UID: \"05d33bd4-1653-484d-883c-efe96e3bf60d\") " pod="openshift-marketplace/community-operators-2w494"
Nov 27 16:53:45 crc kubenswrapper[4954]: I1127 16:53:45.132634 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2w494"
Nov 27 16:53:45 crc kubenswrapper[4954]: I1127 16:53:45.456307 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2w494"]
Nov 27 16:53:46 crc kubenswrapper[4954]: I1127 16:53:46.228947 4954 generic.go:334] "Generic (PLEG): container finished" podID="05d33bd4-1653-484d-883c-efe96e3bf60d" containerID="b3281bee0a5258c4b9ce2e905ed6e1bf81d9828619e409450f5e05e5f2d6536d" exitCode=0
Nov 27 16:53:46 crc kubenswrapper[4954]: I1127 16:53:46.229023 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2w494" event={"ID":"05d33bd4-1653-484d-883c-efe96e3bf60d","Type":"ContainerDied","Data":"b3281bee0a5258c4b9ce2e905ed6e1bf81d9828619e409450f5e05e5f2d6536d"}
Nov 27 16:53:46 crc kubenswrapper[4954]: I1127 16:53:46.229093 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2w494" event={"ID":"05d33bd4-1653-484d-883c-efe96e3bf60d","Type":"ContainerStarted","Data":"cf8907c955e65bf3fed4c6a540194802a8c496f42102f2d398a6e79dddada9e7"}
Nov 27 16:53:48 crc kubenswrapper[4954]: I1127 16:53:48.242746 4954 generic.go:334] "Generic (PLEG): container finished" podID="05d33bd4-1653-484d-883c-efe96e3bf60d" containerID="edd42123348bad5894327a18ee31789249ee12c67270a49b70c5353557272c8a" exitCode=0
Nov 27 16:53:48 crc kubenswrapper[4954]: I1127 16:53:48.242814 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2w494" event={"ID":"05d33bd4-1653-484d-883c-efe96e3bf60d","Type":"ContainerDied","Data":"edd42123348bad5894327a18ee31789249ee12c67270a49b70c5353557272c8a"}
Nov 27 16:53:49 crc kubenswrapper[4954]: I1127 16:53:49.250794 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2w494" event={"ID":"05d33bd4-1653-484d-883c-efe96e3bf60d","Type":"ContainerStarted","Data":"c8f5eb4672231a04c4c49e0454a010648403e74cfe916093b372c70e28887e0f"}
Nov 27 16:53:49 crc kubenswrapper[4954]: I1127 16:53:49.276282 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2w494" podStartSLOduration=2.468965186 podStartE2EDuration="5.276263023s" podCreationTimestamp="2025-11-27 16:53:44 +0000 UTC" firstStartedPulling="2025-11-27 16:53:46.231541421 +0000 UTC m=+938.248981761" lastFinishedPulling="2025-11-27 16:53:49.038839298 +0000 UTC m=+941.056279598" observedRunningTime="2025-11-27 16:53:49.274833558 +0000 UTC m=+941.292273858" watchObservedRunningTime="2025-11-27 16:53:49.276263023 +0000 UTC m=+941.293703323"
Nov 27 16:53:53 crc kubenswrapper[4954]: I1127 16:53:53.750551 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-6c64f4dc9b-sq87v"
Nov 27 16:53:55 crc kubenswrapper[4954]: I1127 16:53:55.132832 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2w494"
Nov 27 16:53:55 crc kubenswrapper[4954]: I1127 16:53:55.132914 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2w494"
Nov 27 16:53:55 crc kubenswrapper[4954]: I1127 16:53:55.192784 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2w494"
Nov 27 16:53:55 crc kubenswrapper[4954]: I1127 16:53:55.353480 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2w494"
Nov 27 16:53:55 crc kubenswrapper[4954]: I1127 16:53:55.438346 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2w494"]
Nov 27 16:53:57 crc kubenswrapper[4954]: I1127 16:53:57.301451 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2w494" podUID="05d33bd4-1653-484d-883c-efe96e3bf60d" containerName="registry-server" containerID="cri-o://c8f5eb4672231a04c4c49e0454a010648403e74cfe916093b372c70e28887e0f" gracePeriod=2
Nov 27 16:53:57 crc kubenswrapper[4954]: I1127 16:53:57.761643 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2w494"
Nov 27 16:53:57 crc kubenswrapper[4954]: I1127 16:53:57.927168 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05d33bd4-1653-484d-883c-efe96e3bf60d-catalog-content\") pod \"05d33bd4-1653-484d-883c-efe96e3bf60d\" (UID: \"05d33bd4-1653-484d-883c-efe96e3bf60d\") "
Nov 27 16:53:57 crc kubenswrapper[4954]: I1127 16:53:57.927871 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05d33bd4-1653-484d-883c-efe96e3bf60d-utilities\") pod \"05d33bd4-1653-484d-883c-efe96e3bf60d\" (UID: \"05d33bd4-1653-484d-883c-efe96e3bf60d\") "
Nov 27 16:53:57 crc kubenswrapper[4954]: I1127 16:53:57.928015 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rddvm\" (UniqueName: \"kubernetes.io/projected/05d33bd4-1653-484d-883c-efe96e3bf60d-kube-api-access-rddvm\") pod \"05d33bd4-1653-484d-883c-efe96e3bf60d\" (UID: \"05d33bd4-1653-484d-883c-efe96e3bf60d\") "
Nov 27 16:53:57 crc kubenswrapper[4954]: I1127 16:53:57.929438 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05d33bd4-1653-484d-883c-efe96e3bf60d-utilities" (OuterVolumeSpecName: "utilities") pod "05d33bd4-1653-484d-883c-efe96e3bf60d" (UID: "05d33bd4-1653-484d-883c-efe96e3bf60d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 27 16:53:57 crc kubenswrapper[4954]: I1127 16:53:57.934737 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05d33bd4-1653-484d-883c-efe96e3bf60d-kube-api-access-rddvm" (OuterVolumeSpecName: "kube-api-access-rddvm") pod "05d33bd4-1653-484d-883c-efe96e3bf60d" (UID: "05d33bd4-1653-484d-883c-efe96e3bf60d"). InnerVolumeSpecName "kube-api-access-rddvm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 27 16:53:57 crc kubenswrapper[4954]: I1127 16:53:57.981096 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05d33bd4-1653-484d-883c-efe96e3bf60d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "05d33bd4-1653-484d-883c-efe96e3bf60d" (UID: "05d33bd4-1653-484d-883c-efe96e3bf60d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 27 16:53:58 crc kubenswrapper[4954]: I1127 16:53:58.029830 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05d33bd4-1653-484d-883c-efe96e3bf60d-utilities\") on node \"crc\" DevicePath \"\""
Nov 27 16:53:58 crc kubenswrapper[4954]: I1127 16:53:58.029864 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rddvm\" (UniqueName: \"kubernetes.io/projected/05d33bd4-1653-484d-883c-efe96e3bf60d-kube-api-access-rddvm\") on node \"crc\" DevicePath \"\""
Nov 27 16:53:58 crc kubenswrapper[4954]: I1127 16:53:58.030004 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05d33bd4-1653-484d-883c-efe96e3bf60d-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 27 16:53:58 crc kubenswrapper[4954]: I1127 16:53:58.311170 4954 generic.go:334] "Generic (PLEG): container finished" podID="05d33bd4-1653-484d-883c-efe96e3bf60d" containerID="c8f5eb4672231a04c4c49e0454a010648403e74cfe916093b372c70e28887e0f" exitCode=0
Nov 27 16:53:58 crc kubenswrapper[4954]: I1127 16:53:58.311228 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2w494" event={"ID":"05d33bd4-1653-484d-883c-efe96e3bf60d","Type":"ContainerDied","Data":"c8f5eb4672231a04c4c49e0454a010648403e74cfe916093b372c70e28887e0f"}
Nov 27 16:53:58 crc kubenswrapper[4954]: I1127 16:53:58.311265 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2w494" event={"ID":"05d33bd4-1653-484d-883c-efe96e3bf60d","Type":"ContainerDied","Data":"cf8907c955e65bf3fed4c6a540194802a8c496f42102f2d398a6e79dddada9e7"}
Nov 27 16:53:58 crc kubenswrapper[4954]: I1127 16:53:58.311274 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2w494"
Nov 27 16:53:58 crc kubenswrapper[4954]: I1127 16:53:58.311295 4954 scope.go:117] "RemoveContainer" containerID="c8f5eb4672231a04c4c49e0454a010648403e74cfe916093b372c70e28887e0f"
Nov 27 16:53:58 crc kubenswrapper[4954]: I1127 16:53:58.326601 4954 scope.go:117] "RemoveContainer" containerID="edd42123348bad5894327a18ee31789249ee12c67270a49b70c5353557272c8a"
Nov 27 16:53:58 crc kubenswrapper[4954]: I1127 16:53:58.342640 4954 scope.go:117] "RemoveContainer" containerID="b3281bee0a5258c4b9ce2e905ed6e1bf81d9828619e409450f5e05e5f2d6536d"
Nov 27 16:53:58 crc kubenswrapper[4954]: I1127 16:53:58.351272 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2w494"]
Nov 27 16:53:58 crc kubenswrapper[4954]: I1127 16:53:58.358397 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2w494"]
Nov 27 16:53:58 crc kubenswrapper[4954]: I1127 16:53:58.367076 4954 scope.go:117] "RemoveContainer" containerID="c8f5eb4672231a04c4c49e0454a010648403e74cfe916093b372c70e28887e0f"
Nov 27 16:53:58 crc kubenswrapper[4954]: E1127 16:53:58.367691 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8f5eb4672231a04c4c49e0454a010648403e74cfe916093b372c70e28887e0f\": container with ID starting with c8f5eb4672231a04c4c49e0454a010648403e74cfe916093b372c70e28887e0f not found: ID does not exist" containerID="c8f5eb4672231a04c4c49e0454a010648403e74cfe916093b372c70e28887e0f"
Nov 27 16:53:58 crc kubenswrapper[4954]: I1127 16:53:58.367768 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8f5eb4672231a04c4c49e0454a010648403e74cfe916093b372c70e28887e0f"} err="failed to get container status \"c8f5eb4672231a04c4c49e0454a010648403e74cfe916093b372c70e28887e0f\": rpc error: code = NotFound desc = could not find container \"c8f5eb4672231a04c4c49e0454a010648403e74cfe916093b372c70e28887e0f\": container with ID starting with c8f5eb4672231a04c4c49e0454a010648403e74cfe916093b372c70e28887e0f not found: ID does not exist"
Nov 27 16:53:58 crc kubenswrapper[4954]: I1127 16:53:58.367819 4954 scope.go:117] "RemoveContainer" containerID="edd42123348bad5894327a18ee31789249ee12c67270a49b70c5353557272c8a"
Nov 27 16:53:58 crc kubenswrapper[4954]: E1127 16:53:58.368243 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edd42123348bad5894327a18ee31789249ee12c67270a49b70c5353557272c8a\": container with ID starting with edd42123348bad5894327a18ee31789249ee12c67270a49b70c5353557272c8a not found: ID does not exist" containerID="edd42123348bad5894327a18ee31789249ee12c67270a49b70c5353557272c8a"
Nov 27 16:53:58 crc kubenswrapper[4954]: I1127 16:53:58.368323 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edd42123348bad5894327a18ee31789249ee12c67270a49b70c5353557272c8a"} err="failed to get container status \"edd42123348bad5894327a18ee31789249ee12c67270a49b70c5353557272c8a\": rpc error: code = NotFound desc = could not find container \"edd42123348bad5894327a18ee31789249ee12c67270a49b70c5353557272c8a\": container with ID starting with edd42123348bad5894327a18ee31789249ee12c67270a49b70c5353557272c8a not found: ID does not exist"
Nov 27 16:53:58 crc kubenswrapper[4954]: I1127 16:53:58.368370 4954 scope.go:117] "RemoveContainer"
containerID="b3281bee0a5258c4b9ce2e905ed6e1bf81d9828619e409450f5e05e5f2d6536d" Nov 27 16:53:58 crc kubenswrapper[4954]: E1127 16:53:58.368837 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3281bee0a5258c4b9ce2e905ed6e1bf81d9828619e409450f5e05e5f2d6536d\": container with ID starting with b3281bee0a5258c4b9ce2e905ed6e1bf81d9828619e409450f5e05e5f2d6536d not found: ID does not exist" containerID="b3281bee0a5258c4b9ce2e905ed6e1bf81d9828619e409450f5e05e5f2d6536d" Nov 27 16:53:58 crc kubenswrapper[4954]: I1127 16:53:58.368882 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3281bee0a5258c4b9ce2e905ed6e1bf81d9828619e409450f5e05e5f2d6536d"} err="failed to get container status \"b3281bee0a5258c4b9ce2e905ed6e1bf81d9828619e409450f5e05e5f2d6536d\": rpc error: code = NotFound desc = could not find container \"b3281bee0a5258c4b9ce2e905ed6e1bf81d9828619e409450f5e05e5f2d6536d\": container with ID starting with b3281bee0a5258c4b9ce2e905ed6e1bf81d9828619e409450f5e05e5f2d6536d not found: ID does not exist" Nov 27 16:53:58 crc kubenswrapper[4954]: I1127 16:53:58.670010 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05d33bd4-1653-484d-883c-efe96e3bf60d" path="/var/lib/kubelet/pods/05d33bd4-1653-484d-883c-efe96e3bf60d/volumes" Nov 27 16:54:13 crc kubenswrapper[4954]: I1127 16:54:13.392325 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5d9ff7464f-4f4jv" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.157559 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-f6v24"] Nov 27 16:54:14 crc kubenswrapper[4954]: E1127 16:54:14.157888 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05d33bd4-1653-484d-883c-efe96e3bf60d" containerName="registry-server" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.157904 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="05d33bd4-1653-484d-883c-efe96e3bf60d" containerName="registry-server" Nov 27 16:54:14 crc kubenswrapper[4954]: E1127 16:54:14.157919 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05d33bd4-1653-484d-883c-efe96e3bf60d" containerName="extract-content" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.157926 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="05d33bd4-1653-484d-883c-efe96e3bf60d" containerName="extract-content" Nov 27 16:54:14 crc kubenswrapper[4954]: E1127 16:54:14.157945 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05d33bd4-1653-484d-883c-efe96e3bf60d" containerName="extract-utilities" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.157954 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="05d33bd4-1653-484d-883c-efe96e3bf60d" containerName="extract-utilities" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.158080 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="05d33bd4-1653-484d-883c-efe96e3bf60d" containerName="registry-server" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.158657 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-f6v24" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.164012 4954 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.164056 4954 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-fvh7h" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.173204 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-9psn7"] Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.176290 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-9psn7" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.179448 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-f6v24"] Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.181075 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.188554 4954 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.268648 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-ql5zn"] Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.269848 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-ql5zn" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.272104 4954 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.274528 4954 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.275465 4954 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-n6crr" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.278078 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.290161 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-l6mbj"] Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.293383 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-l6mbj" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.296987 4954 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.297373 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/008cad91-d45f-4942-9e82-239acf3fb8ed-metallb-excludel2\") pod \"speaker-ql5zn\" (UID: \"008cad91-d45f-4942-9e82-239acf3fb8ed\") " pod="metallb-system/speaker-ql5zn" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.297459 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/008cad91-d45f-4942-9e82-239acf3fb8ed-metrics-certs\") pod \"speaker-ql5zn\" (UID: \"008cad91-d45f-4942-9e82-239acf3fb8ed\") " pod="metallb-system/speaker-ql5zn" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.297491 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlthc\" (UniqueName: \"kubernetes.io/projected/008cad91-d45f-4942-9e82-239acf3fb8ed-kube-api-access-tlthc\") pod \"speaker-ql5zn\" (UID: \"008cad91-d45f-4942-9e82-239acf3fb8ed\") " pod="metallb-system/speaker-ql5zn" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.297528 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/7cde0cd2-0d4c-411e-b857-8488be2e2f0f-reloader\") pod \"frr-k8s-9psn7\" (UID: \"7cde0cd2-0d4c-411e-b857-8488be2e2f0f\") " pod="metallb-system/frr-k8s-9psn7" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.297553 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjqkm\" (UniqueName: \"kubernetes.io/projected/6792e473-15c3-405b-8c32-007e421b40c6-kube-api-access-mjqkm\") pod \"frr-k8s-webhook-server-7fcb986d4-f6v24\" (UID: \"6792e473-15c3-405b-8c32-007e421b40c6\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-f6v24" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.297616 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6792e473-15c3-405b-8c32-007e421b40c6-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-f6v24\" (UID: \"6792e473-15c3-405b-8c32-007e421b40c6\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-f6v24" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.297667 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7cde0cd2-0d4c-411e-b857-8488be2e2f0f-metrics-certs\") pod \"frr-k8s-9psn7\" (UID: \"7cde0cd2-0d4c-411e-b857-8488be2e2f0f\") " pod="metallb-system/frr-k8s-9psn7" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.297708 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/7cde0cd2-0d4c-411e-b857-8488be2e2f0f-frr-sockets\") pod \"frr-k8s-9psn7\" (UID: \"7cde0cd2-0d4c-411e-b857-8488be2e2f0f\") " pod="metallb-system/frr-k8s-9psn7" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.297827 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/33df22a6-6a0f-445c-8a77-ad9cfb09d3d4-metrics-certs\") pod \"controller-f8648f98b-l6mbj\" (UID: \"33df22a6-6a0f-445c-8a77-ad9cfb09d3d4\") " pod="metallb-system/controller-f8648f98b-l6mbj" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.297884 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/7cde0cd2-0d4c-411e-b857-8488be2e2f0f-frr-startup\") pod \"frr-k8s-9psn7\" (UID: \"7cde0cd2-0d4c-411e-b857-8488be2e2f0f\") " pod="metallb-system/frr-k8s-9psn7" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.297926 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fth2r\" (UniqueName: \"kubernetes.io/projected/7cde0cd2-0d4c-411e-b857-8488be2e2f0f-kube-api-access-fth2r\") pod \"frr-k8s-9psn7\" (UID: \"7cde0cd2-0d4c-411e-b857-8488be2e2f0f\") " pod="metallb-system/frr-k8s-9psn7" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.297946 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/7cde0cd2-0d4c-411e-b857-8488be2e2f0f-metrics\") pod \"frr-k8s-9psn7\" (UID: \"7cde0cd2-0d4c-411e-b857-8488be2e2f0f\") " pod="metallb-system/frr-k8s-9psn7" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.297981 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33df22a6-6a0f-445c-8a77-ad9cfb09d3d4-cert\") pod \"controller-f8648f98b-l6mbj\" (UID: \"33df22a6-6a0f-445c-8a77-ad9cfb09d3d4\") " pod="metallb-system/controller-f8648f98b-l6mbj" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.298010 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5b94\" (UniqueName: \"kubernetes.io/projected/33df22a6-6a0f-445c-8a77-ad9cfb09d3d4-kube-api-access-k5b94\") pod \"controller-f8648f98b-l6mbj\" (UID: \"33df22a6-6a0f-445c-8a77-ad9cfb09d3d4\") " pod="metallb-system/controller-f8648f98b-l6mbj" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.298051 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/008cad91-d45f-4942-9e82-239acf3fb8ed-memberlist\") pod \"speaker-ql5zn\" (UID: \"008cad91-d45f-4942-9e82-239acf3fb8ed\") " pod="metallb-system/speaker-ql5zn" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.298083 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/7cde0cd2-0d4c-411e-b857-8488be2e2f0f-frr-conf\") pod \"frr-k8s-9psn7\" (UID: \"7cde0cd2-0d4c-411e-b857-8488be2e2f0f\") " pod="metallb-system/frr-k8s-9psn7" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.300071 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-l6mbj"] Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.400074 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/7cde0cd2-0d4c-411e-b857-8488be2e2f0f-frr-conf\") pod \"frr-k8s-9psn7\" (UID: \"7cde0cd2-0d4c-411e-b857-8488be2e2f0f\") " pod="metallb-system/frr-k8s-9psn7" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.400180 4954 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/008cad91-d45f-4942-9e82-239acf3fb8ed-metallb-excludel2\") pod \"speaker-ql5zn\" (UID: \"008cad91-d45f-4942-9e82-239acf3fb8ed\") " pod="metallb-system/speaker-ql5zn" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.400205 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/008cad91-d45f-4942-9e82-239acf3fb8ed-metrics-certs\") pod \"speaker-ql5zn\" (UID: \"008cad91-d45f-4942-9e82-239acf3fb8ed\") " pod="metallb-system/speaker-ql5zn" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.400256 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlthc\" (UniqueName: \"kubernetes.io/projected/008cad91-d45f-4942-9e82-239acf3fb8ed-kube-api-access-tlthc\") pod \"speaker-ql5zn\" (UID: \"008cad91-d45f-4942-9e82-239acf3fb8ed\") " pod="metallb-system/speaker-ql5zn" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.400279 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/7cde0cd2-0d4c-411e-b857-8488be2e2f0f-reloader\") pod \"frr-k8s-9psn7\" (UID: \"7cde0cd2-0d4c-411e-b857-8488be2e2f0f\") " pod="metallb-system/frr-k8s-9psn7" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.400297 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjqkm\" (UniqueName: \"kubernetes.io/projected/6792e473-15c3-405b-8c32-007e421b40c6-kube-api-access-mjqkm\") pod \"frr-k8s-webhook-server-7fcb986d4-f6v24\" (UID: \"6792e473-15c3-405b-8c32-007e421b40c6\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-f6v24" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.400320 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6792e473-15c3-405b-8c32-007e421b40c6-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-f6v24\" (UID: \"6792e473-15c3-405b-8c32-007e421b40c6\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-f6v24" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.400377 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7cde0cd2-0d4c-411e-b857-8488be2e2f0f-metrics-certs\") pod \"frr-k8s-9psn7\" (UID: \"7cde0cd2-0d4c-411e-b857-8488be2e2f0f\") " pod="metallb-system/frr-k8s-9psn7" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.400399 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/7cde0cd2-0d4c-411e-b857-8488be2e2f0f-frr-sockets\") pod \"frr-k8s-9psn7\" (UID: \"7cde0cd2-0d4c-411e-b857-8488be2e2f0f\") " pod="metallb-system/frr-k8s-9psn7" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.400427 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/33df22a6-6a0f-445c-8a77-ad9cfb09d3d4-metrics-certs\") pod \"controller-f8648f98b-l6mbj\" (UID: \"33df22a6-6a0f-445c-8a77-ad9cfb09d3d4\") " pod="metallb-system/controller-f8648f98b-l6mbj" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.400449 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: 
\"kubernetes.io/configmap/7cde0cd2-0d4c-411e-b857-8488be2e2f0f-frr-startup\") pod \"frr-k8s-9psn7\" (UID: \"7cde0cd2-0d4c-411e-b857-8488be2e2f0f\") " pod="metallb-system/frr-k8s-9psn7" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.400471 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fth2r\" (UniqueName: \"kubernetes.io/projected/7cde0cd2-0d4c-411e-b857-8488be2e2f0f-kube-api-access-fth2r\") pod \"frr-k8s-9psn7\" (UID: \"7cde0cd2-0d4c-411e-b857-8488be2e2f0f\") " pod="metallb-system/frr-k8s-9psn7" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.400491 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/7cde0cd2-0d4c-411e-b857-8488be2e2f0f-metrics\") pod \"frr-k8s-9psn7\" (UID: \"7cde0cd2-0d4c-411e-b857-8488be2e2f0f\") " pod="metallb-system/frr-k8s-9psn7" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.400514 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33df22a6-6a0f-445c-8a77-ad9cfb09d3d4-cert\") pod \"controller-f8648f98b-l6mbj\" (UID: \"33df22a6-6a0f-445c-8a77-ad9cfb09d3d4\") " pod="metallb-system/controller-f8648f98b-l6mbj" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.400537 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5b94\" (UniqueName: \"kubernetes.io/projected/33df22a6-6a0f-445c-8a77-ad9cfb09d3d4-kube-api-access-k5b94\") pod \"controller-f8648f98b-l6mbj\" (UID: \"33df22a6-6a0f-445c-8a77-ad9cfb09d3d4\") " pod="metallb-system/controller-f8648f98b-l6mbj" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.400561 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/008cad91-d45f-4942-9e82-239acf3fb8ed-memberlist\") pod \"speaker-ql5zn\" (UID: \"008cad91-d45f-4942-9e82-239acf3fb8ed\") " pod="metallb-system/speaker-ql5zn" Nov 27 16:54:14 crc kubenswrapper[4954]: E1127 16:54:14.400769 4954 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Nov 27 16:54:14 crc kubenswrapper[4954]: E1127 16:54:14.400775 4954 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.400961 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/7cde0cd2-0d4c-411e-b857-8488be2e2f0f-frr-sockets\") pod \"frr-k8s-9psn7\" (UID: \"7cde0cd2-0d4c-411e-b857-8488be2e2f0f\") " pod="metallb-system/frr-k8s-9psn7" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.401225 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/008cad91-d45f-4942-9e82-239acf3fb8ed-metallb-excludel2\") pod \"speaker-ql5zn\" (UID: \"008cad91-d45f-4942-9e82-239acf3fb8ed\") " pod="metallb-system/speaker-ql5zn" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.401306 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/7cde0cd2-0d4c-411e-b857-8488be2e2f0f-metrics\") pod \"frr-k8s-9psn7\" (UID: \"7cde0cd2-0d4c-411e-b857-8488be2e2f0f\") " pod="metallb-system/frr-k8s-9psn7" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.401699 4954 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/7cde0cd2-0d4c-411e-b857-8488be2e2f0f-reloader\") pod \"frr-k8s-9psn7\" (UID: \"7cde0cd2-0d4c-411e-b857-8488be2e2f0f\") " pod="metallb-system/frr-k8s-9psn7" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.401742 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/7cde0cd2-0d4c-411e-b857-8488be2e2f0f-frr-startup\") pod \"frr-k8s-9psn7\" (UID: \"7cde0cd2-0d4c-411e-b857-8488be2e2f0f\") " pod="metallb-system/frr-k8s-9psn7" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.402133 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/7cde0cd2-0d4c-411e-b857-8488be2e2f0f-frr-conf\") pod \"frr-k8s-9psn7\" (UID: \"7cde0cd2-0d4c-411e-b857-8488be2e2f0f\") " pod="metallb-system/frr-k8s-9psn7" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.403294 4954 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Nov 27 16:54:14 crc kubenswrapper[4954]: E1127 16:54:14.403508 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/008cad91-d45f-4942-9e82-239acf3fb8ed-memberlist podName:008cad91-d45f-4942-9e82-239acf3fb8ed nodeName:}" failed. No retries permitted until 2025-11-27 16:54:14.900809741 +0000 UTC m=+966.918250041 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/008cad91-d45f-4942-9e82-239acf3fb8ed-memberlist") pod "speaker-ql5zn" (UID: "008cad91-d45f-4942-9e82-239acf3fb8ed") : secret "metallb-memberlist" not found Nov 27 16:54:14 crc kubenswrapper[4954]: E1127 16:54:14.403864 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33df22a6-6a0f-445c-8a77-ad9cfb09d3d4-metrics-certs podName:33df22a6-6a0f-445c-8a77-ad9cfb09d3d4 nodeName:}" failed. No retries permitted until 2025-11-27 16:54:14.903851426 +0000 UTC m=+966.921291736 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/33df22a6-6a0f-445c-8a77-ad9cfb09d3d4-metrics-certs") pod "controller-f8648f98b-l6mbj" (UID: "33df22a6-6a0f-445c-8a77-ad9cfb09d3d4") : secret "controller-certs-secret" not found Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.407098 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7cde0cd2-0d4c-411e-b857-8488be2e2f0f-metrics-certs\") pod \"frr-k8s-9psn7\" (UID: \"7cde0cd2-0d4c-411e-b857-8488be2e2f0f\") " pod="metallb-system/frr-k8s-9psn7" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.414235 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6792e473-15c3-405b-8c32-007e421b40c6-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-f6v24\" (UID: \"6792e473-15c3-405b-8c32-007e421b40c6\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-f6v24" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.414527 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33df22a6-6a0f-445c-8a77-ad9cfb09d3d4-cert\") pod \"controller-f8648f98b-l6mbj\" (UID: \"33df22a6-6a0f-445c-8a77-ad9cfb09d3d4\") " pod="metallb-system/controller-f8648f98b-l6mbj" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.422166 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/008cad91-d45f-4942-9e82-239acf3fb8ed-metrics-certs\") pod \"speaker-ql5zn\" (UID: \"008cad91-d45f-4942-9e82-239acf3fb8ed\") " pod="metallb-system/speaker-ql5zn" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.425987 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjqkm\" (UniqueName: \"kubernetes.io/projected/6792e473-15c3-405b-8c32-007e421b40c6-kube-api-access-mjqkm\") pod \"frr-k8s-webhook-server-7fcb986d4-f6v24\" (UID: \"6792e473-15c3-405b-8c32-007e421b40c6\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-f6v24" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.426662 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fth2r\" (UniqueName: \"kubernetes.io/projected/7cde0cd2-0d4c-411e-b857-8488be2e2f0f-kube-api-access-fth2r\") pod \"frr-k8s-9psn7\" (UID: \"7cde0cd2-0d4c-411e-b857-8488be2e2f0f\") " pod="metallb-system/frr-k8s-9psn7" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.426805 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5b94\" (UniqueName: \"kubernetes.io/projected/33df22a6-6a0f-445c-8a77-ad9cfb09d3d4-kube-api-access-k5b94\") pod \"controller-f8648f98b-l6mbj\" (UID: \"33df22a6-6a0f-445c-8a77-ad9cfb09d3d4\") " pod="metallb-system/controller-f8648f98b-l6mbj" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.427952 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlthc\" (UniqueName: \"kubernetes.io/projected/008cad91-d45f-4942-9e82-239acf3fb8ed-kube-api-access-tlthc\") pod \"speaker-ql5zn\" (UID: \"008cad91-d45f-4942-9e82-239acf3fb8ed\") " pod="metallb-system/speaker-ql5zn" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.476326 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-f6v24" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.495729 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-9psn7" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.897481 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-f6v24"] Nov 27 16:54:14 crc kubenswrapper[4954]: W1127 16:54:14.904262 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6792e473_15c3_405b_8c32_007e421b40c6.slice/crio-e75ac40f3226b3632c978cc0d1a29c581d1fd78dbbb61d67cd0aa380abfe9558 WatchSource:0}: Error finding container e75ac40f3226b3632c978cc0d1a29c581d1fd78dbbb61d67cd0aa380abfe9558: Status 404 returned error can't find the container with id e75ac40f3226b3632c978cc0d1a29c581d1fd78dbbb61d67cd0aa380abfe9558 Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.911277 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/008cad91-d45f-4942-9e82-239acf3fb8ed-memberlist\") pod \"speaker-ql5zn\" (UID: \"008cad91-d45f-4942-9e82-239acf3fb8ed\") " pod="metallb-system/speaker-ql5zn" Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.911446 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/33df22a6-6a0f-445c-8a77-ad9cfb09d3d4-metrics-certs\") pod \"controller-f8648f98b-l6mbj\" (UID: \"33df22a6-6a0f-445c-8a77-ad9cfb09d3d4\") " pod="metallb-system/controller-f8648f98b-l6mbj" Nov 27 16:54:14 crc kubenswrapper[4954]: E1127 16:54:14.911456 4954 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Nov 27 16:54:14 crc kubenswrapper[4954]: E1127 16:54:14.911672 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/008cad91-d45f-4942-9e82-239acf3fb8ed-memberlist podName:008cad91-d45f-4942-9e82-239acf3fb8ed nodeName:}" failed. No retries permitted until 2025-11-27 16:54:15.911645427 +0000 UTC m=+967.929085767 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/008cad91-d45f-4942-9e82-239acf3fb8ed-memberlist") pod "speaker-ql5zn" (UID: "008cad91-d45f-4942-9e82-239acf3fb8ed") : secret "metallb-memberlist" not found Nov 27 16:54:14 crc kubenswrapper[4954]: I1127 16:54:14.917067 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/33df22a6-6a0f-445c-8a77-ad9cfb09d3d4-metrics-certs\") pod \"controller-f8648f98b-l6mbj\" (UID: \"33df22a6-6a0f-445c-8a77-ad9cfb09d3d4\") " pod="metallb-system/controller-f8648f98b-l6mbj" Nov 27 16:54:15 crc kubenswrapper[4954]: I1127 16:54:15.209142 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-l6mbj" Nov 27 16:54:15 crc kubenswrapper[4954]: I1127 16:54:15.433878 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-f6v24" event={"ID":"6792e473-15c3-405b-8c32-007e421b40c6","Type":"ContainerStarted","Data":"e75ac40f3226b3632c978cc0d1a29c581d1fd78dbbb61d67cd0aa380abfe9558"} Nov 27 16:54:15 crc kubenswrapper[4954]: I1127 16:54:15.435345 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9psn7" event={"ID":"7cde0cd2-0d4c-411e-b857-8488be2e2f0f","Type":"ContainerStarted","Data":"eeede0ee188860cb879f30584619dcf0fcd165eaf22e50db2f37c4ee8ddeb3bb"} Nov 27 16:54:15 crc kubenswrapper[4954]: I1127 16:54:15.479016 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-l6mbj"] Nov 27 16:54:15 crc kubenswrapper[4954]: W1127 16:54:15.483082 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33df22a6_6a0f_445c_8a77_ad9cfb09d3d4.slice/crio-924634306fb770c4b3e12a8ddefd611f7d94764a12140e8e0c47744d6a82ac60 WatchSource:0}: Error finding container 924634306fb770c4b3e12a8ddefd611f7d94764a12140e8e0c47744d6a82ac60: Status 404 returned error can't find the container with id 924634306fb770c4b3e12a8ddefd611f7d94764a12140e8e0c47744d6a82ac60 Nov 27 16:54:15 crc kubenswrapper[4954]: I1127 16:54:15.927755 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/008cad91-d45f-4942-9e82-239acf3fb8ed-memberlist\") pod \"speaker-ql5zn\" (UID: \"008cad91-d45f-4942-9e82-239acf3fb8ed\") " pod="metallb-system/speaker-ql5zn" Nov 27 16:54:15 crc kubenswrapper[4954]: I1127 16:54:15.933888 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/008cad91-d45f-4942-9e82-239acf3fb8ed-memberlist\") pod \"speaker-ql5zn\" (UID: \"008cad91-d45f-4942-9e82-239acf3fb8ed\") " pod="metallb-system/speaker-ql5zn" Nov 27 16:54:16 crc kubenswrapper[4954]: I1127 16:54:16.085518 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-ql5zn" Nov 27 16:54:16 crc kubenswrapper[4954]: W1127 16:54:16.105567 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod008cad91_d45f_4942_9e82_239acf3fb8ed.slice/crio-b732eb8ee22a8fcda78d7235b6ad1ce221addf2018171f0aead9eeb3b4a24d75 WatchSource:0}: Error finding container b732eb8ee22a8fcda78d7235b6ad1ce221addf2018171f0aead9eeb3b4a24d75: Status 404 returned error can't find the container with id b732eb8ee22a8fcda78d7235b6ad1ce221addf2018171f0aead9eeb3b4a24d75 Nov 27 16:54:16 crc kubenswrapper[4954]: I1127 16:54:16.443018 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-ql5zn" event={"ID":"008cad91-d45f-4942-9e82-239acf3fb8ed","Type":"ContainerStarted","Data":"2814d092b3849a96f8014fbe5eef757c60424405ebc314ede536ec516c1ef26f"} Nov 27 16:54:16 crc kubenswrapper[4954]: I1127 16:54:16.443958 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-ql5zn" event={"ID":"008cad91-d45f-4942-9e82-239acf3fb8ed","Type":"ContainerStarted","Data":"b732eb8ee22a8fcda78d7235b6ad1ce221addf2018171f0aead9eeb3b4a24d75"} Nov 27 16:54:16 crc kubenswrapper[4954]: I1127 16:54:16.445247 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-l6mbj" event={"ID":"33df22a6-6a0f-445c-8a77-ad9cfb09d3d4","Type":"ContainerStarted","Data":"ee790dddf8edf438b20c89aac755e6852540e2245df6e4d56070a5c6333910ef"} Nov 27 16:54:16 crc kubenswrapper[4954]: I1127 16:54:16.445370 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-l6mbj" event={"ID":"33df22a6-6a0f-445c-8a77-ad9cfb09d3d4","Type":"ContainerStarted","Data":"5d758665f41a1a6b6105f5250ec3ce961db477fc876947d3cfe742d6a316fbd3"} Nov 27 16:54:16 crc kubenswrapper[4954]: I1127 16:54:16.445433 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-l6mbj" event={"ID":"33df22a6-6a0f-445c-8a77-ad9cfb09d3d4","Type":"ContainerStarted","Data":"924634306fb770c4b3e12a8ddefd611f7d94764a12140e8e0c47744d6a82ac60"} Nov 27 16:54:16 crc kubenswrapper[4954]: I1127 16:54:16.446735 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-l6mbj" Nov 27 16:54:17 crc kubenswrapper[4954]: I1127 16:54:17.457883 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-ql5zn" event={"ID":"008cad91-d45f-4942-9e82-239acf3fb8ed","Type":"ContainerStarted","Data":"d35158a6f4e6bc3b7341fc93edd62a79ed1e7324ad4f39a44d0d063f8e9d38e7"} Nov 27 16:54:17 crc kubenswrapper[4954]: I1127 16:54:17.482742 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-l6mbj" podStartSLOduration=3.48272234 podStartE2EDuration="3.48272234s" podCreationTimestamp="2025-11-27 16:54:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:54:16.466543461 +0000 UTC m=+968.483983761" watchObservedRunningTime="2025-11-27 16:54:17.48272234 +0000 UTC m=+969.500162640" Nov 27 16:54:17 crc kubenswrapper[4954]: I1127 16:54:17.484199 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-ql5zn" podStartSLOduration=3.484191436 podStartE2EDuration="3.484191436s" podCreationTimestamp="2025-11-27 16:54:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:54:17.480752401 +0000 UTC m=+969.498192701" watchObservedRunningTime="2025-11-27 16:54:17.484191436 +0000 UTC m=+969.501631736" Nov 27 16:54:18 crc kubenswrapper[4954]: I1127 16:54:18.464223 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-ql5zn" Nov 27 16:54:23 crc kubenswrapper[4954]: I1127 16:54:23.497691 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-f6v24" event={"ID":"6792e473-15c3-405b-8c32-007e421b40c6","Type":"ContainerStarted","Data":"60072634db7761c5dfca4a513de3ac38a6dd01343bfb60fe20692a1e02cac86f"} Nov 27 16:54:23 crc kubenswrapper[4954]: I1127 16:54:23.498190 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-f6v24" Nov 27 16:54:23 crc kubenswrapper[4954]: I1127 16:54:23.499367 4954 generic.go:334] "Generic (PLEG): container finished" podID="7cde0cd2-0d4c-411e-b857-8488be2e2f0f" containerID="b1cd16b2837eb6b8a6b3a27185ca5d32f42c3e142925760056b335110ba020e6" exitCode=0 Nov 27 16:54:23 crc kubenswrapper[4954]: I1127 16:54:23.499409 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9psn7" event={"ID":"7cde0cd2-0d4c-411e-b857-8488be2e2f0f","Type":"ContainerDied","Data":"b1cd16b2837eb6b8a6b3a27185ca5d32f42c3e142925760056b335110ba020e6"} Nov 27 16:54:23 crc kubenswrapper[4954]: I1127 16:54:23.529923 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-f6v24" podStartSLOduration=2.026670474 podStartE2EDuration="9.529898433s" podCreationTimestamp="2025-11-27 16:54:14 +0000 UTC" firstStartedPulling="2025-11-27 16:54:14.907558478 +0000 UTC m=+966.924998788" lastFinishedPulling="2025-11-27 16:54:22.410786437 +0000 UTC m=+974.428226747" observedRunningTime="2025-11-27 16:54:23.523556779 +0000 UTC m=+975.540997109" watchObservedRunningTime="2025-11-27 16:54:23.529898433 +0000 UTC m=+975.547338733" Nov 27 16:54:23 crc kubenswrapper[4954]: I1127 16:54:23.687535 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:54:23 crc kubenswrapper[4954]: I1127 16:54:23.688211 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 16:54:24 crc kubenswrapper[4954]: I1127 16:54:24.513082 4954 generic.go:334] "Generic (PLEG): container finished" podID="7cde0cd2-0d4c-411e-b857-8488be2e2f0f" containerID="9e2ef7ec9fac62d6265eaa31de440c490242e856dc09a72a861f22e215891309" exitCode=0 Nov 27 16:54:24 crc kubenswrapper[4954]: I1127 16:54:24.513315 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9psn7" event={"ID":"7cde0cd2-0d4c-411e-b857-8488be2e2f0f","Type":"ContainerDied","Data":"9e2ef7ec9fac62d6265eaa31de440c490242e856dc09a72a861f22e215891309"} Nov 27 16:54:25 crc kubenswrapper[4954]: I1127 16:54:25.214688 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/controller-f8648f98b-l6mbj" Nov 27 16:54:25 crc kubenswrapper[4954]: I1127 16:54:25.525171 4954 generic.go:334] "Generic (PLEG): container finished" podID="7cde0cd2-0d4c-411e-b857-8488be2e2f0f" containerID="f3ba2e7a1918ab897dca1c0f2d3d0fe38982bec3312534125e155f9a2d816adf" exitCode=0 Nov 27 16:54:25 crc kubenswrapper[4954]: I1127 16:54:25.525224 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9psn7" event={"ID":"7cde0cd2-0d4c-411e-b857-8488be2e2f0f","Type":"ContainerDied","Data":"f3ba2e7a1918ab897dca1c0f2d3d0fe38982bec3312534125e155f9a2d816adf"} Nov 27 16:54:26 crc kubenswrapper[4954]: I1127 16:54:26.089880 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-ql5zn" Nov 27 16:54:26 crc kubenswrapper[4954]: I1127 16:54:26.539512 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9psn7" event={"ID":"7cde0cd2-0d4c-411e-b857-8488be2e2f0f","Type":"ContainerStarted","Data":"9e19a2157b2f8f58ff1761f92e386c6257f1e27499fc2acd575b32340e2d6eb5"} Nov 27 16:54:26 crc kubenswrapper[4954]: I1127 16:54:26.540048 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9psn7" event={"ID":"7cde0cd2-0d4c-411e-b857-8488be2e2f0f","Type":"ContainerStarted","Data":"a29386c86db15e10dfc2548185e720c4c637a23dc9de370f925ee498137f5e29"} Nov 27 16:54:26 crc kubenswrapper[4954]: I1127 16:54:26.540067 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9psn7" event={"ID":"7cde0cd2-0d4c-411e-b857-8488be2e2f0f","Type":"ContainerStarted","Data":"52b27c4b42a1025fdced87f9f9c6509abcd1a2722e3c0c28ab7793df2af3740e"} Nov 27 16:54:26 crc kubenswrapper[4954]: I1127 16:54:26.540087 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9psn7" event={"ID":"7cde0cd2-0d4c-411e-b857-8488be2e2f0f","Type":"ContainerStarted","Data":"d041eba4c6edee584c47f51b1eeb9f3efd80d49409e192c2e75fdf802d92691d"} Nov 27 16:54:27 crc kubenswrapper[4954]: I1127 16:54:27.554078 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9psn7" event={"ID":"7cde0cd2-0d4c-411e-b857-8488be2e2f0f","Type":"ContainerStarted","Data":"7527d035c9f33e83bdd34d9cea81b30a0703a605a380ddd4cdce790bacc64d8a"} Nov 27 16:54:27 crc kubenswrapper[4954]: I1127 16:54:27.554170 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9psn7" event={"ID":"7cde0cd2-0d4c-411e-b857-8488be2e2f0f","Type":"ContainerStarted","Data":"50f4023c5471c595a2b3a681a235e6d686a0e0fb63bf0d549eab5e2fc506017e"} Nov 27 16:54:27 crc kubenswrapper[4954]: I1127 16:54:27.554379 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-9psn7" Nov 27 16:54:27 crc kubenswrapper[4954]: I1127 16:54:27.593495 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-9psn7" podStartSLOduration=5.82565498 podStartE2EDuration="13.593463935s" podCreationTimestamp="2025-11-27 16:54:14 +0000 UTC" firstStartedPulling="2025-11-27 16:54:14.660790396 +0000 UTC m=+966.678230696" lastFinishedPulling="2025-11-27 16:54:22.428599351 +0000 UTC m=+974.446039651" observedRunningTime="2025-11-27 16:54:27.589125409 +0000 UTC m=+979.606565709" watchObservedRunningTime="2025-11-27 16:54:27.593463935 +0000 UTC m=+979.610904285" Nov 27 16:54:29 crc kubenswrapper[4954]: I1127 16:54:29.296789 4954 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/openstack-operator-index-4ntf4"] Nov 27 16:54:29 crc kubenswrapper[4954]: I1127 16:54:29.298386 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-4ntf4" Nov 27 16:54:29 crc kubenswrapper[4954]: I1127 16:54:29.305626 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Nov 27 16:54:29 crc kubenswrapper[4954]: I1127 16:54:29.307010 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-6nq5g" Nov 27 16:54:29 crc kubenswrapper[4954]: I1127 16:54:29.316737 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Nov 27 16:54:29 crc kubenswrapper[4954]: I1127 16:54:29.373932 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-4ntf4"] Nov 27 16:54:29 crc kubenswrapper[4954]: I1127 16:54:29.459845 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9czd\" (UniqueName: \"kubernetes.io/projected/47edfd3e-ca63-45b6-9217-e889ef8bc857-kube-api-access-b9czd\") pod \"openstack-operator-index-4ntf4\" (UID: \"47edfd3e-ca63-45b6-9217-e889ef8bc857\") " pod="openstack-operators/openstack-operator-index-4ntf4" Nov 27 16:54:29 crc kubenswrapper[4954]: I1127 16:54:29.496295 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-9psn7" Nov 27 16:54:29 crc kubenswrapper[4954]: I1127 16:54:29.534182 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-9psn7" Nov 27 16:54:29 crc kubenswrapper[4954]: I1127 16:54:29.561196 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9czd\" (UniqueName: \"kubernetes.io/projected/47edfd3e-ca63-45b6-9217-e889ef8bc857-kube-api-access-b9czd\") pod \"openstack-operator-index-4ntf4\" (UID: \"47edfd3e-ca63-45b6-9217-e889ef8bc857\") " pod="openstack-operators/openstack-operator-index-4ntf4" Nov 27 16:54:29 crc kubenswrapper[4954]: I1127 16:54:29.591451 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9czd\" (UniqueName: \"kubernetes.io/projected/47edfd3e-ca63-45b6-9217-e889ef8bc857-kube-api-access-b9czd\") pod \"openstack-operator-index-4ntf4\" (UID: \"47edfd3e-ca63-45b6-9217-e889ef8bc857\") " pod="openstack-operators/openstack-operator-index-4ntf4" Nov 27 16:54:29 crc kubenswrapper[4954]: I1127 16:54:29.619044 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-4ntf4" Nov 27 16:54:30 crc kubenswrapper[4954]: I1127 16:54:30.042988 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-4ntf4"] Nov 27 16:54:30 crc kubenswrapper[4954]: I1127 16:54:30.576070 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-4ntf4" event={"ID":"47edfd3e-ca63-45b6-9217-e889ef8bc857","Type":"ContainerStarted","Data":"08ed1b2f870c4f55566537cb023246c5eaa9096fc569949d04ca12dd865d550d"} Nov 27 16:54:32 crc kubenswrapper[4954]: I1127 16:54:32.659243 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-4ntf4"] Nov 27 16:54:33 crc kubenswrapper[4954]: I1127 16:54:33.259625 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-xhxgn"] Nov 27 16:54:33 crc kubenswrapper[4954]: I1127 16:54:33.260841 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-xhxgn" Nov 27 16:54:33 crc kubenswrapper[4954]: I1127 16:54:33.275100 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-xhxgn"] Nov 27 16:54:33 crc kubenswrapper[4954]: I1127 16:54:33.334032 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5jz8\" (UniqueName: \"kubernetes.io/projected/e300d0a8-a678-4065-bbcd-a886791e9e1a-kube-api-access-z5jz8\") pod \"openstack-operator-index-xhxgn\" (UID: \"e300d0a8-a678-4065-bbcd-a886791e9e1a\") " pod="openstack-operators/openstack-operator-index-xhxgn" Nov 27 16:54:33 crc kubenswrapper[4954]: I1127 16:54:33.434984 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5jz8\" (UniqueName: \"kubernetes.io/projected/e300d0a8-a678-4065-bbcd-a886791e9e1a-kube-api-access-z5jz8\") pod \"openstack-operator-index-xhxgn\" (UID: \"e300d0a8-a678-4065-bbcd-a886791e9e1a\") " pod="openstack-operators/openstack-operator-index-xhxgn" Nov 27 16:54:33 crc kubenswrapper[4954]: I1127 16:54:33.463753 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5jz8\" (UniqueName: \"kubernetes.io/projected/e300d0a8-a678-4065-bbcd-a886791e9e1a-kube-api-access-z5jz8\") pod \"openstack-operator-index-xhxgn\" (UID: \"e300d0a8-a678-4065-bbcd-a886791e9e1a\") " pod="openstack-operators/openstack-operator-index-xhxgn" Nov 27 16:54:33 crc kubenswrapper[4954]: I1127 16:54:33.579553 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-xhxgn"
Nov 27 16:54:33 crc kubenswrapper[4954]: I1127 16:54:33.597345 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-4ntf4" event={"ID":"47edfd3e-ca63-45b6-9217-e889ef8bc857","Type":"ContainerStarted","Data":"e68d791a310be28368cb30e5b173561c8e7c418678253e1ff40584e7f64155f2"}
Nov 27 16:54:33 crc kubenswrapper[4954]: I1127 16:54:33.597459 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-4ntf4" podUID="47edfd3e-ca63-45b6-9217-e889ef8bc857" containerName="registry-server" containerID="cri-o://e68d791a310be28368cb30e5b173561c8e7c418678253e1ff40584e7f64155f2" gracePeriod=2
Nov 27 16:54:33 crc kubenswrapper[4954]: I1127 16:54:33.632228 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-4ntf4" podStartSLOduration=2.039132627 podStartE2EDuration="4.632204953s" podCreationTimestamp="2025-11-27 16:54:29 +0000 UTC" firstStartedPulling="2025-11-27 16:54:30.060757472 +0000 UTC m=+982.078197772" lastFinishedPulling="2025-11-27 16:54:32.653829798 +0000 UTC m=+984.671270098" observedRunningTime="2025-11-27 16:54:33.629013095 +0000 UTC m=+985.646453455" watchObservedRunningTime="2025-11-27 16:54:33.632204953 +0000 UTC m=+985.649645263"
Nov 27 16:54:33 crc kubenswrapper[4954]: I1127 16:54:33.872998 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-xhxgn"]
Nov 27 16:54:34 crc kubenswrapper[4954]: I1127 16:54:34.073427 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-4ntf4"
Nov 27 16:54:34 crc kubenswrapper[4954]: I1127 16:54:34.160778 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9czd\" (UniqueName: \"kubernetes.io/projected/47edfd3e-ca63-45b6-9217-e889ef8bc857-kube-api-access-b9czd\") pod \"47edfd3e-ca63-45b6-9217-e889ef8bc857\" (UID: \"47edfd3e-ca63-45b6-9217-e889ef8bc857\") "
Nov 27 16:54:34 crc kubenswrapper[4954]: I1127 16:54:34.168907 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47edfd3e-ca63-45b6-9217-e889ef8bc857-kube-api-access-b9czd" (OuterVolumeSpecName: "kube-api-access-b9czd") pod "47edfd3e-ca63-45b6-9217-e889ef8bc857" (UID: "47edfd3e-ca63-45b6-9217-e889ef8bc857"). InnerVolumeSpecName "kube-api-access-b9czd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 27 16:54:34 crc kubenswrapper[4954]: I1127 16:54:34.262524 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9czd\" (UniqueName: \"kubernetes.io/projected/47edfd3e-ca63-45b6-9217-e889ef8bc857-kube-api-access-b9czd\") on node \"crc\" DevicePath \"\""
Nov 27 16:54:34 crc kubenswrapper[4954]: I1127 16:54:34.481314 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-f6v24"
Nov 27 16:54:34 crc kubenswrapper[4954]: I1127 16:54:34.603519 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-xhxgn" event={"ID":"e300d0a8-a678-4065-bbcd-a886791e9e1a","Type":"ContainerStarted","Data":"e0e7b8306853a4450ecc407b43201123d4862d7f89b54b5b159c3653fe29934f"}
Nov 27 16:54:34 crc kubenswrapper[4954]: I1127 16:54:34.603594 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-xhxgn" event={"ID":"e300d0a8-a678-4065-bbcd-a886791e9e1a","Type":"ContainerStarted","Data":"47b7e9a7e2742b3df42da836f3170889573a146e234bc11d8b5f392f4aaeef45"}
Nov 27 16:54:34 crc kubenswrapper[4954]: I1127 16:54:34.606853 4954 generic.go:334] "Generic (PLEG): container finished" podID="47edfd3e-ca63-45b6-9217-e889ef8bc857" containerID="e68d791a310be28368cb30e5b173561c8e7c418678253e1ff40584e7f64155f2" exitCode=0
Nov 27 16:54:34 crc kubenswrapper[4954]: I1127 16:54:34.606900 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-4ntf4" event={"ID":"47edfd3e-ca63-45b6-9217-e889ef8bc857","Type":"ContainerDied","Data":"e68d791a310be28368cb30e5b173561c8e7c418678253e1ff40584e7f64155f2"}
Nov 27 16:54:34 crc kubenswrapper[4954]: I1127 16:54:34.606927 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-4ntf4" event={"ID":"47edfd3e-ca63-45b6-9217-e889ef8bc857","Type":"ContainerDied","Data":"08ed1b2f870c4f55566537cb023246c5eaa9096fc569949d04ca12dd865d550d"}
Nov 27 16:54:34 crc kubenswrapper[4954]: I1127 16:54:34.606945 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-4ntf4"
Nov 27 16:54:34 crc kubenswrapper[4954]: I1127 16:54:34.606950 4954 scope.go:117] "RemoveContainer" containerID="e68d791a310be28368cb30e5b173561c8e7c418678253e1ff40584e7f64155f2"
Nov 27 16:54:34 crc kubenswrapper[4954]: I1127 16:54:34.622708 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-xhxgn" podStartSLOduration=1.5171477850000001 podStartE2EDuration="1.622676802s" podCreationTimestamp="2025-11-27 16:54:33 +0000 UTC" firstStartedPulling="2025-11-27 16:54:33.926820578 +0000 UTC m=+985.944260878" lastFinishedPulling="2025-11-27 16:54:34.032349595 +0000 UTC m=+986.049789895" observedRunningTime="2025-11-27 16:54:34.621376981 +0000 UTC m=+986.638817291" watchObservedRunningTime="2025-11-27 16:54:34.622676802 +0000 UTC m=+986.640117122"
Nov 27 16:54:34 crc kubenswrapper[4954]: I1127 16:54:34.628728 4954 scope.go:117] "RemoveContainer" containerID="e68d791a310be28368cb30e5b173561c8e7c418678253e1ff40584e7f64155f2"
Nov 27 16:54:34 crc kubenswrapper[4954]: E1127 16:54:34.629428 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e68d791a310be28368cb30e5b173561c8e7c418678253e1ff40584e7f64155f2\": container with ID starting with e68d791a310be28368cb30e5b173561c8e7c418678253e1ff40584e7f64155f2 not found: ID does not exist" containerID="e68d791a310be28368cb30e5b173561c8e7c418678253e1ff40584e7f64155f2"
Nov 27 16:54:34 crc kubenswrapper[4954]: I1127 16:54:34.629479 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e68d791a310be28368cb30e5b173561c8e7c418678253e1ff40584e7f64155f2"} err="failed to get container status \"e68d791a310be28368cb30e5b173561c8e7c418678253e1ff40584e7f64155f2\": rpc error: code = NotFound desc = could not find container \"e68d791a310be28368cb30e5b173561c8e7c418678253e1ff40584e7f64155f2\": container with ID starting with e68d791a310be28368cb30e5b173561c8e7c418678253e1ff40584e7f64155f2 not found: ID does not exist"
Nov 27 16:54:34 crc kubenswrapper[4954]: I1127 16:54:34.647062 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-4ntf4"]
Nov 27 16:54:34 crc kubenswrapper[4954]: I1127 16:54:34.674625 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-4ntf4"]
Nov 27 16:54:36 crc kubenswrapper[4954]: I1127 16:54:36.669956 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47edfd3e-ca63-45b6-9217-e889ef8bc857" path="/var/lib/kubelet/pods/47edfd3e-ca63-45b6-9217-e889ef8bc857/volumes"
Nov 27 16:54:43 crc kubenswrapper[4954]: I1127 16:54:43.580076 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-xhxgn"
Nov 27 16:54:43 crc kubenswrapper[4954]: I1127 16:54:43.580837 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-xhxgn"
Nov 27 16:54:43 crc kubenswrapper[4954]: I1127 16:54:43.613703 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-xhxgn"
Nov 27 16:54:43 crc kubenswrapper[4954]: I1127 16:54:43.713963 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-xhxgn"
Nov 27 16:54:44 crc kubenswrapper[4954]: I1127 16:54:44.499677 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-9psn7"
Nov 27 16:54:50 crc kubenswrapper[4954]: I1127 16:54:50.589241 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/7b3267e84ac43849283543a2f97dd6a38e4585c72e47c859f911726b07krfj8"]
Nov 27 16:54:50 crc kubenswrapper[4954]: E1127 16:54:50.589847 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47edfd3e-ca63-45b6-9217-e889ef8bc857" containerName="registry-server"
Nov 27 16:54:50 crc kubenswrapper[4954]: I1127 16:54:50.589864 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="47edfd3e-ca63-45b6-9217-e889ef8bc857" containerName="registry-server"
Nov 27 16:54:50 crc kubenswrapper[4954]: I1127 16:54:50.589991 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="47edfd3e-ca63-45b6-9217-e889ef8bc857" containerName="registry-server"
Nov 27 16:54:50 crc kubenswrapper[4954]: I1127 16:54:50.590996 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7b3267e84ac43849283543a2f97dd6a38e4585c72e47c859f911726b07krfj8"
Nov 27 16:54:50 crc kubenswrapper[4954]: I1127 16:54:50.594107 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-5k2fd"
Nov 27 16:54:50 crc kubenswrapper[4954]: I1127 16:54:50.608266 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7b3267e84ac43849283543a2f97dd6a38e4585c72e47c859f911726b07krfj8"]
Nov 27 16:54:50 crc kubenswrapper[4954]: I1127 16:54:50.641565 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4f0a14af-754e-4601-aadc-77e1a310c088-util\") pod \"7b3267e84ac43849283543a2f97dd6a38e4585c72e47c859f911726b07krfj8\" (UID: \"4f0a14af-754e-4601-aadc-77e1a310c088\") " pod="openstack-operators/7b3267e84ac43849283543a2f97dd6a38e4585c72e47c859f911726b07krfj8"
Nov 27 16:54:50 crc kubenswrapper[4954]: I1127 16:54:50.641818 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f0a14af-754e-4601-aadc-77e1a310c088-bundle\") pod \"7b3267e84ac43849283543a2f97dd6a38e4585c72e47c859f911726b07krfj8\" (UID: \"4f0a14af-754e-4601-aadc-77e1a310c088\") " pod="openstack-operators/7b3267e84ac43849283543a2f97dd6a38e4585c72e47c859f911726b07krfj8"
Nov 27 16:54:50 crc kubenswrapper[4954]: I1127 16:54:50.641990 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhnll\" (UniqueName: \"kubernetes.io/projected/4f0a14af-754e-4601-aadc-77e1a310c088-kube-api-access-zhnll\") pod \"7b3267e84ac43849283543a2f97dd6a38e4585c72e47c859f911726b07krfj8\" (UID: \"4f0a14af-754e-4601-aadc-77e1a310c088\") " pod="openstack-operators/7b3267e84ac43849283543a2f97dd6a38e4585c72e47c859f911726b07krfj8"
Nov 27 16:54:50 crc kubenswrapper[4954]: I1127 16:54:50.743562 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f0a14af-754e-4601-aadc-77e1a310c088-bundle\") pod \"7b3267e84ac43849283543a2f97dd6a38e4585c72e47c859f911726b07krfj8\" (UID: \"4f0a14af-754e-4601-aadc-77e1a310c088\") " pod="openstack-operators/7b3267e84ac43849283543a2f97dd6a38e4585c72e47c859f911726b07krfj8"
Nov 27 16:54:50 crc kubenswrapper[4954]: I1127 16:54:50.743679 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhnll\" (UniqueName: \"kubernetes.io/projected/4f0a14af-754e-4601-aadc-77e1a310c088-kube-api-access-zhnll\") pod \"7b3267e84ac43849283543a2f97dd6a38e4585c72e47c859f911726b07krfj8\" (UID: \"4f0a14af-754e-4601-aadc-77e1a310c088\") " pod="openstack-operators/7b3267e84ac43849283543a2f97dd6a38e4585c72e47c859f911726b07krfj8"
Nov 27 16:54:50 crc kubenswrapper[4954]: I1127 16:54:50.743741 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4f0a14af-754e-4601-aadc-77e1a310c088-util\") pod \"7b3267e84ac43849283543a2f97dd6a38e4585c72e47c859f911726b07krfj8\" (UID: \"4f0a14af-754e-4601-aadc-77e1a310c088\") " pod="openstack-operators/7b3267e84ac43849283543a2f97dd6a38e4585c72e47c859f911726b07krfj8"
Nov 27 16:54:50 crc kubenswrapper[4954]: I1127 16:54:50.744173 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f0a14af-754e-4601-aadc-77e1a310c088-bundle\") pod \"7b3267e84ac43849283543a2f97dd6a38e4585c72e47c859f911726b07krfj8\" (UID: \"4f0a14af-754e-4601-aadc-77e1a310c088\") " pod="openstack-operators/7b3267e84ac43849283543a2f97dd6a38e4585c72e47c859f911726b07krfj8"
Nov 27 16:54:50 crc kubenswrapper[4954]: I1127 16:54:50.744195 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4f0a14af-754e-4601-aadc-77e1a310c088-util\") pod \"7b3267e84ac43849283543a2f97dd6a38e4585c72e47c859f911726b07krfj8\" (UID: \"4f0a14af-754e-4601-aadc-77e1a310c088\") " pod="openstack-operators/7b3267e84ac43849283543a2f97dd6a38e4585c72e47c859f911726b07krfj8"
Nov 27 16:54:50 crc kubenswrapper[4954]: I1127 16:54:50.772485 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhnll\" (UniqueName: \"kubernetes.io/projected/4f0a14af-754e-4601-aadc-77e1a310c088-kube-api-access-zhnll\") pod \"7b3267e84ac43849283543a2f97dd6a38e4585c72e47c859f911726b07krfj8\" (UID: \"4f0a14af-754e-4601-aadc-77e1a310c088\") " pod="openstack-operators/7b3267e84ac43849283543a2f97dd6a38e4585c72e47c859f911726b07krfj8"
Nov 27 16:54:50 crc kubenswrapper[4954]: I1127 16:54:50.909209 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7b3267e84ac43849283543a2f97dd6a38e4585c72e47c859f911726b07krfj8"
Nov 27 16:54:51 crc kubenswrapper[4954]: I1127 16:54:51.194376 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7b3267e84ac43849283543a2f97dd6a38e4585c72e47c859f911726b07krfj8"]
Nov 27 16:54:51 crc kubenswrapper[4954]: I1127 16:54:51.750101 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7b3267e84ac43849283543a2f97dd6a38e4585c72e47c859f911726b07krfj8" event={"ID":"4f0a14af-754e-4601-aadc-77e1a310c088","Type":"ContainerStarted","Data":"3736c4e4ba49636fe5211904f1a1c475e55ba2e2e6f69def84c0c1465f4e6cb1"}
Nov 27 16:54:52 crc kubenswrapper[4954]: I1127 16:54:52.759690 4954 generic.go:334] "Generic (PLEG): container finished" podID="4f0a14af-754e-4601-aadc-77e1a310c088" containerID="27cce55c2785335d21616c626c69ad7c179c4967c0c10ecaecd8a6a54be93897" exitCode=0
Nov 27 16:54:52 crc kubenswrapper[4954]: I1127 16:54:52.759764 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7b3267e84ac43849283543a2f97dd6a38e4585c72e47c859f911726b07krfj8" event={"ID":"4f0a14af-754e-4601-aadc-77e1a310c088","Type":"ContainerDied","Data":"27cce55c2785335d21616c626c69ad7c179c4967c0c10ecaecd8a6a54be93897"}
Nov 27 16:54:53 crc kubenswrapper[4954]: I1127 16:54:53.687974 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 27 16:54:53 crc kubenswrapper[4954]: I1127 16:54:53.688365 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 27 16:54:53 crc kubenswrapper[4954]: I1127 16:54:53.773008 4954 generic.go:334] "Generic (PLEG): container finished" podID="4f0a14af-754e-4601-aadc-77e1a310c088" containerID="361a6328aeee853d63b9311764030ce4643f8e934ac6f55c3ddf19e547867d97" exitCode=0
Nov 27 16:54:53 crc kubenswrapper[4954]: I1127 16:54:53.773083 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7b3267e84ac43849283543a2f97dd6a38e4585c72e47c859f911726b07krfj8" event={"ID":"4f0a14af-754e-4601-aadc-77e1a310c088","Type":"ContainerDied","Data":"361a6328aeee853d63b9311764030ce4643f8e934ac6f55c3ddf19e547867d97"}
Nov 27 16:54:54 crc kubenswrapper[4954]: I1127 16:54:54.783903 4954 generic.go:334] "Generic (PLEG): container finished" podID="4f0a14af-754e-4601-aadc-77e1a310c088" containerID="08224e466440fd91864f83f555890ac95e22fc9ee5a0f08531231f5e9150114d" exitCode=0
Nov 27 16:54:54 crc kubenswrapper[4954]: I1127 16:54:54.784003 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7b3267e84ac43849283543a2f97dd6a38e4585c72e47c859f911726b07krfj8" event={"ID":"4f0a14af-754e-4601-aadc-77e1a310c088","Type":"ContainerDied","Data":"08224e466440fd91864f83f555890ac95e22fc9ee5a0f08531231f5e9150114d"}
Nov 27 16:54:56 crc kubenswrapper[4954]: I1127 16:54:56.098290 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7b3267e84ac43849283543a2f97dd6a38e4585c72e47c859f911726b07krfj8"
Nov 27 16:54:56 crc kubenswrapper[4954]: I1127 16:54:56.133670 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4f0a14af-754e-4601-aadc-77e1a310c088-util\") pod \"4f0a14af-754e-4601-aadc-77e1a310c088\" (UID: \"4f0a14af-754e-4601-aadc-77e1a310c088\") "
Nov 27 16:54:56 crc kubenswrapper[4954]: I1127 16:54:56.133852 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f0a14af-754e-4601-aadc-77e1a310c088-bundle\") pod \"4f0a14af-754e-4601-aadc-77e1a310c088\" (UID: \"4f0a14af-754e-4601-aadc-77e1a310c088\") "
Nov 27 16:54:56 crc kubenswrapper[4954]: I1127 16:54:56.134304 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhnll\" (UniqueName: \"kubernetes.io/projected/4f0a14af-754e-4601-aadc-77e1a310c088-kube-api-access-zhnll\") pod \"4f0a14af-754e-4601-aadc-77e1a310c088\" (UID: \"4f0a14af-754e-4601-aadc-77e1a310c088\") "
Nov 27 16:54:56 crc kubenswrapper[4954]: I1127 16:54:56.135201 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f0a14af-754e-4601-aadc-77e1a310c088-bundle" (OuterVolumeSpecName: "bundle") pod "4f0a14af-754e-4601-aadc-77e1a310c088" (UID: "4f0a14af-754e-4601-aadc-77e1a310c088"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 27 16:54:56 crc kubenswrapper[4954]: I1127 16:54:56.144826 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f0a14af-754e-4601-aadc-77e1a310c088-kube-api-access-zhnll" (OuterVolumeSpecName: "kube-api-access-zhnll") pod "4f0a14af-754e-4601-aadc-77e1a310c088" (UID: "4f0a14af-754e-4601-aadc-77e1a310c088"). InnerVolumeSpecName "kube-api-access-zhnll". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 27 16:54:56 crc kubenswrapper[4954]: I1127 16:54:56.149331 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f0a14af-754e-4601-aadc-77e1a310c088-util" (OuterVolumeSpecName: "util") pod "4f0a14af-754e-4601-aadc-77e1a310c088" (UID: "4f0a14af-754e-4601-aadc-77e1a310c088"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 27 16:54:56 crc kubenswrapper[4954]: I1127 16:54:56.236694 4954 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f0a14af-754e-4601-aadc-77e1a310c088-bundle\") on node \"crc\" DevicePath \"\""
Nov 27 16:54:56 crc kubenswrapper[4954]: I1127 16:54:56.236736 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhnll\" (UniqueName: \"kubernetes.io/projected/4f0a14af-754e-4601-aadc-77e1a310c088-kube-api-access-zhnll\") on node \"crc\" DevicePath \"\""
Nov 27 16:54:56 crc kubenswrapper[4954]: I1127 16:54:56.236747 4954 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4f0a14af-754e-4601-aadc-77e1a310c088-util\") on node \"crc\" DevicePath \"\""
Nov 27 16:54:56 crc kubenswrapper[4954]: I1127 16:54:56.807827 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7b3267e84ac43849283543a2f97dd6a38e4585c72e47c859f911726b07krfj8" event={"ID":"4f0a14af-754e-4601-aadc-77e1a310c088","Type":"ContainerDied","Data":"3736c4e4ba49636fe5211904f1a1c475e55ba2e2e6f69def84c0c1465f4e6cb1"}
Nov 27 16:54:56 crc kubenswrapper[4954]: I1127 16:54:56.807905 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3736c4e4ba49636fe5211904f1a1c475e55ba2e2e6f69def84c0c1465f4e6cb1"
Nov 27 16:54:56 crc kubenswrapper[4954]: I1127 16:54:56.807955 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7b3267e84ac43849283543a2f97dd6a38e4585c72e47c859f911726b07krfj8"
Nov 27 16:55:02 crc kubenswrapper[4954]: I1127 16:55:02.623841 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-757f5977c4-9sxch"]
Nov 27 16:55:02 crc kubenswrapper[4954]: E1127 16:55:02.624826 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f0a14af-754e-4601-aadc-77e1a310c088" containerName="extract"
Nov 27 16:55:02 crc kubenswrapper[4954]: I1127 16:55:02.624844 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f0a14af-754e-4601-aadc-77e1a310c088" containerName="extract"
Nov 27 16:55:02 crc kubenswrapper[4954]: E1127 16:55:02.624861 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f0a14af-754e-4601-aadc-77e1a310c088" containerName="pull"
Nov 27 16:55:02 crc kubenswrapper[4954]: I1127 16:55:02.624872 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f0a14af-754e-4601-aadc-77e1a310c088" containerName="pull"
Nov 27 16:55:02 crc kubenswrapper[4954]: E1127 16:55:02.624889 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f0a14af-754e-4601-aadc-77e1a310c088" containerName="util"
Nov 27 16:55:02 crc kubenswrapper[4954]: I1127 16:55:02.624899 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f0a14af-754e-4601-aadc-77e1a310c088" containerName="util"
Nov 27 16:55:02 crc kubenswrapper[4954]: I1127 16:55:02.625031 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f0a14af-754e-4601-aadc-77e1a310c088" containerName="extract"
Nov 27 16:55:02 crc kubenswrapper[4954]: I1127 16:55:02.625649 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-757f5977c4-9sxch"
Nov 27 16:55:02 crc kubenswrapper[4954]: I1127 16:55:02.630368 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-lrz8m"
Nov 27 16:55:02 crc kubenswrapper[4954]: I1127 16:55:02.633697 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brt77\" (UniqueName: \"kubernetes.io/projected/67df4fc6-9215-4441-955b-d7d740c5db1e-kube-api-access-brt77\") pod \"openstack-operator-controller-operator-757f5977c4-9sxch\" (UID: \"67df4fc6-9215-4441-955b-d7d740c5db1e\") " pod="openstack-operators/openstack-operator-controller-operator-757f5977c4-9sxch"
Nov 27 16:55:02 crc kubenswrapper[4954]: I1127 16:55:02.673010 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-757f5977c4-9sxch"]
Nov 27 16:55:02 crc kubenswrapper[4954]: I1127 16:55:02.735111 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brt77\" (UniqueName: \"kubernetes.io/projected/67df4fc6-9215-4441-955b-d7d740c5db1e-kube-api-access-brt77\") pod \"openstack-operator-controller-operator-757f5977c4-9sxch\" (UID: \"67df4fc6-9215-4441-955b-d7d740c5db1e\") " pod="openstack-operators/openstack-operator-controller-operator-757f5977c4-9sxch"
Nov 27 16:55:02 crc kubenswrapper[4954]: I1127 16:55:02.768683 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brt77\" (UniqueName: \"kubernetes.io/projected/67df4fc6-9215-4441-955b-d7d740c5db1e-kube-api-access-brt77\") pod \"openstack-operator-controller-operator-757f5977c4-9sxch\" (UID: \"67df4fc6-9215-4441-955b-d7d740c5db1e\") " pod="openstack-operators/openstack-operator-controller-operator-757f5977c4-9sxch"
Nov 27 16:55:02 crc kubenswrapper[4954]: I1127 16:55:02.943563 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-757f5977c4-9sxch"
Nov 27 16:55:03 crc kubenswrapper[4954]: I1127 16:55:03.383472 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-757f5977c4-9sxch"]
Nov 27 16:55:03 crc kubenswrapper[4954]: W1127 16:55:03.391232 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67df4fc6_9215_4441_955b_d7d740c5db1e.slice/crio-6418b4c711af1aa8a8486248fb5d3f9d07c1cf5370e097ba4970a5aafcf7ce1b WatchSource:0}: Error finding container 6418b4c711af1aa8a8486248fb5d3f9d07c1cf5370e097ba4970a5aafcf7ce1b: Status 404 returned error can't find the container with id 6418b4c711af1aa8a8486248fb5d3f9d07c1cf5370e097ba4970a5aafcf7ce1b
Nov 27 16:55:03 crc kubenswrapper[4954]: I1127 16:55:03.869207 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-757f5977c4-9sxch" event={"ID":"67df4fc6-9215-4441-955b-d7d740c5db1e","Type":"ContainerStarted","Data":"6418b4c711af1aa8a8486248fb5d3f9d07c1cf5370e097ba4970a5aafcf7ce1b"}
Nov 27 16:55:07 crc kubenswrapper[4954]: I1127 16:55:07.898209 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-757f5977c4-9sxch" event={"ID":"67df4fc6-9215-4441-955b-d7d740c5db1e","Type":"ContainerStarted","Data":"08fede486f7ab92b74c5f3c463753c21cc0bd6407f0221607456d82a2334f2f9"}
Nov 27 16:55:07 crc kubenswrapper[4954]: I1127 16:55:07.899092 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-757f5977c4-9sxch"
Nov 27 16:55:07 crc kubenswrapper[4954]: I1127 16:55:07.935809 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-757f5977c4-9sxch" podStartSLOduration=2.39913929 podStartE2EDuration="5.935790466s" podCreationTimestamp="2025-11-27 16:55:02 +0000 UTC" firstStartedPulling="2025-11-27 16:55:03.393155563 +0000 UTC m=+1015.410595863" lastFinishedPulling="2025-11-27 16:55:06.929806739 +0000 UTC m=+1018.947247039" observedRunningTime="2025-11-27 16:55:07.930884176 +0000 UTC m=+1019.948324526" watchObservedRunningTime="2025-11-27 16:55:07.935790466 +0000 UTC m=+1019.953230766"
Nov 27 16:55:12 crc kubenswrapper[4954]: I1127 16:55:12.946340 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-757f5977c4-9sxch"
Nov 27 16:55:23 crc kubenswrapper[4954]: I1127 16:55:23.687403 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 27 16:55:23 crc kubenswrapper[4954]: I1127 16:55:23.687987 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 27 16:55:23 crc kubenswrapper[4954]: I1127 16:55:23.688041 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-699qq"
Nov 27 16:55:23 crc kubenswrapper[4954]: I1127 16:55:23.688615 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6a54903c8c633a0f68f9dab4e62025f22307496e9e210ed0a72c63ab1c8cd13b"} pod="openshift-machine-config-operator/machine-config-daemon-699qq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 27 16:55:23 crc kubenswrapper[4954]: I1127 16:55:23.688683 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" containerID="cri-o://6a54903c8c633a0f68f9dab4e62025f22307496e9e210ed0a72c63ab1c8cd13b" gracePeriod=600
Nov 27 16:55:25 crc kubenswrapper[4954]: I1127 16:55:25.044802 4954 generic.go:334] "Generic (PLEG): container finished" podID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerID="6a54903c8c633a0f68f9dab4e62025f22307496e9e210ed0a72c63ab1c8cd13b" exitCode=0
Nov 27 16:55:25 crc kubenswrapper[4954]: I1127 16:55:25.044886 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" event={"ID":"33a80574-7c60-4f19-985b-3ee313cb7bcd","Type":"ContainerDied","Data":"6a54903c8c633a0f68f9dab4e62025f22307496e9e210ed0a72c63ab1c8cd13b"}
Nov 27 16:55:25 crc kubenswrapper[4954]: I1127 16:55:25.045360 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" event={"ID":"33a80574-7c60-4f19-985b-3ee313cb7bcd","Type":"ContainerStarted","Data":"9612382de1b535d3c643f2ac5d6cc1b599dc89b245b1720c9d36c1ba8e2a8513"}
Nov 27 16:55:25 crc kubenswrapper[4954]: I1127 16:55:25.045395 4954 scope.go:117] "RemoveContainer" containerID="f253421b54ffaa5b8245af0010b5935a685f03c65cd9227baccbf0b03f627cdd"
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.604800 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b64f4fb85-nz28b"]
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.607478 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-nz28b"
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.615660 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-7lw9l"
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.625329 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6b7f75547b-4rg5t"]
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.626710 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-4rg5t"
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.630226 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-dbqpk"
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.631422 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b64f4fb85-nz28b"]
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.637520 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6b7f75547b-4rg5t"]
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.658868 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-955677c94-dzjch"]
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.659931 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-955677c94-dzjch"
Nov 27 16:55:50 crc kubenswrapper[4954]: W1127 16:55:50.662330 4954 reflector.go:561] object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-4rgrq": failed to list *v1.Secret: secrets "designate-operator-controller-manager-dockercfg-4rgrq" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack-operators": no relationship found between node 'crc' and this object
Nov 27 16:55:50 crc kubenswrapper[4954]: E1127 16:55:50.662394 4954 reflector.go:158] "Unhandled Error" err="object-\"openstack-operators\"/\"designate-operator-controller-manager-dockercfg-4rgrq\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"designate-operator-controller-manager-dockercfg-4rgrq\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack-operators\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.685362 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-589cbd6b5b-8ghg2"]
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.686644 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-8ghg2"
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.693154 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-955677c94-dzjch"]
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.698482 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-svp6m"
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.702509 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5b77f656f-zlt7m"]
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.703886 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-zlt7m"
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.706242 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-6g48h"
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.709218 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-589cbd6b5b-8ghg2"]
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.720683 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5d494799bf-nnj6l"]
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.728682 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-nnj6l"
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.745117 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr84l\" (UniqueName: \"kubernetes.io/projected/50ec526e-d6db-45fa-8b99-bd795b4c3690-kube-api-access-vr84l\") pod \"barbican-operator-controller-manager-7b64f4fb85-nz28b\" (UID: \"50ec526e-d6db-45fa-8b99-bd795b4c3690\") " pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-nz28b"
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.745227 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srsfc\" (UniqueName: \"kubernetes.io/projected/a09ff3fd-b10f-421c-a3a5-aa7dc4dcff95-kube-api-access-srsfc\") pod \"designate-operator-controller-manager-955677c94-dzjch\" (UID: \"a09ff3fd-b10f-421c-a3a5-aa7dc4dcff95\") " pod="openstack-operators/designate-operator-controller-manager-955677c94-dzjch"
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.745276 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cffwb\" (UniqueName: \"kubernetes.io/projected/11ca1308-8c7a-4a3d-a283-2533abc54c25-kube-api-access-cffwb\") pod \"cinder-operator-controller-manager-6b7f75547b-4rg5t\" (UID: \"11ca1308-8c7a-4a3d-a283-2533abc54c25\") " pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-4rg5t"
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.746854 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-mmlvc"
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.748922 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-4vpsc"]
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.750382 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-4vpsc"
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.756164 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-5bxkf"
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.756445 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.782843 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5d494799bf-nnj6l"]
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.807048 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-4vpsc"]
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.834666 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-42dmk"]
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.836074 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-42dmk"
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.842875 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-s24fz"
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.846429 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l599k\" (UniqueName: \"kubernetes.io/projected/5884eab6-e3c0-45de-b93d-73392533b780-kube-api-access-l599k\") pod \"heat-operator-controller-manager-5b77f656f-zlt7m\" (UID: \"5884eab6-e3c0-45de-b93d-73392533b780\") " pod="openstack-operators/heat-operator-controller-manager-5b77f656f-zlt7m"
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.846467 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5b77f656f-zlt7m"]
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.846505 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4djlk\" (UniqueName: \"kubernetes.io/projected/736ef0f4-e471-4acd-8569-2a6d6d260f67-kube-api-access-4djlk\") pod \"infra-operator-controller-manager-57548d458d-4vpsc\" (UID: \"736ef0f4-e471-4acd-8569-2a6d6d260f67\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-4vpsc"
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.846537 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr84l\" (UniqueName: \"kubernetes.io/projected/50ec526e-d6db-45fa-8b99-bd795b4c3690-kube-api-access-vr84l\") pod \"barbican-operator-controller-manager-7b64f4fb85-nz28b\" (UID: \"50ec526e-d6db-45fa-8b99-bd795b4c3690\") " pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-nz28b"
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.846596 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/736ef0f4-e471-4acd-8569-2a6d6d260f67-cert\") pod \"infra-operator-controller-manager-57548d458d-4vpsc\" (UID: \"736ef0f4-e471-4acd-8569-2a6d6d260f67\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-4vpsc"
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.846660 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srsfc\" (UniqueName: \"kubernetes.io/projected/a09ff3fd-b10f-421c-a3a5-aa7dc4dcff95-kube-api-access-srsfc\") pod \"designate-operator-controller-manager-955677c94-dzjch\" (UID: \"a09ff3fd-b10f-421c-a3a5-aa7dc4dcff95\") " pod="openstack-operators/designate-operator-controller-manager-955677c94-dzjch"
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.846708 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cffwb\" (UniqueName: \"kubernetes.io/projected/11ca1308-8c7a-4a3d-a283-2533abc54c25-kube-api-access-cffwb\") pod \"cinder-operator-controller-manager-6b7f75547b-4rg5t\" (UID: \"11ca1308-8c7a-4a3d-a283-2533abc54c25\") " pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-4rg5t"
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.846741 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq5d9\" (UniqueName: \"kubernetes.io/projected/7c8dd8cc-7be7-41f9-ac93-139dc9e83274-kube-api-access-gq5d9\") pod \"glance-operator-controller-manager-589cbd6b5b-8ghg2\" (UID: \"7c8dd8cc-7be7-41f9-ac93-139dc9e83274\") " pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-8ghg2"
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.846797 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsn8h\" (UniqueName: \"kubernetes.io/projected/c7c7b69c-1d63-4d4b-ac0b-ad2be204cf8a-kube-api-access-lsn8h\") pod \"horizon-operator-controller-manager-5d494799bf-nnj6l\" (UID: \"c7c7b69c-1d63-4d4b-ac0b-ad2be204cf8a\") " pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-nnj6l"
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.883683 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cffwb\" (UniqueName: \"kubernetes.io/projected/11ca1308-8c7a-4a3d-a283-2533abc54c25-kube-api-access-cffwb\") pod \"cinder-operator-controller-manager-6b7f75547b-4rg5t\" (UID: \"11ca1308-8c7a-4a3d-a283-2533abc54c25\") " pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-4rg5t"
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.889014 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7b4567c7cf-bw2j9"]
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.892056 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr84l\" (UniqueName: \"kubernetes.io/projected/50ec526e-d6db-45fa-8b99-bd795b4c3690-kube-api-access-vr84l\") pod \"barbican-operator-controller-manager-7b64f4fb85-nz28b\" (UID: \"50ec526e-d6db-45fa-8b99-bd795b4c3690\") " pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-nz28b"
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.892482 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-bw2j9"
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.894373 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srsfc\" (UniqueName: \"kubernetes.io/projected/a09ff3fd-b10f-421c-a3a5-aa7dc4dcff95-kube-api-access-srsfc\") pod \"designate-operator-controller-manager-955677c94-dzjch\" (UID: \"a09ff3fd-b10f-421c-a3a5-aa7dc4dcff95\") " pod="openstack-operators/designate-operator-controller-manager-955677c94-dzjch"
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.896547 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-z4g9b"
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.903632 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-5d499bf58b-2jpwm"]
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.904894 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-2jpwm"
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.918008 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-mx4m5"
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.929645 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7b4567c7cf-bw2j9"]
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.930788 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-nz28b"
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.941067 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-4rg5t"
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.948381 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsn8h\" (UniqueName: \"kubernetes.io/projected/c7c7b69c-1d63-4d4b-ac0b-ad2be204cf8a-kube-api-access-lsn8h\") pod \"horizon-operator-controller-manager-5d494799bf-nnj6l\" (UID: \"c7c7b69c-1d63-4d4b-ac0b-ad2be204cf8a\") " pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-nnj6l"
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.948445 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l599k\" (UniqueName: \"kubernetes.io/projected/5884eab6-e3c0-45de-b93d-73392533b780-kube-api-access-l599k\") pod \"heat-operator-controller-manager-5b77f656f-zlt7m\" (UID: \"5884eab6-e3c0-45de-b93d-73392533b780\") " pod="openstack-operators/heat-operator-controller-manager-5b77f656f-zlt7m"
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.948481 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kp2j\" (UniqueName: \"kubernetes.io/projected/9366c02a-e022-47e4-86c2-35d1e9a54cf4-kube-api-access-5kp2j\") pod \"ironic-operator-controller-manager-67cb4dc6d4-42dmk\" (UID: \"9366c02a-e022-47e4-86c2-35d1e9a54cf4\") " pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-42dmk"
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.948511 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4djlk\" (UniqueName: \"kubernetes.io/projected/736ef0f4-e471-4acd-8569-2a6d6d260f67-kube-api-access-4djlk\") pod \"infra-operator-controller-manager-57548d458d-4vpsc\" (UID: \"736ef0f4-e471-4acd-8569-2a6d6d260f67\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-4vpsc"
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.948543 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/736ef0f4-e471-4acd-8569-2a6d6d260f67-cert\") pod \"infra-operator-controller-manager-57548d458d-4vpsc\" (UID: \"736ef0f4-e471-4acd-8569-2a6d6d260f67\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-4vpsc"
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.948616 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gq5d9\" (UniqueName: \"kubernetes.io/projected/7c8dd8cc-7be7-41f9-ac93-139dc9e83274-kube-api-access-gq5d9\") pod \"glance-operator-controller-manager-589cbd6b5b-8ghg2\" (UID: \"7c8dd8cc-7be7-41f9-ac93-139dc9e83274\") " pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-8ghg2"
Nov 27 16:55:50 crc kubenswrapper[4954]: E1127 16:55:50.949326 4954 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Nov 27 16:55:50 crc kubenswrapper[4954]: E1127 16:55:50.949389 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/736ef0f4-e471-4acd-8569-2a6d6d260f67-cert podName:736ef0f4-e471-4acd-8569-2a6d6d260f67 nodeName:}" failed. No retries permitted until 2025-11-27 16:55:51.449370084 +0000 UTC m=+1063.466810384 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/736ef0f4-e471-4acd-8569-2a6d6d260f67-cert") pod "infra-operator-controller-manager-57548d458d-4vpsc" (UID: "736ef0f4-e471-4acd-8569-2a6d6d260f67") : secret "infra-operator-webhook-server-cert" not found
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.960689 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5d499bf58b-2jpwm"]
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.962622 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-42dmk"]
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.980561 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-5vqr2"]
Nov 27 16:55:50 crc kubenswrapper[4954]: I1127 16:55:50.982633 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-5vqr2"
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.007769 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4djlk\" (UniqueName: \"kubernetes.io/projected/736ef0f4-e471-4acd-8569-2a6d6d260f67-kube-api-access-4djlk\") pod \"infra-operator-controller-manager-57548d458d-4vpsc\" (UID: \"736ef0f4-e471-4acd-8569-2a6d6d260f67\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-4vpsc"
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.008930 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-ctzjl"
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.014616 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq5d9\" (UniqueName: \"kubernetes.io/projected/7c8dd8cc-7be7-41f9-ac93-139dc9e83274-kube-api-access-gq5d9\") pod \"glance-operator-controller-manager-589cbd6b5b-8ghg2\" (UID: \"7c8dd8cc-7be7-41f9-ac93-139dc9e83274\") " pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-8ghg2"
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.019664 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6fdcddb789-4g8kb"]
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.023980 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l599k\" (UniqueName: \"kubernetes.io/projected/5884eab6-e3c0-45de-b93d-73392533b780-kube-api-access-l599k\") pod \"heat-operator-controller-manager-5b77f656f-zlt7m\" (UID: \"5884eab6-e3c0-45de-b93d-73392533b780\") " pod="openstack-operators/heat-operator-controller-manager-5b77f656f-zlt7m"
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.026769 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsn8h\" (UniqueName: \"kubernetes.io/projected/c7c7b69c-1d63-4d4b-ac0b-ad2be204cf8a-kube-api-access-lsn8h\") pod \"horizon-operator-controller-manager-5d494799bf-nnj6l\" (UID: \"c7c7b69c-1d63-4d4b-ac0b-ad2be204cf8a\") " pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-nnj6l"
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.036418 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-4g8kb"
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.041997 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-8ghg2"
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.042666 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-zlt7m"
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.054678 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-5frkk"
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.056440 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd6rh\" (UniqueName: \"kubernetes.io/projected/ff3108ae-4629-448b-80d3-949e631c60d8-kube-api-access-dd6rh\") pod \"mariadb-operator-controller-manager-66f4dd4bc7-5vqr2\" (UID: \"ff3108ae-4629-448b-80d3-949e631c60d8\") " pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-5vqr2"
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.057190 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44l7l\" (UniqueName: \"kubernetes.io/projected/cc869191-7d3d-4192-bf48-a48625bff6ff-kube-api-access-44l7l\") pod \"keystone-operator-controller-manager-7b4567c7cf-bw2j9\" (UID: \"cc869191-7d3d-4192-bf48-a48625bff6ff\") " pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-bw2j9"
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.057237 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx8r4\" (UniqueName: \"kubernetes.io/projected/6dbcc715-b375-4776-87ff-4c5ecad80975-kube-api-access-qx8r4\") pod \"manila-operator-controller-manager-5d499bf58b-2jpwm\" (UID: \"6dbcc715-b375-4776-87ff-4c5ecad80975\") " pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-2jpwm"
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.062406 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kp2j\" (UniqueName: \"kubernetes.io/projected/9366c02a-e022-47e4-86c2-35d1e9a54cf4-kube-api-access-5kp2j\") pod \"ironic-operator-controller-manager-67cb4dc6d4-42dmk\" (UID: \"9366c02a-e022-47e4-86c2-35d1e9a54cf4\") " pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-42dmk"
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.073231 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-nnj6l"
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.084117 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kp2j\" (UniqueName: \"kubernetes.io/projected/9366c02a-e022-47e4-86c2-35d1e9a54cf4-kube-api-access-5kp2j\") pod \"ironic-operator-controller-manager-67cb4dc6d4-42dmk\" (UID: \"9366c02a-e022-47e4-86c2-35d1e9a54cf4\") " pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-42dmk"
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.106430 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-79556f57fc-p55vw"]
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.108254 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-p55vw"
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.109435 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-64cdc6ff96-9pwxb"]
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.110870 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-9pwxb"
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.111646 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-qmllb"
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.112456 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-9t67h"
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.134000 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-5vqr2"]
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.141187 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6fdcddb789-4g8kb"]
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.160061 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-64cdc6ff96-9pwxb"]
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.164521 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd6rh\" (UniqueName: \"kubernetes.io/projected/ff3108ae-4629-448b-80d3-949e631c60d8-kube-api-access-dd6rh\") pod \"mariadb-operator-controller-manager-66f4dd4bc7-5vqr2\" (UID: \"ff3108ae-4629-448b-80d3-949e631c60d8\") " pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-5vqr2"
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.165327 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62kk8\" (UniqueName: \"kubernetes.io/projected/56f35029-dbcb-437a-94ed-3eac63c5145c-kube-api-access-62kk8\") pod \"neutron-operator-controller-manager-6fdcddb789-4g8kb\" (UID: \"56f35029-dbcb-437a-94ed-3eac63c5145c\") " pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-4g8kb"
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.165407 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44l7l\" (UniqueName: \"kubernetes.io/projected/cc869191-7d3d-4192-bf48-a48625bff6ff-kube-api-access-44l7l\") pod \"keystone-operator-controller-manager-7b4567c7cf-bw2j9\" (UID: \"cc869191-7d3d-4192-bf48-a48625bff6ff\") " pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-bw2j9"
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.165441 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx8r4\" (UniqueName: \"kubernetes.io/projected/6dbcc715-b375-4776-87ff-4c5ecad80975-kube-api-access-qx8r4\") pod \"manila-operator-controller-manager-5d499bf58b-2jpwm\" (UID: \"6dbcc715-b375-4776-87ff-4c5ecad80975\") " pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-2jpwm"
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.175358 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-42dmk"
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.190661 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-56897c768d-mln9c"]
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.197410 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-79556f57fc-p55vw"]
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.197539 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-56897c768d-mln9c"
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.202828 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-94b6w"
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.207759 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bqfwjb"]
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.215752 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd6rh\" (UniqueName: \"kubernetes.io/projected/ff3108ae-4629-448b-80d3-949e631c60d8-kube-api-access-dd6rh\") pod \"mariadb-operator-controller-manager-66f4dd4bc7-5vqr2\" (UID: \"ff3108ae-4629-448b-80d3-949e631c60d8\") " pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-5vqr2"
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.216259 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44l7l\" (UniqueName: \"kubernetes.io/projected/cc869191-7d3d-4192-bf48-a48625bff6ff-kube-api-access-44l7l\") pod \"keystone-operator-controller-manager-7b4567c7cf-bw2j9\" (UID: \"cc869191-7d3d-4192-bf48-a48625bff6ff\") " pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-bw2j9"
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.218977 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-56897c768d-mln9c"]
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.219120 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bqfwjb"
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.219857 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-57988cc5b5-nv8bz"]
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.221336 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx8r4\" (UniqueName: \"kubernetes.io/projected/6dbcc715-b375-4776-87ff-4c5ecad80975-kube-api-access-qx8r4\") pod \"manila-operator-controller-manager-5d499bf58b-2jpwm\" (UID: \"6dbcc715-b375-4776-87ff-4c5ecad80975\") " pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-2jpwm"
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.221527 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.222211 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-z2lwf"
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.222750 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-nv8bz"
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.225304 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-tk85s"
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.235298 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bqfwjb"]
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.247067 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-57988cc5b5-nv8bz"]
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.253886 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-d77b94747-mmr72"]
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.255230 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-d77b94747-mmr72"
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.258473 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-p66lf"
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.267206 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62kk8\" (UniqueName: \"kubernetes.io/projected/56f35029-dbcb-437a-94ed-3eac63c5145c-kube-api-access-62kk8\") pod \"neutron-operator-controller-manager-6fdcddb789-4g8kb\" (UID: \"56f35029-dbcb-437a-94ed-3eac63c5145c\") " pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-4g8kb"
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.267292 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gplr\" (UniqueName: \"kubernetes.io/projected/770db406-d44c-490f-8409-f5b3e8f66145-kube-api-access-5gplr\") pod \"octavia-operator-controller-manager-64cdc6ff96-9pwxb\" (UID: \"770db406-d44c-490f-8409-f5b3e8f66145\") " pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-9pwxb"
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.267351 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nh4n\" (UniqueName: \"kubernetes.io/projected/b4fb4c16-8870-494e-a075-ee70d251da46-kube-api-access-6nh4n\") pod \"nova-operator-controller-manager-79556f57fc-p55vw\" (UID: \"b4fb4c16-8870-494e-a075-ee70d251da46\") " pod="openstack-operators/nova-operator-controller-manager-79556f57fc-p55vw"
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.272372 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-d77b94747-mmr72"]
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.280975 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-wr8t4"]
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.282327 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-wr8t4"
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.284805 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-t2j5g"
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.288642 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-wr8t4"]
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.292382 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62kk8\" (UniqueName: \"kubernetes.io/projected/56f35029-dbcb-437a-94ed-3eac63c5145c-kube-api-access-62kk8\") pod \"neutron-operator-controller-manager-6fdcddb789-4g8kb\" (UID: \"56f35029-dbcb-437a-94ed-3eac63c5145c\") " pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-4g8kb"
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.310527 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd6c7f4c8-7dmz6"]
Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.313303 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-7dmz6" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.317259 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-5pgh6" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.319660 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd6c7f4c8-7dmz6"] Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.335325 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-bw2j9" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.343345 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-656dcb59d4-wg8x7"] Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.344729 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-wg8x7" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.351657 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-656dcb59d4-wg8x7"] Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.352821 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-ctjhh" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.367574 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-2jpwm" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.377321 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cbeef148-5a6f-4738-83f0-eae93d81bae3-cert\") pod \"openstack-baremetal-operator-controller-manager-5fcdb54b6bqfwjb\" (UID: \"cbeef148-5a6f-4738-83f0-eae93d81bae3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bqfwjb" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.377444 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nh4n\" (UniqueName: \"kubernetes.io/projected/b4fb4c16-8870-494e-a075-ee70d251da46-kube-api-access-6nh4n\") pod \"nova-operator-controller-manager-79556f57fc-p55vw\" (UID: \"b4fb4c16-8870-494e-a075-ee70d251da46\") " pod="openstack-operators/nova-operator-controller-manager-79556f57fc-p55vw" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.377484 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t7dz\" (UniqueName: \"kubernetes.io/projected/73b53349-7e1d-499f-918e-e25598787e70-kube-api-access-6t7dz\") pod \"placement-operator-controller-manager-57988cc5b5-nv8bz\" (UID: \"73b53349-7e1d-499f-918e-e25598787e70\") " pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-nv8bz" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.377567 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdgbm\" (UniqueName: \"kubernetes.io/projected/cbeef148-5a6f-4738-83f0-eae93d81bae3-kube-api-access-vdgbm\") pod \"openstack-baremetal-operator-controller-manager-5fcdb54b6bqfwjb\" (UID: 
\"cbeef148-5a6f-4738-83f0-eae93d81bae3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bqfwjb" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.383875 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gplr\" (UniqueName: \"kubernetes.io/projected/770db406-d44c-490f-8409-f5b3e8f66145-kube-api-access-5gplr\") pod \"octavia-operator-controller-manager-64cdc6ff96-9pwxb\" (UID: \"770db406-d44c-490f-8409-f5b3e8f66145\") " pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-9pwxb" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.383925 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw4t7\" (UniqueName: \"kubernetes.io/projected/376db7a5-650f-4327-8e03-2f2be98969a0-kube-api-access-gw4t7\") pod \"swift-operator-controller-manager-d77b94747-mmr72\" (UID: \"376db7a5-650f-4327-8e03-2f2be98969a0\") " pod="openstack-operators/swift-operator-controller-manager-d77b94747-mmr72" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.383968 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg6ms\" (UniqueName: \"kubernetes.io/projected/523e3a36-bc9e-4698-af7d-e7ecd3b7a740-kube-api-access-bg6ms\") pod \"telemetry-operator-controller-manager-76cc84c6bb-wr8t4\" (UID: \"523e3a36-bc9e-4698-af7d-e7ecd3b7a740\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-wr8t4" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.383991 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqdtt\" (UniqueName: \"kubernetes.io/projected/7dcd119b-9cb2-48ab-ac2f-2f0b10d5b2f0-kube-api-access-jqdtt\") pod \"ovn-operator-controller-manager-56897c768d-mln9c\" (UID: \"7dcd119b-9cb2-48ab-ac2f-2f0b10d5b2f0\") " pod="openstack-operators/ovn-operator-controller-manager-56897c768d-mln9c" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.384564 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-5vqr2" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.406245 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-4g8kb" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.416802 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nh4n\" (UniqueName: \"kubernetes.io/projected/b4fb4c16-8870-494e-a075-ee70d251da46-kube-api-access-6nh4n\") pod \"nova-operator-controller-manager-79556f57fc-p55vw\" (UID: \"b4fb4c16-8870-494e-a075-ee70d251da46\") " pod="openstack-operators/nova-operator-controller-manager-79556f57fc-p55vw" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.428773 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gplr\" (UniqueName: \"kubernetes.io/projected/770db406-d44c-490f-8409-f5b3e8f66145-kube-api-access-5gplr\") pod \"octavia-operator-controller-manager-64cdc6ff96-9pwxb\" (UID: \"770db406-d44c-490f-8409-f5b3e8f66145\") " pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-9pwxb" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.446437 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-p55vw" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.456206 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-9pwxb" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.480699 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-556d4f4767-6wqxx"] Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.483690 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-556d4f4767-6wqxx" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.487273 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gw4t7\" (UniqueName: \"kubernetes.io/projected/376db7a5-650f-4327-8e03-2f2be98969a0-kube-api-access-gw4t7\") pod \"swift-operator-controller-manager-d77b94747-mmr72\" (UID: \"376db7a5-650f-4327-8e03-2f2be98969a0\") " pod="openstack-operators/swift-operator-controller-manager-d77b94747-mmr72" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.487415 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg6ms\" (UniqueName: \"kubernetes.io/projected/523e3a36-bc9e-4698-af7d-e7ecd3b7a740-kube-api-access-bg6ms\") pod \"telemetry-operator-controller-manager-76cc84c6bb-wr8t4\" (UID: \"523e3a36-bc9e-4698-af7d-e7ecd3b7a740\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-wr8t4" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.487456 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqdtt\" (UniqueName: \"kubernetes.io/projected/7dcd119b-9cb2-48ab-ac2f-2f0b10d5b2f0-kube-api-access-jqdtt\") pod \"ovn-operator-controller-manager-56897c768d-mln9c\" (UID: \"7dcd119b-9cb2-48ab-ac2f-2f0b10d5b2f0\") " pod="openstack-operators/ovn-operator-controller-manager-56897c768d-mln9c" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.487533 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cbeef148-5a6f-4738-83f0-eae93d81bae3-cert\") pod \"openstack-baremetal-operator-controller-manager-5fcdb54b6bqfwjb\" (UID: \"cbeef148-5a6f-4738-83f0-eae93d81bae3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bqfwjb" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.487609 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzksd\" (UniqueName: \"kubernetes.io/projected/6a00b9f9-d61f-411d-897d-496d8c8b3501-kube-api-access-jzksd\") pod \"watcher-operator-controller-manager-656dcb59d4-wg8x7\" (UID: \"6a00b9f9-d61f-411d-897d-496d8c8b3501\") " pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-wg8x7" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.487647 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6t7dz\" (UniqueName: \"kubernetes.io/projected/73b53349-7e1d-499f-918e-e25598787e70-kube-api-access-6t7dz\") pod \"placement-operator-controller-manager-57988cc5b5-nv8bz\" (UID: \"73b53349-7e1d-499f-918e-e25598787e70\") " pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-nv8bz" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 
16:55:51.487693 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdgbm\" (UniqueName: \"kubernetes.io/projected/cbeef148-5a6f-4738-83f0-eae93d81bae3-kube-api-access-vdgbm\") pod \"openstack-baremetal-operator-controller-manager-5fcdb54b6bqfwjb\" (UID: \"cbeef148-5a6f-4738-83f0-eae93d81bae3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bqfwjb" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.487895 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/736ef0f4-e471-4acd-8569-2a6d6d260f67-cert\") pod \"infra-operator-controller-manager-57548d458d-4vpsc\" (UID: \"736ef0f4-e471-4acd-8569-2a6d6d260f67\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-4vpsc" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.488141 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78w4d\" (UniqueName: \"kubernetes.io/projected/146450d6-91cc-4600-9712-449fcf5328b2-kube-api-access-78w4d\") pod \"test-operator-controller-manager-5cd6c7f4c8-7dmz6\" (UID: \"146450d6-91cc-4600-9712-449fcf5328b2\") " pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-7dmz6" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.488653 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Nov 27 16:55:51 crc kubenswrapper[4954]: E1127 16:55:51.489058 4954 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 27 16:55:51 crc kubenswrapper[4954]: E1127 16:55:51.489133 4954 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 27 16:55:51 crc kubenswrapper[4954]: E1127 16:55:51.489168 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbeef148-5a6f-4738-83f0-eae93d81bae3-cert podName:cbeef148-5a6f-4738-83f0-eae93d81bae3 nodeName:}" failed. No retries permitted until 2025-11-27 16:55:51.989133871 +0000 UTC m=+1064.006574171 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cbeef148-5a6f-4738-83f0-eae93d81bae3-cert") pod "openstack-baremetal-operator-controller-manager-5fcdb54b6bqfwjb" (UID: "cbeef148-5a6f-4738-83f0-eae93d81bae3") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 27 16:55:51 crc kubenswrapper[4954]: E1127 16:55:51.489204 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/736ef0f4-e471-4acd-8569-2a6d6d260f67-cert podName:736ef0f4-e471-4acd-8569-2a6d6d260f67 nodeName:}" failed. No retries permitted until 2025-11-27 16:55:52.489177882 +0000 UTC m=+1064.506618182 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/736ef0f4-e471-4acd-8569-2a6d6d260f67-cert") pod "infra-operator-controller-manager-57548d458d-4vpsc" (UID: "736ef0f4-e471-4acd-8569-2a6d6d260f67") : secret "infra-operator-webhook-server-cert" not found Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.490140 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-gxwk2" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.490369 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.504903 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg6ms\" (UniqueName: \"kubernetes.io/projected/523e3a36-bc9e-4698-af7d-e7ecd3b7a740-kube-api-access-bg6ms\") pod \"telemetry-operator-controller-manager-76cc84c6bb-wr8t4\" (UID: \"523e3a36-bc9e-4698-af7d-e7ecd3b7a740\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-wr8t4" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.519561 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t7dz\" (UniqueName: \"kubernetes.io/projected/73b53349-7e1d-499f-918e-e25598787e70-kube-api-access-6t7dz\") pod \"placement-operator-controller-manager-57988cc5b5-nv8bz\" (UID: \"73b53349-7e1d-499f-918e-e25598787e70\") " pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-nv8bz" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.521347 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdgbm\" (UniqueName: \"kubernetes.io/projected/cbeef148-5a6f-4738-83f0-eae93d81bae3-kube-api-access-vdgbm\") pod \"openstack-baremetal-operator-controller-manager-5fcdb54b6bqfwjb\" (UID: \"cbeef148-5a6f-4738-83f0-eae93d81bae3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bqfwjb" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.522733 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqdtt\" (UniqueName: \"kubernetes.io/projected/7dcd119b-9cb2-48ab-ac2f-2f0b10d5b2f0-kube-api-access-jqdtt\") pod \"ovn-operator-controller-manager-56897c768d-mln9c\" (UID: \"7dcd119b-9cb2-48ab-ac2f-2f0b10d5b2f0\") " pod="openstack-operators/ovn-operator-controller-manager-56897c768d-mln9c" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.523600 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gw4t7\" (UniqueName: \"kubernetes.io/projected/376db7a5-650f-4327-8e03-2f2be98969a0-kube-api-access-gw4t7\") pod \"swift-operator-controller-manager-d77b94747-mmr72\" (UID: \"376db7a5-650f-4327-8e03-2f2be98969a0\") " pod="openstack-operators/swift-operator-controller-manager-d77b94747-mmr72" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.524696 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-556d4f4767-6wqxx"] Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.573445 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xvk89"] Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.574863 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xvk89" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.577356 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-fdsfq" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.586335 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xvk89"] Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.597967 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7eefae7c-fef6-47b3-8f89-4856b6ae1980-metrics-certs\") pod \"openstack-operator-controller-manager-556d4f4767-6wqxx\" (UID: \"7eefae7c-fef6-47b3-8f89-4856b6ae1980\") " pod="openstack-operators/openstack-operator-controller-manager-556d4f4767-6wqxx" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.598146 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78w4d\" (UniqueName: \"kubernetes.io/projected/146450d6-91cc-4600-9712-449fcf5328b2-kube-api-access-78w4d\") pod \"test-operator-controller-manager-5cd6c7f4c8-7dmz6\" (UID: \"146450d6-91cc-4600-9712-449fcf5328b2\") " pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-7dmz6" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.598243 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7eefae7c-fef6-47b3-8f89-4856b6ae1980-webhook-certs\") pod \"openstack-operator-controller-manager-556d4f4767-6wqxx\" (UID: \"7eefae7c-fef6-47b3-8f89-4856b6ae1980\") " pod="openstack-operators/openstack-operator-controller-manager-556d4f4767-6wqxx" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.598356 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swcn9\" (UniqueName: \"kubernetes.io/projected/7eefae7c-fef6-47b3-8f89-4856b6ae1980-kube-api-access-swcn9\") pod \"openstack-operator-controller-manager-556d4f4767-6wqxx\" (UID: \"7eefae7c-fef6-47b3-8f89-4856b6ae1980\") " pod="openstack-operators/openstack-operator-controller-manager-556d4f4767-6wqxx" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.598465 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzksd\" (UniqueName: \"kubernetes.io/projected/6a00b9f9-d61f-411d-897d-496d8c8b3501-kube-api-access-jzksd\") pod \"watcher-operator-controller-manager-656dcb59d4-wg8x7\" (UID: \"6a00b9f9-d61f-411d-897d-496d8c8b3501\") " pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-wg8x7" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.610862 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-56897c768d-mln9c" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.617993 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78w4d\" (UniqueName: \"kubernetes.io/projected/146450d6-91cc-4600-9712-449fcf5328b2-kube-api-access-78w4d\") pod \"test-operator-controller-manager-5cd6c7f4c8-7dmz6\" (UID: \"146450d6-91cc-4600-9712-449fcf5328b2\") " pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-7dmz6" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.630034 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzksd\" (UniqueName: \"kubernetes.io/projected/6a00b9f9-d61f-411d-897d-496d8c8b3501-kube-api-access-jzksd\") pod \"watcher-operator-controller-manager-656dcb59d4-wg8x7\" (UID: \"6a00b9f9-d61f-411d-897d-496d8c8b3501\") " pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-wg8x7" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.669323 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-nv8bz" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.695165 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6b7f75547b-4rg5t"] Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.700073 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mnl2\" (UniqueName: \"kubernetes.io/projected/8fad5f5d-c6a2-497f-8524-1ae501d6a444-kube-api-access-9mnl2\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xvk89\" (UID: \"8fad5f5d-c6a2-497f-8524-1ae501d6a444\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xvk89" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.700118 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7eefae7c-fef6-47b3-8f89-4856b6ae1980-webhook-certs\") pod \"openstack-operator-controller-manager-556d4f4767-6wqxx\" (UID: \"7eefae7c-fef6-47b3-8f89-4856b6ae1980\") " pod="openstack-operators/openstack-operator-controller-manager-556d4f4767-6wqxx" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.700159 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swcn9\" (UniqueName: \"kubernetes.io/projected/7eefae7c-fef6-47b3-8f89-4856b6ae1980-kube-api-access-swcn9\") pod \"openstack-operator-controller-manager-556d4f4767-6wqxx\" (UID: \"7eefae7c-fef6-47b3-8f89-4856b6ae1980\") " pod="openstack-operators/openstack-operator-controller-manager-556d4f4767-6wqxx" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.700219 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7eefae7c-fef6-47b3-8f89-4856b6ae1980-metrics-certs\") pod \"openstack-operator-controller-manager-556d4f4767-6wqxx\" (UID: \"7eefae7c-fef6-47b3-8f89-4856b6ae1980\") " pod="openstack-operators/openstack-operator-controller-manager-556d4f4767-6wqxx" Nov 27 16:55:51 crc kubenswrapper[4954]: E1127 16:55:51.700386 4954 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 27 16:55:51 crc kubenswrapper[4954]: E1127 16:55:51.700637 4954 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/7eefae7c-fef6-47b3-8f89-4856b6ae1980-metrics-certs podName:7eefae7c-fef6-47b3-8f89-4856b6ae1980 nodeName:}" failed. No retries permitted until 2025-11-27 16:55:52.200616334 +0000 UTC m=+1064.218056634 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7eefae7c-fef6-47b3-8f89-4856b6ae1980-metrics-certs") pod "openstack-operator-controller-manager-556d4f4767-6wqxx" (UID: "7eefae7c-fef6-47b3-8f89-4856b6ae1980") : secret "metrics-server-cert" not found Nov 27 16:55:51 crc kubenswrapper[4954]: E1127 16:55:51.700833 4954 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 27 16:55:51 crc kubenswrapper[4954]: E1127 16:55:51.700956 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7eefae7c-fef6-47b3-8f89-4856b6ae1980-webhook-certs podName:7eefae7c-fef6-47b3-8f89-4856b6ae1980 nodeName:}" failed. No retries permitted until 2025-11-27 16:55:52.200932362 +0000 UTC m=+1064.218372762 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/7eefae7c-fef6-47b3-8f89-4856b6ae1980-webhook-certs") pod "openstack-operator-controller-manager-556d4f4767-6wqxx" (UID: "7eefae7c-fef6-47b3-8f89-4856b6ae1980") : secret "webhook-server-cert" not found Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.705447 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-d77b94747-mmr72" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.719513 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swcn9\" (UniqueName: \"kubernetes.io/projected/7eefae7c-fef6-47b3-8f89-4856b6ae1980-kube-api-access-swcn9\") pod \"openstack-operator-controller-manager-556d4f4767-6wqxx\" (UID: \"7eefae7c-fef6-47b3-8f89-4856b6ae1980\") " pod="openstack-operators/openstack-operator-controller-manager-556d4f4767-6wqxx" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.724039 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-wr8t4" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.753035 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-wg8x7" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.754855 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-7dmz6" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.801272 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mnl2\" (UniqueName: \"kubernetes.io/projected/8fad5f5d-c6a2-497f-8524-1ae501d6a444-kube-api-access-9mnl2\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xvk89\" (UID: \"8fad5f5d-c6a2-497f-8524-1ae501d6a444\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xvk89" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.834116 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mnl2\" (UniqueName: \"kubernetes.io/projected/8fad5f5d-c6a2-497f-8524-1ae501d6a444-kube-api-access-9mnl2\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xvk89\" (UID: \"8fad5f5d-c6a2-497f-8524-1ae501d6a444\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xvk89" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.877515 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b64f4fb85-nz28b"] Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.903961 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xvk89" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.932490 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5d494799bf-nnj6l"] Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.942503 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-589cbd6b5b-8ghg2"] Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.984540 4954 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack-operators/designate-operator-controller-manager-955677c94-dzjch" secret="" err="failed to sync secret cache: timed out waiting for the condition" Nov 27 16:55:51 crc kubenswrapper[4954]: I1127 16:55:51.984645 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-955677c94-dzjch" Nov 27 16:55:51 crc kubenswrapper[4954]: W1127 16:55:51.990667 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c8dd8cc_7be7_41f9_ac93_139dc9e83274.slice/crio-6d34a64f0676c605f5b969c62548482c0c1b045c6b1dd11b8bffd6bf5f2cbaa3 WatchSource:0}: Error finding container 6d34a64f0676c605f5b969c62548482c0c1b045c6b1dd11b8bffd6bf5f2cbaa3: Status 404 returned error can't find the container with id 6d34a64f0676c605f5b969c62548482c0c1b045c6b1dd11b8bffd6bf5f2cbaa3 Nov 27 16:55:52 crc kubenswrapper[4954]: I1127 16:55:52.000372 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5b77f656f-zlt7m"] Nov 27 16:55:52 crc kubenswrapper[4954]: I1127 16:55:52.006001 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cbeef148-5a6f-4738-83f0-eae93d81bae3-cert\") pod \"openstack-baremetal-operator-controller-manager-5fcdb54b6bqfwjb\" (UID: \"cbeef148-5a6f-4738-83f0-eae93d81bae3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bqfwjb" Nov 27 16:55:52 crc kubenswrapper[4954]: E1127 16:55:52.006179 4954 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 27 16:55:52 crc kubenswrapper[4954]: E1127 16:55:52.006229 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbeef148-5a6f-4738-83f0-eae93d81bae3-cert podName:cbeef148-5a6f-4738-83f0-eae93d81bae3 nodeName:}" failed. No retries permitted until 2025-11-27 16:55:53.006212906 +0000 UTC m=+1065.023653206 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cbeef148-5a6f-4738-83f0-eae93d81bae3-cert") pod "openstack-baremetal-operator-controller-manager-5fcdb54b6bqfwjb" (UID: "cbeef148-5a6f-4738-83f0-eae93d81bae3") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 27 16:55:52 crc kubenswrapper[4954]: W1127 16:55:52.072563 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5884eab6_e3c0_45de_b93d_73392533b780.slice/crio-b58c6706c2076842b94e5d71e9762ce9c29f41aa0b293c1d9c49143933d84c63 WatchSource:0}: Error finding container b58c6706c2076842b94e5d71e9762ce9c29f41aa0b293c1d9c49143933d84c63: Status 404 returned error can't find the container with id b58c6706c2076842b94e5d71e9762ce9c29f41aa0b293c1d9c49143933d84c63 Nov 27 16:55:52 crc kubenswrapper[4954]: I1127 16:55:52.077939 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-4rgrq" Nov 27 16:55:52 crc kubenswrapper[4954]: I1127 16:55:52.179245 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-64cdc6ff96-9pwxb"] Nov 27 16:55:52 crc kubenswrapper[4954]: I1127 16:55:52.197432 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-42dmk"] Nov 27 16:55:52 crc kubenswrapper[4954]: I1127 16:55:52.210613 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7b4567c7cf-bw2j9"] Nov 27 16:55:52 crc kubenswrapper[4954]: I1127 16:55:52.211621 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7eefae7c-fef6-47b3-8f89-4856b6ae1980-metrics-certs\") pod \"openstack-operator-controller-manager-556d4f4767-6wqxx\" (UID: \"7eefae7c-fef6-47b3-8f89-4856b6ae1980\") " pod="openstack-operators/openstack-operator-controller-manager-556d4f4767-6wqxx" Nov 27 16:55:52 crc kubenswrapper[4954]: I1127 16:55:52.211708 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7eefae7c-fef6-47b3-8f89-4856b6ae1980-webhook-certs\") pod \"openstack-operator-controller-manager-556d4f4767-6wqxx\" (UID: \"7eefae7c-fef6-47b3-8f89-4856b6ae1980\") " pod="openstack-operators/openstack-operator-controller-manager-556d4f4767-6wqxx" Nov 27 16:55:52 crc kubenswrapper[4954]: E1127 16:55:52.211891 4954 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 27 16:55:52 crc kubenswrapper[4954]: E1127 16:55:52.211950 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7eefae7c-fef6-47b3-8f89-4856b6ae1980-webhook-certs podName:7eefae7c-fef6-47b3-8f89-4856b6ae1980 nodeName:}" failed. No retries permitted until 2025-11-27 16:55:53.21192993 +0000 UTC m=+1065.229370230 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/7eefae7c-fef6-47b3-8f89-4856b6ae1980-webhook-certs") pod "openstack-operator-controller-manager-556d4f4767-6wqxx" (UID: "7eefae7c-fef6-47b3-8f89-4856b6ae1980") : secret "webhook-server-cert" not found Nov 27 16:55:52 crc kubenswrapper[4954]: E1127 16:55:52.212127 4954 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 27 16:55:52 crc kubenswrapper[4954]: E1127 16:55:52.212223 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7eefae7c-fef6-47b3-8f89-4856b6ae1980-metrics-certs podName:7eefae7c-fef6-47b3-8f89-4856b6ae1980 nodeName:}" failed. No retries permitted until 2025-11-27 16:55:53.212196147 +0000 UTC m=+1065.229636447 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7eefae7c-fef6-47b3-8f89-4856b6ae1980-metrics-certs") pod "openstack-operator-controller-manager-556d4f4767-6wqxx" (UID: "7eefae7c-fef6-47b3-8f89-4856b6ae1980") : secret "metrics-server-cert" not found Nov 27 16:55:52 crc kubenswrapper[4954]: W1127 16:55:52.212785 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9366c02a_e022_47e4_86c2_35d1e9a54cf4.slice/crio-7d2cd8c5cea063b575df1aae4d5e69c2edf92fc8783ebd485368c06a74f45216 WatchSource:0}: Error finding container 7d2cd8c5cea063b575df1aae4d5e69c2edf92fc8783ebd485368c06a74f45216: Status 404 returned error can't find the container with id 7d2cd8c5cea063b575df1aae4d5e69c2edf92fc8783ebd485368c06a74f45216 Nov 27 16:55:52 crc kubenswrapper[4954]: I1127 16:55:52.220054 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5d499bf58b-2jpwm"] Nov 27 16:55:52 crc kubenswrapper[4954]: I1127 16:55:52.235907 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-5vqr2"] Nov 27 16:55:52 crc kubenswrapper[4954]: I1127 16:55:52.250852 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-nz28b" event={"ID":"50ec526e-d6db-45fa-8b99-bd795b4c3690","Type":"ContainerStarted","Data":"a24a119bd91e6105820e42609115cac5df43f426b5d379b6780dd407f8a219df"} Nov 27 16:55:52 crc kubenswrapper[4954]: I1127 16:55:52.252598 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-8ghg2" event={"ID":"7c8dd8cc-7be7-41f9-ac93-139dc9e83274","Type":"ContainerStarted","Data":"6d34a64f0676c605f5b969c62548482c0c1b045c6b1dd11b8bffd6bf5f2cbaa3"} Nov 27 16:55:52 crc kubenswrapper[4954]: I1127 16:55:52.254385 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-bw2j9" event={"ID":"cc869191-7d3d-4192-bf48-a48625bff6ff","Type":"ContainerStarted","Data":"6a348355faff4962bf505bf3d6ce31d05ac8d6ef65b8c5a11df849c9ac08ffdd"} Nov 27 16:55:52 crc kubenswrapper[4954]: I1127 16:55:52.257277 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-9pwxb" event={"ID":"770db406-d44c-490f-8409-f5b3e8f66145","Type":"ContainerStarted","Data":"6c5f6915ca8db7c20aca788f4986d43198970300c8a448fd68253498bee9474f"} Nov 27 16:55:52 crc kubenswrapper[4954]: I1127 16:55:52.258699 4954 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-4rg5t" event={"ID":"11ca1308-8c7a-4a3d-a283-2533abc54c25","Type":"ContainerStarted","Data":"77cca00ba5595095dc16202a06d478c0908563cb8192ffbd7990509bd5aa6d68"} Nov 27 16:55:52 crc kubenswrapper[4954]: I1127 16:55:52.260043 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-zlt7m" event={"ID":"5884eab6-e3c0-45de-b93d-73392533b780","Type":"ContainerStarted","Data":"b58c6706c2076842b94e5d71e9762ce9c29f41aa0b293c1d9c49143933d84c63"} Nov 27 16:55:52 crc kubenswrapper[4954]: I1127 16:55:52.261015 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-nnj6l" event={"ID":"c7c7b69c-1d63-4d4b-ac0b-ad2be204cf8a","Type":"ContainerStarted","Data":"37f231d2cb1d1b3f4cae26e7fc0f61879c3f26abf8b99db770464bf498c62e04"} Nov 27 16:55:52 crc kubenswrapper[4954]: I1127 16:55:52.264816 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-42dmk" event={"ID":"9366c02a-e022-47e4-86c2-35d1e9a54cf4","Type":"ContainerStarted","Data":"7d2cd8c5cea063b575df1aae4d5e69c2edf92fc8783ebd485368c06a74f45216"} Nov 27 16:55:52 crc kubenswrapper[4954]: I1127 16:55:52.266084 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-5vqr2" event={"ID":"ff3108ae-4629-448b-80d3-949e631c60d8","Type":"ContainerStarted","Data":"41378e1162e48d26d001520740f06bdb70934eeb975fde832632c8cd2ad646a3"} Nov 27 16:55:52 crc kubenswrapper[4954]: I1127 16:55:52.267162 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-2jpwm" event={"ID":"6dbcc715-b375-4776-87ff-4c5ecad80975","Type":"ContainerStarted","Data":"98527ad2646b5a1d881cb18d329b63bb0d39fc95cbe5aa066151894145538284"} Nov 27 16:55:52 crc kubenswrapper[4954]: I1127 16:55:52.368528 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6fdcddb789-4g8kb"] Nov 27 16:55:52 crc kubenswrapper[4954]: W1127 16:55:52.372151 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56f35029_dbcb_437a_94ed_3eac63c5145c.slice/crio-3049ebef1958b73b1fff5325c84eadbb219469cd5ee949ba07f0c553f6d15252 WatchSource:0}: Error finding container 3049ebef1958b73b1fff5325c84eadbb219469cd5ee949ba07f0c553f6d15252: Status 404 returned error can't find the container with id 3049ebef1958b73b1fff5325c84eadbb219469cd5ee949ba07f0c553f6d15252 Nov 27 16:55:52 crc kubenswrapper[4954]: I1127 16:55:52.417934 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-79556f57fc-p55vw"] Nov 27 16:55:52 crc kubenswrapper[4954]: W1127 16:55:52.418131 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4fb4c16_8870_494e_a075_ee70d251da46.slice/crio-d1f092b51717e17b759b9b91beff39b0f704da6686d6557ce66f5774eeb6e423 WatchSource:0}: Error finding container d1f092b51717e17b759b9b91beff39b0f704da6686d6557ce66f5774eeb6e423: Status 404 returned error can't find the container with id d1f092b51717e17b759b9b91beff39b0f704da6686d6557ce66f5774eeb6e423 Nov 27 16:55:52 crc kubenswrapper[4954]: I1127 16:55:52.427683 
4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-56897c768d-mln9c"] Nov 27 16:55:52 crc kubenswrapper[4954]: W1127 16:55:52.428882 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7dcd119b_9cb2_48ab_ac2f_2f0b10d5b2f0.slice/crio-d19f08f22b417409db01b159e55ec1cc0c0893129b227d19da9241792c268f35 WatchSource:0}: Error finding container d19f08f22b417409db01b159e55ec1cc0c0893129b227d19da9241792c268f35: Status 404 returned error can't find the container with id d19f08f22b417409db01b159e55ec1cc0c0893129b227d19da9241792c268f35 Nov 27 16:55:52 crc kubenswrapper[4954]: I1127 16:55:52.501586 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-wr8t4"] Nov 27 16:55:52 crc kubenswrapper[4954]: I1127 16:55:52.512824 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-d77b94747-mmr72"] Nov 27 16:55:52 crc kubenswrapper[4954]: I1127 16:55:52.514874 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/736ef0f4-e471-4acd-8569-2a6d6d260f67-cert\") pod \"infra-operator-controller-manager-57548d458d-4vpsc\" (UID: \"736ef0f4-e471-4acd-8569-2a6d6d260f67\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-4vpsc" Nov 27 16:55:52 crc kubenswrapper[4954]: E1127 16:55:52.515092 4954 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 27 16:55:52 crc kubenswrapper[4954]: E1127 16:55:52.515167 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/736ef0f4-e471-4acd-8569-2a6d6d260f67-cert podName:736ef0f4-e471-4acd-8569-2a6d6d260f67 nodeName:}" failed. No retries permitted until 2025-11-27 16:55:54.515143345 +0000 UTC m=+1066.532583655 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/736ef0f4-e471-4acd-8569-2a6d6d260f67-cert") pod "infra-operator-controller-manager-57548d458d-4vpsc" (UID: "736ef0f4-e471-4acd-8569-2a6d6d260f67") : secret "infra-operator-webhook-server-cert" not found Nov 27 16:55:52 crc kubenswrapper[4954]: I1127 16:55:52.517674 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd6c7f4c8-7dmz6"] Nov 27 16:55:52 crc kubenswrapper[4954]: W1127 16:55:52.525966 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod376db7a5_650f_4327_8e03_2f2be98969a0.slice/crio-f551181eefd035b5f43d8450847a3cf867fa511c17cbd132fb6a2e9368014d77 WatchSource:0}: Error finding container f551181eefd035b5f43d8450847a3cf867fa511c17cbd132fb6a2e9368014d77: Status 404 returned error can't find the container with id f551181eefd035b5f43d8450847a3cf867fa511c17cbd132fb6a2e9368014d77 Nov 27 16:55:52 crc kubenswrapper[4954]: I1127 16:55:52.526860 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-656dcb59d4-wg8x7"] Nov 27 16:55:52 crc kubenswrapper[4954]: E1127 16:55:52.531963 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bg6ms,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-wr8t4_openstack-operators(523e3a36-bc9e-4698-af7d-e7ecd3b7a740): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 27 16:55:52 crc kubenswrapper[4954]: W1127 16:55:52.532204 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod146450d6_91cc_4600_9712_449fcf5328b2.slice/crio-00d8a52a043b02b6a7c8f5313dd5f09e8e5783014c78d80009863a5ff8126821 WatchSource:0}: Error finding container 00d8a52a043b02b6a7c8f5313dd5f09e8e5783014c78d80009863a5ff8126821: Status 404 returned error can't find the container with id 00d8a52a043b02b6a7c8f5313dd5f09e8e5783014c78d80009863a5ff8126821 Nov 27 16:55:52 crc kubenswrapper[4954]: W1127 16:55:52.532771 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73b53349_7e1d_499f_918e_e25598787e70.slice/crio-a94bf7601fc114120fe9ddf9288d8b5357a648eeb9068b349e015c7c42d85ad8 WatchSource:0}: Error finding container a94bf7601fc114120fe9ddf9288d8b5357a648eeb9068b349e015c7c42d85ad8: Status 404 returned error can't find the container with id a94bf7601fc114120fe9ddf9288d8b5357a648eeb9068b349e015c7c42d85ad8 Nov 27 16:55:52 crc kubenswrapper[4954]: E1127 16:55:52.533403 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:72236301580ff9080f7e311b832d7ba66666a9afeda51f969745229624ff26e4,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gw4t7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-d77b94747-mmr72_openstack-operators(376db7a5-650f-4327-8e03-2f2be98969a0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 27 16:55:52 crc kubenswrapper[4954]: E1127 16:55:52.533673 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bg6ms,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-wr8t4_openstack-operators(523e3a36-bc9e-4698-af7d-e7ecd3b7a740): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 27 16:55:52 crc kubenswrapper[4954]: E1127 16:55:52.534831 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-wr8t4" podUID="523e3a36-bc9e-4698-af7d-e7ecd3b7a740" Nov 27 16:55:52 crc kubenswrapper[4954]: E1127 16:55:52.535823 4954 kuberuntime_manager.go:1274] "Unhandled 
Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:210517b918e30df1c95fc7d961c8e57e9a9d1cc2b9fe7eb4dad2034dd53a90aa,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-78w4d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5cd6c7f4c8-7dmz6_openstack-operators(146450d6-91cc-4600-9712-449fcf5328b2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 27 16:55:52 crc kubenswrapper[4954]: E1127 16:55:52.536193 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:6bed55b172b9ee8ccc3952cbfc543d8bd44e2690f6db94348a754152fd78f4cf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jzksd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-656dcb59d4-wg8x7_openstack-operators(6a00b9f9-d61f-411d-897d-496d8c8b3501): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 27 16:55:52 crc kubenswrapper[4954]: E1127 16:55:52.537672 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gw4t7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-d77b94747-mmr72_openstack-operators(376db7a5-650f-4327-8e03-2f2be98969a0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 27 16:55:52 crc kubenswrapper[4954]: E1127 16:55:52.538554 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-78w4d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5cd6c7f4c8-7dmz6_openstack-operators(146450d6-91cc-4600-9712-449fcf5328b2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 27 16:55:52 crc kubenswrapper[4954]: E1127 16:55:52.538711 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jzksd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-656dcb59d4-wg8x7_openstack-operators(6a00b9f9-d61f-411d-897d-496d8c8b3501): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 27 16:55:52 crc kubenswrapper[4954]: E1127 16:55:52.538769 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for 
\"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/swift-operator-controller-manager-d77b94747-mmr72" podUID="376db7a5-650f-4327-8e03-2f2be98969a0" Nov 27 16:55:52 crc kubenswrapper[4954]: I1127 16:55:52.538785 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-57988cc5b5-nv8bz"] Nov 27 16:55:52 crc kubenswrapper[4954]: E1127 16:55:52.538856 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:225958f250a1075b69439d776a13acc45c78695c21abda23600fb53ca1640423,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6t7dz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-57988cc5b5-nv8bz_openstack-operators(73b53349-7e1d-499f-918e-e25598787e70): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 27 16:55:52 crc kubenswrapper[4954]: E1127 16:55:52.540117 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-wg8x7" podUID="6a00b9f9-d61f-411d-897d-496d8c8b3501" Nov 27 16:55:52 crc kubenswrapper[4954]: E1127 16:55:52.540248 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with 
ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-7dmz6" podUID="146450d6-91cc-4600-9712-449fcf5328b2" Nov 27 16:55:52 crc kubenswrapper[4954]: E1127 16:55:52.542184 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6t7dz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-57988cc5b5-nv8bz_openstack-operators(73b53349-7e1d-499f-918e-e25598787e70): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 27 16:55:52 crc kubenswrapper[4954]: E1127 16:55:52.543332 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-nv8bz" podUID="73b53349-7e1d-499f-918e-e25598787e70" Nov 27 16:55:52 crc kubenswrapper[4954]: I1127 16:55:52.652433 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xvk89"] Nov 27 16:55:52 crc kubenswrapper[4954]: W1127 16:55:52.658292 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fad5f5d_c6a2_497f_8524_1ae501d6a444.slice/crio-a0db66bd7ad8999f933b8bce517ac9f8bd8c4e97d0a0190e26c9dacf4f3d1501 WatchSource:0}: Error finding container a0db66bd7ad8999f933b8bce517ac9f8bd8c4e97d0a0190e26c9dacf4f3d1501: Status 404 returned error can't find the container with id a0db66bd7ad8999f933b8bce517ac9f8bd8c4e97d0a0190e26c9dacf4f3d1501 Nov 27 16:55:52 crc kubenswrapper[4954]: I1127 16:55:52.706692 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-955677c94-dzjch"] Nov 27 16:55:52 crc kubenswrapper[4954]: W1127 16:55:52.710403 4954 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda09ff3fd_b10f_421c_a3a5_aa7dc4dcff95.slice/crio-c9131c98aa39fbcca66c2114504a1766687d67da272352542756f9562a7bbb86 WatchSource:0}: Error finding container c9131c98aa39fbcca66c2114504a1766687d67da272352542756f9562a7bbb86: Status 404 returned error can't find the container with id c9131c98aa39fbcca66c2114504a1766687d67da272352542756f9562a7bbb86 Nov 27 16:55:52 crc kubenswrapper[4954]: E1127 16:55:52.713154 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:ec4e5c911c1d0f1ea211a04b251a9d2e95b69d141c1caf07a0381693b2d6368b,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-srsfc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-955677c94-dzjch_openstack-operators(a09ff3fd-b10f-421c-a3a5-aa7dc4dcff95): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 27 16:55:52 crc kubenswrapper[4954]: E1127 16:55:52.715004 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-srsfc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-955677c94-dzjch_openstack-operators(a09ff3fd-b10f-421c-a3a5-aa7dc4dcff95): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 27 16:55:52 crc kubenswrapper[4954]: E1127 16:55:52.716151 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/designate-operator-controller-manager-955677c94-dzjch" podUID="a09ff3fd-b10f-421c-a3a5-aa7dc4dcff95" Nov 27 16:55:53 crc kubenswrapper[4954]: I1127 16:55:53.024017 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cbeef148-5a6f-4738-83f0-eae93d81bae3-cert\") pod \"openstack-baremetal-operator-controller-manager-5fcdb54b6bqfwjb\" (UID: \"cbeef148-5a6f-4738-83f0-eae93d81bae3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bqfwjb" Nov 27 16:55:53 crc kubenswrapper[4954]: E1127 16:55:53.024187 4954 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 27 16:55:53 crc kubenswrapper[4954]: E1127 16:55:53.024259 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbeef148-5a6f-4738-83f0-eae93d81bae3-cert podName:cbeef148-5a6f-4738-83f0-eae93d81bae3 nodeName:}" failed. No retries permitted until 2025-11-27 16:55:55.024238197 +0000 UTC m=+1067.041678497 (durationBeforeRetry 2s). 
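
[Annotation] The repeated "pull QPS exceeded" failures above are not a registry-side limit: the kubelet rate-limits its own image pulls with a client-side token bucket sized by the KubeletConfiguration fields registryPullQPS (default 5) and registryBurst (default 10), and a pull that finds the bucket empty fails immediately with ErrImagePull instead of queueing. A node that cold-starts a few dozen operator pods at once, as here, drains the bucket instantly. A minimal sketch of that behavior, using the same client-go limiter type the kubelet wraps around its image service (the loop and messages are illustrative, not kubelet source):

    package main

    import (
        "fmt"

        "k8s.io/client-go/util/flowcontrol"
    )

    func main() {
        // registryPullQPS=5, registryBurst=10 are the kubelet defaults.
        limiter := flowcontrol.NewTokenBucketRateLimiter(5.0, 10)
        for i := 1; i <= 15; i++ {
            if limiter.TryAccept() {
                // In a tight burst, only the first `burst` pulls get through.
                fmt.Printf("pull %2d: allowed\n", i)
            } else {
                // The kubelet surfaces this as ErrImagePull: "pull QPS exceeded".
                fmt.Printf("pull %2d: pull QPS exceeded\n", i)
            }
        }
    }

Raising registryPullQPS/registryBurst (or setting registryPullQPS to 0 to disable the throttle) avoids the startup storm; left alone, the pods also recover on their own once back-off spreads the retries out, as the later entries show.
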
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cbeef148-5a6f-4738-83f0-eae93d81bae3-cert") pod "openstack-baremetal-operator-controller-manager-5fcdb54b6bqfwjb" (UID: "cbeef148-5a6f-4738-83f0-eae93d81bae3") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 27 16:55:53 crc kubenswrapper[4954]: I1127 16:55:53.228159 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7eefae7c-fef6-47b3-8f89-4856b6ae1980-metrics-certs\") pod \"openstack-operator-controller-manager-556d4f4767-6wqxx\" (UID: \"7eefae7c-fef6-47b3-8f89-4856b6ae1980\") " pod="openstack-operators/openstack-operator-controller-manager-556d4f4767-6wqxx" Nov 27 16:55:53 crc kubenswrapper[4954]: I1127 16:55:53.228250 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7eefae7c-fef6-47b3-8f89-4856b6ae1980-webhook-certs\") pod \"openstack-operator-controller-manager-556d4f4767-6wqxx\" (UID: \"7eefae7c-fef6-47b3-8f89-4856b6ae1980\") " pod="openstack-operators/openstack-operator-controller-manager-556d4f4767-6wqxx" Nov 27 16:55:53 crc kubenswrapper[4954]: E1127 16:55:53.228305 4954 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 27 16:55:53 crc kubenswrapper[4954]: E1127 16:55:53.228349 4954 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 27 16:55:53 crc kubenswrapper[4954]: E1127 16:55:53.228369 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7eefae7c-fef6-47b3-8f89-4856b6ae1980-metrics-certs podName:7eefae7c-fef6-47b3-8f89-4856b6ae1980 nodeName:}" failed. No retries permitted until 2025-11-27 16:55:55.22834962 +0000 UTC m=+1067.245789920 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7eefae7c-fef6-47b3-8f89-4856b6ae1980-metrics-certs") pod "openstack-operator-controller-manager-556d4f4767-6wqxx" (UID: "7eefae7c-fef6-47b3-8f89-4856b6ae1980") : secret "metrics-server-cert" not found Nov 27 16:55:53 crc kubenswrapper[4954]: E1127 16:55:53.228384 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7eefae7c-fef6-47b3-8f89-4856b6ae1980-webhook-certs podName:7eefae7c-fef6-47b3-8f89-4856b6ae1980 nodeName:}" failed. No retries permitted until 2025-11-27 16:55:55.228379101 +0000 UTC m=+1067.245819401 (durationBeforeRetry 2s). 
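
[Annotation] Interleaved with the pull throttling is an unrelated failure mode: MountVolume.SetUp for the cert, metrics-certs, and webhook-certs volumes keeps failing because the referenced Secrets (openstack-baremetal-operator-webhook-server-cert, metrics-server-cert, webhook-server-cert, and later infra-operator-webhook-server-cert) do not exist yet. These webhook/metrics certificates are typically issued asynchronously (e.g., by cert-manager) after the operator bundle is applied, so the kubelet simply retries until the Secret appears; the first of them show up at 16:56:07 below. A hedged client-go sketch for checking where things stand (assumes a reachable kubeconfig at the default path; the helper is hypothetical, not project tooling):

    package main

    import (
        "context"
        "fmt"

        apierrors "k8s.io/apimachinery/pkg/api/errors"
        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
        if err != nil {
            panic(err)
        }
        cs, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            panic(err)
        }
        names := []string{
            "openstack-baremetal-operator-webhook-server-cert",
            "metrics-server-cert",
            "webhook-server-cert",
            "infra-operator-webhook-server-cert",
        }
        for _, name := range names {
            _, err := cs.CoreV1().Secrets("openstack-operators").Get(context.TODO(), name, metav1.GetOptions{})
            switch {
            case err == nil:
                fmt.Printf("%-50s present\n", name)
            case apierrors.IsNotFound(err):
                fmt.Printf("%-50s still missing\n", name) // what the kubelet keeps retrying on
            default:
                fmt.Printf("%-50s lookup error: %v\n", name, err)
            }
        }
    }
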
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/7eefae7c-fef6-47b3-8f89-4856b6ae1980-webhook-certs") pod "openstack-operator-controller-manager-556d4f4767-6wqxx" (UID: "7eefae7c-fef6-47b3-8f89-4856b6ae1980") : secret "webhook-server-cert" not found Nov 27 16:55:53 crc kubenswrapper[4954]: I1127 16:55:53.278012 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-nv8bz" event={"ID":"73b53349-7e1d-499f-918e-e25598787e70","Type":"ContainerStarted","Data":"a94bf7601fc114120fe9ddf9288d8b5357a648eeb9068b349e015c7c42d85ad8"} Nov 27 16:55:53 crc kubenswrapper[4954]: E1127 16:55:53.280766 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:225958f250a1075b69439d776a13acc45c78695c21abda23600fb53ca1640423\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-nv8bz" podUID="73b53349-7e1d-499f-918e-e25598787e70" Nov 27 16:55:53 crc kubenswrapper[4954]: I1127 16:55:53.280774 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-d77b94747-mmr72" event={"ID":"376db7a5-650f-4327-8e03-2f2be98969a0","Type":"ContainerStarted","Data":"f551181eefd035b5f43d8450847a3cf867fa511c17cbd132fb6a2e9368014d77"} Nov 27 16:55:53 crc kubenswrapper[4954]: I1127 16:55:53.284641 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-955677c94-dzjch" event={"ID":"a09ff3fd-b10f-421c-a3a5-aa7dc4dcff95","Type":"ContainerStarted","Data":"c9131c98aa39fbcca66c2114504a1766687d67da272352542756f9562a7bbb86"} Nov 27 16:55:53 crc kubenswrapper[4954]: I1127 16:55:53.287233 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-p55vw" event={"ID":"b4fb4c16-8870-494e-a075-ee70d251da46","Type":"ContainerStarted","Data":"d1f092b51717e17b759b9b91beff39b0f704da6686d6557ce66f5774eeb6e423"} Nov 27 16:55:53 crc kubenswrapper[4954]: E1127 16:55:53.287898 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:72236301580ff9080f7e311b832d7ba66666a9afeda51f969745229624ff26e4\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-d77b94747-mmr72" podUID="376db7a5-650f-4327-8e03-2f2be98969a0" Nov 27 16:55:53 crc kubenswrapper[4954]: E1127 16:55:53.288907 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:ec4e5c911c1d0f1ea211a04b251a9d2e95b69d141c1caf07a0381693b2d6368b\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" 
pod="openstack-operators/designate-operator-controller-manager-955677c94-dzjch" podUID="a09ff3fd-b10f-421c-a3a5-aa7dc4dcff95" Nov 27 16:55:53 crc kubenswrapper[4954]: I1127 16:55:53.289431 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-wg8x7" event={"ID":"6a00b9f9-d61f-411d-897d-496d8c8b3501","Type":"ContainerStarted","Data":"1acb5c4ba04b0d54b14ae7450ad719600e86c585cd631ea896b2a98ae74b70b1"} Nov 27 16:55:53 crc kubenswrapper[4954]: I1127 16:55:53.292694 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xvk89" event={"ID":"8fad5f5d-c6a2-497f-8524-1ae501d6a444","Type":"ContainerStarted","Data":"a0db66bd7ad8999f933b8bce517ac9f8bd8c4e97d0a0190e26c9dacf4f3d1501"} Nov 27 16:55:53 crc kubenswrapper[4954]: E1127 16:55:53.294962 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:6bed55b172b9ee8ccc3952cbfc543d8bd44e2690f6db94348a754152fd78f4cf\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-wg8x7" podUID="6a00b9f9-d61f-411d-897d-496d8c8b3501" Nov 27 16:55:53 crc kubenswrapper[4954]: I1127 16:55:53.303376 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-4g8kb" event={"ID":"56f35029-dbcb-437a-94ed-3eac63c5145c","Type":"ContainerStarted","Data":"3049ebef1958b73b1fff5325c84eadbb219469cd5ee949ba07f0c553f6d15252"} Nov 27 16:55:53 crc kubenswrapper[4954]: I1127 16:55:53.305501 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-wr8t4" event={"ID":"523e3a36-bc9e-4698-af7d-e7ecd3b7a740","Type":"ContainerStarted","Data":"3c47842bb9f8d0e3aa33a6f89e1cd36cd3793af5672186db2692d9106e3cfe59"} Nov 27 16:55:53 crc kubenswrapper[4954]: I1127 16:55:53.307557 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-7dmz6" event={"ID":"146450d6-91cc-4600-9712-449fcf5328b2","Type":"ContainerStarted","Data":"00d8a52a043b02b6a7c8f5313dd5f09e8e5783014c78d80009863a5ff8126821"} Nov 27 16:55:53 crc kubenswrapper[4954]: E1127 16:55:53.311883 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-wr8t4" podUID="523e3a36-bc9e-4698-af7d-e7ecd3b7a740" Nov 27 16:55:53 crc kubenswrapper[4954]: I1127 16:55:53.315035 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-56897c768d-mln9c" event={"ID":"7dcd119b-9cb2-48ab-ac2f-2f0b10d5b2f0","Type":"ContainerStarted","Data":"d19f08f22b417409db01b159e55ec1cc0c0893129b227d19da9241792c268f35"} Nov 27 16:55:53 crc kubenswrapper[4954]: E1127 
16:55:53.315682 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:210517b918e30df1c95fc7d961c8e57e9a9d1cc2b9fe7eb4dad2034dd53a90aa\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-7dmz6" podUID="146450d6-91cc-4600-9712-449fcf5328b2" Nov 27 16:55:54 crc kubenswrapper[4954]: E1127 16:55:54.349337 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:72236301580ff9080f7e311b832d7ba66666a9afeda51f969745229624ff26e4\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-d77b94747-mmr72" podUID="376db7a5-650f-4327-8e03-2f2be98969a0" Nov 27 16:55:54 crc kubenswrapper[4954]: E1127 16:55:54.350156 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:ec4e5c911c1d0f1ea211a04b251a9d2e95b69d141c1caf07a0381693b2d6368b\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/designate-operator-controller-manager-955677c94-dzjch" podUID="a09ff3fd-b10f-421c-a3a5-aa7dc4dcff95" Nov 27 16:55:54 crc kubenswrapper[4954]: E1127 16:55:54.350223 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:6bed55b172b9ee8ccc3952cbfc543d8bd44e2690f6db94348a754152fd78f4cf\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-wg8x7" podUID="6a00b9f9-d61f-411d-897d-496d8c8b3501" Nov 27 16:55:54 crc kubenswrapper[4954]: E1127 16:55:54.350279 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:210517b918e30df1c95fc7d961c8e57e9a9d1cc2b9fe7eb4dad2034dd53a90aa\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-7dmz6" podUID="146450d6-91cc-4600-9712-449fcf5328b2" Nov 27 16:55:54 crc kubenswrapper[4954]: E1127 16:55:54.351304 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for 
\"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-wr8t4" podUID="523e3a36-bc9e-4698-af7d-e7ecd3b7a740" Nov 27 16:55:54 crc kubenswrapper[4954]: E1127 16:55:54.351565 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:225958f250a1075b69439d776a13acc45c78695c21abda23600fb53ca1640423\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-nv8bz" podUID="73b53349-7e1d-499f-918e-e25598787e70" Nov 27 16:55:54 crc kubenswrapper[4954]: I1127 16:55:54.552535 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/736ef0f4-e471-4acd-8569-2a6d6d260f67-cert\") pod \"infra-operator-controller-manager-57548d458d-4vpsc\" (UID: \"736ef0f4-e471-4acd-8569-2a6d6d260f67\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-4vpsc" Nov 27 16:55:54 crc kubenswrapper[4954]: E1127 16:55:54.552817 4954 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 27 16:55:54 crc kubenswrapper[4954]: E1127 16:55:54.552979 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/736ef0f4-e471-4acd-8569-2a6d6d260f67-cert podName:736ef0f4-e471-4acd-8569-2a6d6d260f67 nodeName:}" failed. No retries permitted until 2025-11-27 16:55:58.552929665 +0000 UTC m=+1070.570369965 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/736ef0f4-e471-4acd-8569-2a6d6d260f67-cert") pod "infra-operator-controller-manager-57548d458d-4vpsc" (UID: "736ef0f4-e471-4acd-8569-2a6d6d260f67") : secret "infra-operator-webhook-server-cert" not found Nov 27 16:55:55 crc kubenswrapper[4954]: I1127 16:55:55.059486 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cbeef148-5a6f-4738-83f0-eae93d81bae3-cert\") pod \"openstack-baremetal-operator-controller-manager-5fcdb54b6bqfwjb\" (UID: \"cbeef148-5a6f-4738-83f0-eae93d81bae3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bqfwjb" Nov 27 16:55:55 crc kubenswrapper[4954]: E1127 16:55:55.059721 4954 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 27 16:55:55 crc kubenswrapper[4954]: E1127 16:55:55.060226 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbeef148-5a6f-4738-83f0-eae93d81bae3-cert podName:cbeef148-5a6f-4738-83f0-eae93d81bae3 nodeName:}" failed. No retries permitted until 2025-11-27 16:55:59.060192803 +0000 UTC m=+1071.077633103 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cbeef148-5a6f-4738-83f0-eae93d81bae3-cert") pod "openstack-baremetal-operator-controller-manager-5fcdb54b6bqfwjb" (UID: "cbeef148-5a6f-4738-83f0-eae93d81bae3") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 27 16:55:55 crc kubenswrapper[4954]: I1127 16:55:55.262899 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7eefae7c-fef6-47b3-8f89-4856b6ae1980-metrics-certs\") pod \"openstack-operator-controller-manager-556d4f4767-6wqxx\" (UID: \"7eefae7c-fef6-47b3-8f89-4856b6ae1980\") " pod="openstack-operators/openstack-operator-controller-manager-556d4f4767-6wqxx" Nov 27 16:55:55 crc kubenswrapper[4954]: I1127 16:55:55.263043 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7eefae7c-fef6-47b3-8f89-4856b6ae1980-webhook-certs\") pod \"openstack-operator-controller-manager-556d4f4767-6wqxx\" (UID: \"7eefae7c-fef6-47b3-8f89-4856b6ae1980\") " pod="openstack-operators/openstack-operator-controller-manager-556d4f4767-6wqxx" Nov 27 16:55:55 crc kubenswrapper[4954]: E1127 16:55:55.263080 4954 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 27 16:55:55 crc kubenswrapper[4954]: E1127 16:55:55.263158 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7eefae7c-fef6-47b3-8f89-4856b6ae1980-metrics-certs podName:7eefae7c-fef6-47b3-8f89-4856b6ae1980 nodeName:}" failed. No retries permitted until 2025-11-27 16:55:59.263137559 +0000 UTC m=+1071.280577859 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7eefae7c-fef6-47b3-8f89-4856b6ae1980-metrics-certs") pod "openstack-operator-controller-manager-556d4f4767-6wqxx" (UID: "7eefae7c-fef6-47b3-8f89-4856b6ae1980") : secret "metrics-server-cert" not found Nov 27 16:55:55 crc kubenswrapper[4954]: E1127 16:55:55.263188 4954 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 27 16:55:55 crc kubenswrapper[4954]: E1127 16:55:55.263260 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7eefae7c-fef6-47b3-8f89-4856b6ae1980-webhook-certs podName:7eefae7c-fef6-47b3-8f89-4856b6ae1980 nodeName:}" failed. No retries permitted until 2025-11-27 16:55:59.263240832 +0000 UTC m=+1071.280681132 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/7eefae7c-fef6-47b3-8f89-4856b6ae1980-webhook-certs") pod "openstack-operator-controller-manager-556d4f4767-6wqxx" (UID: "7eefae7c-fef6-47b3-8f89-4856b6ae1980") : secret "webhook-server-cert" not found Nov 27 16:55:58 crc kubenswrapper[4954]: I1127 16:55:58.624109 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/736ef0f4-e471-4acd-8569-2a6d6d260f67-cert\") pod \"infra-operator-controller-manager-57548d458d-4vpsc\" (UID: \"736ef0f4-e471-4acd-8569-2a6d6d260f67\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-4vpsc" Nov 27 16:55:58 crc kubenswrapper[4954]: E1127 16:55:58.624325 4954 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 27 16:55:58 crc kubenswrapper[4954]: E1127 16:55:58.624984 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/736ef0f4-e471-4acd-8569-2a6d6d260f67-cert podName:736ef0f4-e471-4acd-8569-2a6d6d260f67 nodeName:}" failed. No retries permitted until 2025-11-27 16:56:06.624952952 +0000 UTC m=+1078.642393282 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/736ef0f4-e471-4acd-8569-2a6d6d260f67-cert") pod "infra-operator-controller-manager-57548d458d-4vpsc" (UID: "736ef0f4-e471-4acd-8569-2a6d6d260f67") : secret "infra-operator-webhook-server-cert" not found Nov 27 16:55:59 crc kubenswrapper[4954]: I1127 16:55:59.135250 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cbeef148-5a6f-4738-83f0-eae93d81bae3-cert\") pod \"openstack-baremetal-operator-controller-manager-5fcdb54b6bqfwjb\" (UID: \"cbeef148-5a6f-4738-83f0-eae93d81bae3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bqfwjb" Nov 27 16:55:59 crc kubenswrapper[4954]: E1127 16:55:59.135450 4954 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 27 16:55:59 crc kubenswrapper[4954]: E1127 16:55:59.135528 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbeef148-5a6f-4738-83f0-eae93d81bae3-cert podName:cbeef148-5a6f-4738-83f0-eae93d81bae3 nodeName:}" failed. No retries permitted until 2025-11-27 16:56:07.135504649 +0000 UTC m=+1079.152944949 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cbeef148-5a6f-4738-83f0-eae93d81bae3-cert") pod "openstack-baremetal-operator-controller-manager-5fcdb54b6bqfwjb" (UID: "cbeef148-5a6f-4738-83f0-eae93d81bae3") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 27 16:55:59 crc kubenswrapper[4954]: I1127 16:55:59.338378 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7eefae7c-fef6-47b3-8f89-4856b6ae1980-webhook-certs\") pod \"openstack-operator-controller-manager-556d4f4767-6wqxx\" (UID: \"7eefae7c-fef6-47b3-8f89-4856b6ae1980\") " pod="openstack-operators/openstack-operator-controller-manager-556d4f4767-6wqxx" Nov 27 16:55:59 crc kubenswrapper[4954]: I1127 16:55:59.338560 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7eefae7c-fef6-47b3-8f89-4856b6ae1980-metrics-certs\") pod \"openstack-operator-controller-manager-556d4f4767-6wqxx\" (UID: \"7eefae7c-fef6-47b3-8f89-4856b6ae1980\") " pod="openstack-operators/openstack-operator-controller-manager-556d4f4767-6wqxx" Nov 27 16:55:59 crc kubenswrapper[4954]: E1127 16:55:59.338700 4954 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 27 16:55:59 crc kubenswrapper[4954]: E1127 16:55:59.338829 4954 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 27 16:55:59 crc kubenswrapper[4954]: E1127 16:55:59.338857 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7eefae7c-fef6-47b3-8f89-4856b6ae1980-webhook-certs podName:7eefae7c-fef6-47b3-8f89-4856b6ae1980 nodeName:}" failed. No retries permitted until 2025-11-27 16:56:07.338822964 +0000 UTC m=+1079.356263294 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/7eefae7c-fef6-47b3-8f89-4856b6ae1980-webhook-certs") pod "openstack-operator-controller-manager-556d4f4767-6wqxx" (UID: "7eefae7c-fef6-47b3-8f89-4856b6ae1980") : secret "webhook-server-cert" not found Nov 27 16:55:59 crc kubenswrapper[4954]: E1127 16:55:59.338922 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7eefae7c-fef6-47b3-8f89-4856b6ae1980-metrics-certs podName:7eefae7c-fef6-47b3-8f89-4856b6ae1980 nodeName:}" failed. No retries permitted until 2025-11-27 16:56:07.338889816 +0000 UTC m=+1079.356330316 (durationBeforeRetry 8s). 
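
[Annotation] The durationBeforeRetry values in the nestedpendingoperations records trace the volume manager's exponential back-off: 2s for the first failed mount of each volume, then 4s, 8s here, and 16s below, doubling per failure up to a cap. A minimal sketch of the schedule (the 2m cap is an assumption for illustration, not taken from these logs):

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        delay := 2 * time.Second
        maxDelay := 2 * time.Minute // assumed cap, illustration only
        for attempt := 1; attempt <= 8; attempt++ {
            fmt.Printf("mount attempt %d failed: durationBeforeRetry %v\n", attempt, delay)
            delay *= 2
            if delay > maxDelay {
                delay = maxDelay
            }
        }
    }
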
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7eefae7c-fef6-47b3-8f89-4856b6ae1980-metrics-certs") pod "openstack-operator-controller-manager-556d4f4767-6wqxx" (UID: "7eefae7c-fef6-47b3-8f89-4856b6ae1980") : secret "metrics-server-cert" not found Nov 27 16:56:06 crc kubenswrapper[4954]: I1127 16:56:06.667336 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/736ef0f4-e471-4acd-8569-2a6d6d260f67-cert\") pod \"infra-operator-controller-manager-57548d458d-4vpsc\" (UID: \"736ef0f4-e471-4acd-8569-2a6d6d260f67\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-4vpsc" Nov 27 16:56:06 crc kubenswrapper[4954]: E1127 16:56:06.667960 4954 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 27 16:56:06 crc kubenswrapper[4954]: E1127 16:56:06.668010 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/736ef0f4-e471-4acd-8569-2a6d6d260f67-cert podName:736ef0f4-e471-4acd-8569-2a6d6d260f67 nodeName:}" failed. No retries permitted until 2025-11-27 16:56:22.667994567 +0000 UTC m=+1094.685434867 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/736ef0f4-e471-4acd-8569-2a6d6d260f67-cert") pod "infra-operator-controller-manager-57548d458d-4vpsc" (UID: "736ef0f4-e471-4acd-8569-2a6d6d260f67") : secret "infra-operator-webhook-server-cert" not found Nov 27 16:56:07 crc kubenswrapper[4954]: I1127 16:56:07.178810 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cbeef148-5a6f-4738-83f0-eae93d81bae3-cert\") pod \"openstack-baremetal-operator-controller-manager-5fcdb54b6bqfwjb\" (UID: \"cbeef148-5a6f-4738-83f0-eae93d81bae3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bqfwjb" Nov 27 16:56:07 crc kubenswrapper[4954]: I1127 16:56:07.192362 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cbeef148-5a6f-4738-83f0-eae93d81bae3-cert\") pod \"openstack-baremetal-operator-controller-manager-5fcdb54b6bqfwjb\" (UID: \"cbeef148-5a6f-4738-83f0-eae93d81bae3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bqfwjb" Nov 27 16:56:07 crc kubenswrapper[4954]: I1127 16:56:07.241327 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bqfwjb" Nov 27 16:56:07 crc kubenswrapper[4954]: I1127 16:56:07.382407 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7eefae7c-fef6-47b3-8f89-4856b6ae1980-metrics-certs\") pod \"openstack-operator-controller-manager-556d4f4767-6wqxx\" (UID: \"7eefae7c-fef6-47b3-8f89-4856b6ae1980\") " pod="openstack-operators/openstack-operator-controller-manager-556d4f4767-6wqxx" Nov 27 16:56:07 crc kubenswrapper[4954]: I1127 16:56:07.382568 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7eefae7c-fef6-47b3-8f89-4856b6ae1980-webhook-certs\") pod \"openstack-operator-controller-manager-556d4f4767-6wqxx\" (UID: \"7eefae7c-fef6-47b3-8f89-4856b6ae1980\") " pod="openstack-operators/openstack-operator-controller-manager-556d4f4767-6wqxx" Nov 27 16:56:07 crc kubenswrapper[4954]: E1127 16:56:07.382707 4954 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 27 16:56:07 crc kubenswrapper[4954]: E1127 16:56:07.382864 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7eefae7c-fef6-47b3-8f89-4856b6ae1980-metrics-certs podName:7eefae7c-fef6-47b3-8f89-4856b6ae1980 nodeName:}" failed. No retries permitted until 2025-11-27 16:56:23.382820813 +0000 UTC m=+1095.400261143 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7eefae7c-fef6-47b3-8f89-4856b6ae1980-metrics-certs") pod "openstack-operator-controller-manager-556d4f4767-6wqxx" (UID: "7eefae7c-fef6-47b3-8f89-4856b6ae1980") : secret "metrics-server-cert" not found Nov 27 16:56:07 crc kubenswrapper[4954]: I1127 16:56:07.389938 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7eefae7c-fef6-47b3-8f89-4856b6ae1980-webhook-certs\") pod \"openstack-operator-controller-manager-556d4f4767-6wqxx\" (UID: \"7eefae7c-fef6-47b3-8f89-4856b6ae1980\") " pod="openstack-operators/openstack-operator-controller-manager-556d4f4767-6wqxx" Nov 27 16:56:11 crc kubenswrapper[4954]: E1127 16:56:11.233114 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:25faa5b0e4801d4d3b01a28b877ed3188eee71f33ad66f3c2e86b7921758e711" Nov 27 16:56:11 crc kubenswrapper[4954]: E1127 16:56:11.233628 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:25faa5b0e4801d4d3b01a28b877ed3188eee71f33ad66f3c2e86b7921758e711,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-44l7l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7b4567c7cf-bw2j9_openstack-operators(cc869191-7d3d-4192-bf48-a48625bff6ff): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 27 16:56:11 crc kubenswrapper[4954]: E1127 16:56:11.683715 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Nov 27 16:56:11 crc kubenswrapper[4954]: E1127 16:56:11.683924 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9mnl2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-xvk89_openstack-operators(8fad5f5d-c6a2-497f-8524-1ae501d6a444): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 27 16:56:11 crc kubenswrapper[4954]: E1127 16:56:11.685238 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xvk89" podUID="8fad5f5d-c6a2-497f-8524-1ae501d6a444" Nov 27 16:56:12 crc kubenswrapper[4954]: E1127 16:56:12.480064 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xvk89" podUID="8fad5f5d-c6a2-497f-8524-1ae501d6a444" Nov 27 16:56:14 crc kubenswrapper[4954]: I1127 16:56:14.813310 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bqfwjb"] Nov 27 16:56:14 crc kubenswrapper[4954]: W1127 16:56:14.976065 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbeef148_5a6f_4738_83f0_eae93d81bae3.slice/crio-a6ad2127455f16bbd42d921ae7d4cd7692e870ccefc9f19cecd350d0338e2345 WatchSource:0}: Error finding container a6ad2127455f16bbd42d921ae7d4cd7692e870ccefc9f19cecd350d0338e2345: Status 404 returned error can't find the container with id a6ad2127455f16bbd42d921ae7d4cd7692e870ccefc9f19cecd350d0338e2345 Nov 27 16:56:15 crc kubenswrapper[4954]: I1127 16:56:15.501663 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bqfwjb" event={"ID":"cbeef148-5a6f-4738-83f0-eae93d81bae3","Type":"ContainerStarted","Data":"a6ad2127455f16bbd42d921ae7d4cd7692e870ccefc9f19cecd350d0338e2345"} Nov 27 16:56:15 crc kubenswrapper[4954]: I1127 16:56:15.506038 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-42dmk" event={"ID":"9366c02a-e022-47e4-86c2-35d1e9a54cf4","Type":"ContainerStarted","Data":"ed07de3aa0c8eb28fc423701583d745bb5cdaca585992dffb18068515de5e29e"} Nov 27 16:56:15 crc kubenswrapper[4954]: I1127 16:56:15.508729 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ovn-operator-controller-manager-56897c768d-mln9c" event={"ID":"7dcd119b-9cb2-48ab-ac2f-2f0b10d5b2f0","Type":"ContainerStarted","Data":"91f5db28dce8098d50c586fd3885a898e916e691a067f8a1e04e7f6d390bf13d"} Nov 27 16:56:16 crc kubenswrapper[4954]: I1127 16:56:16.537798 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-8ghg2" event={"ID":"7c8dd8cc-7be7-41f9-ac93-139dc9e83274","Type":"ContainerStarted","Data":"15e63e3bb0976687dca71632736f3fc528595a737d30c201afc3b27edbc74f1a"} Nov 27 16:56:16 crc kubenswrapper[4954]: I1127 16:56:16.559054 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-9pwxb" event={"ID":"770db406-d44c-490f-8409-f5b3e8f66145","Type":"ContainerStarted","Data":"3d426071f6961b786419e3445e7b51397f2209aeadb24439d27528bf9097e857"} Nov 27 16:56:16 crc kubenswrapper[4954]: I1127 16:56:16.589141 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-5vqr2" event={"ID":"ff3108ae-4629-448b-80d3-949e631c60d8","Type":"ContainerStarted","Data":"48e7bdf67273d4a6e8b31d3fc21b484474b0af64e8ed9b005d73dde77859e79e"} Nov 27 16:56:16 crc kubenswrapper[4954]: I1127 16:56:16.598722 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-4rg5t" event={"ID":"11ca1308-8c7a-4a3d-a283-2533abc54c25","Type":"ContainerStarted","Data":"c14df446c3f32d4cb0139108cb7e6e52c44a48822e76e05757d3a9d90ec5413a"} Nov 27 16:56:16 crc kubenswrapper[4954]: I1127 16:56:16.606428 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-4g8kb" event={"ID":"56f35029-dbcb-437a-94ed-3eac63c5145c","Type":"ContainerStarted","Data":"8c2e2c8ad15d3d2fbbeb138db75c809a8ab58886574721038cffffc1b7853a38"} Nov 27 16:56:16 crc kubenswrapper[4954]: I1127 16:56:16.612566 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-nnj6l" event={"ID":"c7c7b69c-1d63-4d4b-ac0b-ad2be204cf8a","Type":"ContainerStarted","Data":"8778033affe699c92c93c6ceb72705896d05c8ecac5a8d3b9350a1e20a7e42f2"} Nov 27 16:56:16 crc kubenswrapper[4954]: I1127 16:56:16.613515 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-zlt7m" event={"ID":"5884eab6-e3c0-45de-b93d-73392533b780","Type":"ContainerStarted","Data":"11583a85be36b02fbb5690dd0dcb990764b44b3fc0d8ab7c55a2c72f92ae0a1b"} Nov 27 16:56:16 crc kubenswrapper[4954]: I1127 16:56:16.614568 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-nz28b" event={"ID":"50ec526e-d6db-45fa-8b99-bd795b4c3690","Type":"ContainerStarted","Data":"d45d7ad6183701c7dbf2a37d6b800dce9f585df15f85d6695e655332134f54f7"} Nov 27 16:56:16 crc kubenswrapper[4954]: I1127 16:56:16.623835 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-p55vw" event={"ID":"b4fb4c16-8870-494e-a075-ee70d251da46","Type":"ContainerStarted","Data":"139cb6d0b2dfa1926d9fe302a4594368951cbf51d9606aa82f84a37eda1c27f2"} Nov 27 16:56:16 crc kubenswrapper[4954]: I1127 16:56:16.654145 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-wr8t4" event={"ID":"523e3a36-bc9e-4698-af7d-e7ecd3b7a740","Type":"ContainerStarted","Data":"1fa19e21e1c597847a80aa79a9361df2cdf9efe6040f0f542b34492267648dd8"} Nov 27 16:56:16 crc kubenswrapper[4954]: I1127 16:56:16.703158 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-2jpwm" event={"ID":"6dbcc715-b375-4776-87ff-4c5ecad80975","Type":"ContainerStarted","Data":"b1f09dc147490288cf16a5aea43a70305553dc6b05e22b1ea7b413c15794e913"} Nov 27 16:56:17 crc kubenswrapper[4954]: I1127 16:56:17.696154 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-nv8bz" event={"ID":"73b53349-7e1d-499f-918e-e25598787e70","Type":"ContainerStarted","Data":"739770daf6ca5d2ec5093436bf7b01190b18a8510ae4dff4f885cda308b4a1af"} Nov 27 16:56:18 crc kubenswrapper[4954]: I1127 16:56:18.715371 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-wg8x7" event={"ID":"6a00b9f9-d61f-411d-897d-496d8c8b3501","Type":"ContainerStarted","Data":"eee28ccb73cca0fae7282d8b66e88eaa610021122ba8b4d01c741d8b6f382a31"} Nov 27 16:56:19 crc kubenswrapper[4954]: E1127 16:56:19.117132 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-bw2j9" podUID="cc869191-7d3d-4192-bf48-a48625bff6ff" Nov 27 16:56:19 crc kubenswrapper[4954]: I1127 16:56:19.733122 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-955677c94-dzjch" event={"ID":"a09ff3fd-b10f-421c-a3a5-aa7dc4dcff95","Type":"ContainerStarted","Data":"0e48fc3f31657ad8c2b66be4983ecf1d71f382052d51076deb416bb10b5ce42a"} Nov 27 16:56:19 crc kubenswrapper[4954]: I1127 16:56:19.752470 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bqfwjb" event={"ID":"cbeef148-5a6f-4738-83f0-eae93d81bae3","Type":"ContainerStarted","Data":"861dd4931622c6b6676fa269064ad8cdb5c8209dfb12df361b661f7221eb484a"} Nov 27 16:56:19 crc kubenswrapper[4954]: I1127 16:56:19.755480 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-42dmk" event={"ID":"9366c02a-e022-47e4-86c2-35d1e9a54cf4","Type":"ContainerStarted","Data":"a524be98c948498d6cd47354817fb9312a2d5446c2da1d7e15f11983a820a352"} Nov 27 16:56:19 crc kubenswrapper[4954]: I1127 16:56:19.756425 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-42dmk" Nov 27 16:56:19 crc kubenswrapper[4954]: I1127 16:56:19.788017 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-42dmk" podStartSLOduration=3.192295875 podStartE2EDuration="29.787994281s" podCreationTimestamp="2025-11-27 16:55:50 +0000 UTC" firstStartedPulling="2025-11-27 16:55:52.217230649 +0000 UTC m=+1064.234670949" lastFinishedPulling="2025-11-27 16:56:18.812929055 +0000 UTC m=+1090.830369355" observedRunningTime="2025-11-27 16:56:19.783822179 +0000 UTC m=+1091.801262479" watchObservedRunningTime="2025-11-27 
16:56:19.787994281 +0000 UTC m=+1091.805434581" Nov 27 16:56:19 crc kubenswrapper[4954]: I1127 16:56:19.794269 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-wr8t4" event={"ID":"523e3a36-bc9e-4698-af7d-e7ecd3b7a740","Type":"ContainerStarted","Data":"747a23fcff22b1d507156c7473acb439edd9a9f9b576bc0bd7330e5ec9b81a62"} Nov 27 16:56:19 crc kubenswrapper[4954]: I1127 16:56:19.795605 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-wr8t4" Nov 27 16:56:19 crc kubenswrapper[4954]: I1127 16:56:19.828650 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-7dmz6" event={"ID":"146450d6-91cc-4600-9712-449fcf5328b2","Type":"ContainerStarted","Data":"2b68fd35d26924ef318eaf87dba80b416d17f6131d79272c74ecfbbb9d21954f"} Nov 27 16:56:19 crc kubenswrapper[4954]: I1127 16:56:19.828713 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-7dmz6" event={"ID":"146450d6-91cc-4600-9712-449fcf5328b2","Type":"ContainerStarted","Data":"061399a0252c40c04e9b2926006df4c53b61bd0d074b74e345882176f2f2cf4c"} Nov 27 16:56:19 crc kubenswrapper[4954]: I1127 16:56:19.830287 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-7dmz6" Nov 27 16:56:19 crc kubenswrapper[4954]: I1127 16:56:19.844106 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-wr8t4" podStartSLOduration=2.517651727 podStartE2EDuration="28.844080275s" podCreationTimestamp="2025-11-27 16:55:51 +0000 UTC" firstStartedPulling="2025-11-27 16:55:52.53180759 +0000 UTC m=+1064.549247890" lastFinishedPulling="2025-11-27 16:56:18.858236138 +0000 UTC m=+1090.875676438" observedRunningTime="2025-11-27 16:56:19.825546353 +0000 UTC m=+1091.842986653" watchObservedRunningTime="2025-11-27 16:56:19.844080275 +0000 UTC m=+1091.861520575" Nov 27 16:56:19 crc kubenswrapper[4954]: I1127 16:56:19.862943 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-zlt7m" event={"ID":"5884eab6-e3c0-45de-b93d-73392533b780","Type":"ContainerStarted","Data":"c65c324c9ad020309ff19d6838acd87ecd41e8fa6c1ff505358d55309fb77dd7"} Nov 27 16:56:19 crc kubenswrapper[4954]: I1127 16:56:19.863542 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-zlt7m" Nov 27 16:56:19 crc kubenswrapper[4954]: I1127 16:56:19.868912 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-7dmz6" podStartSLOduration=6.257922033 podStartE2EDuration="28.868893807s" podCreationTimestamp="2025-11-27 16:55:51 +0000 UTC" firstStartedPulling="2025-11-27 16:55:52.535677754 +0000 UTC m=+1064.553118054" lastFinishedPulling="2025-11-27 16:56:15.146649528 +0000 UTC m=+1087.164089828" observedRunningTime="2025-11-27 16:56:19.861919878 +0000 UTC m=+1091.879360178" watchObservedRunningTime="2025-11-27 16:56:19.868893807 +0000 UTC m=+1091.886334107" Nov 27 16:56:19 crc kubenswrapper[4954]: I1127 16:56:19.886358 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-bw2j9" event={"ID":"cc869191-7d3d-4192-bf48-a48625bff6ff","Type":"ContainerStarted","Data":"03b8d38825cf9b3f3ef9ea4ed67e9eb43f466f1d844e57843661446c964a47f7"} Nov 27 16:56:19 crc kubenswrapper[4954]: E1127 16:56:19.886960 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:25faa5b0e4801d4d3b01a28b877ed3188eee71f33ad66f3c2e86b7921758e711\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-bw2j9" podUID="cc869191-7d3d-4192-bf48-a48625bff6ff" Nov 27 16:56:19 crc kubenswrapper[4954]: I1127 16:56:19.889799 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-d77b94747-mmr72" event={"ID":"376db7a5-650f-4327-8e03-2f2be98969a0","Type":"ContainerStarted","Data":"c27f87c3c474553cd6de9e48215a48ee409077879f28284657371fd271c24b68"} Nov 27 16:56:19 crc kubenswrapper[4954]: I1127 16:56:19.890265 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-d77b94747-mmr72" Nov 27 16:56:19 crc kubenswrapper[4954]: I1127 16:56:19.892752 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-56897c768d-mln9c" event={"ID":"7dcd119b-9cb2-48ab-ac2f-2f0b10d5b2f0","Type":"ContainerStarted","Data":"936500f6cb02f9103f7f98d3266e88aff5bbf41641415059081b0a0277c87fc3"} Nov 27 16:56:19 crc kubenswrapper[4954]: I1127 16:56:19.893390 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-56897c768d-mln9c" Nov 27 16:56:19 crc kubenswrapper[4954]: I1127 16:56:19.898424 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-nv8bz" event={"ID":"73b53349-7e1d-499f-918e-e25598787e70","Type":"ContainerStarted","Data":"965e1adb8a59000dba14ef24610149689b007b4598340309b9eb5d17c1210cfb"} Nov 27 16:56:19 crc kubenswrapper[4954]: I1127 16:56:19.898488 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-nv8bz" Nov 27 16:56:19 crc kubenswrapper[4954]: I1127 16:56:19.897804 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-zlt7m" podStartSLOduration=3.07373516 podStartE2EDuration="29.8977847s" podCreationTimestamp="2025-11-27 16:55:50 +0000 UTC" firstStartedPulling="2025-11-27 16:55:52.075330828 +0000 UTC m=+1064.092771128" lastFinishedPulling="2025-11-27 16:56:18.899380368 +0000 UTC m=+1090.916820668" observedRunningTime="2025-11-27 16:56:19.896278054 +0000 UTC m=+1091.913718374" watchObservedRunningTime="2025-11-27 16:56:19.8977847 +0000 UTC m=+1091.915225000" Nov 27 16:56:19 crc kubenswrapper[4954]: I1127 16:56:19.910335 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-8ghg2" event={"ID":"7c8dd8cc-7be7-41f9-ac93-139dc9e83274","Type":"ContainerStarted","Data":"f33ab057cd0fdfaf69b4437d9ea3411c4ed8a12dc4deffd2c7f281e9e81b1ed2"} Nov 27 16:56:19 crc kubenswrapper[4954]: I1127 16:56:19.910487 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-8ghg2" Nov 27 16:56:19 crc kubenswrapper[4954]: I1127 16:56:19.914162 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-5vqr2" Nov 27 16:56:19 crc kubenswrapper[4954]: I1127 16:56:19.930980 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-56897c768d-mln9c" podStartSLOduration=3.648672243 podStartE2EDuration="29.930952127s" podCreationTimestamp="2025-11-27 16:55:50 +0000 UTC" firstStartedPulling="2025-11-27 16:55:52.43189447 +0000 UTC m=+1064.449334770" lastFinishedPulling="2025-11-27 16:56:18.714174344 +0000 UTC m=+1090.731614654" observedRunningTime="2025-11-27 16:56:19.926331944 +0000 UTC m=+1091.943772244" watchObservedRunningTime="2025-11-27 16:56:19.930952127 +0000 UTC m=+1091.948392427" Nov 27 16:56:19 crc kubenswrapper[4954]: I1127 16:56:19.998392 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-d77b94747-mmr72" podStartSLOduration=7.385296353 podStartE2EDuration="29.998367657s" podCreationTimestamp="2025-11-27 16:55:50 +0000 UTC" firstStartedPulling="2025-11-27 16:55:52.533270886 +0000 UTC m=+1064.550711176" lastFinishedPulling="2025-11-27 16:56:15.14634215 +0000 UTC m=+1087.163782480" observedRunningTime="2025-11-27 16:56:19.990406553 +0000 UTC m=+1092.007846853" watchObservedRunningTime="2025-11-27 16:56:19.998367657 +0000 UTC m=+1092.015807957" Nov 27 16:56:20 crc kubenswrapper[4954]: I1127 16:56:20.028647 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-5vqr2" podStartSLOduration=3.55971992 podStartE2EDuration="30.028625312s" podCreationTimestamp="2025-11-27 16:55:50 +0000 UTC" firstStartedPulling="2025-11-27 16:55:52.245937688 +0000 UTC m=+1064.263377988" lastFinishedPulling="2025-11-27 16:56:18.71484308 +0000 UTC m=+1090.732283380" observedRunningTime="2025-11-27 16:56:20.025938258 +0000 UTC m=+1092.043378558" watchObservedRunningTime="2025-11-27 16:56:20.028625312 +0000 UTC m=+1092.046065622" Nov 27 16:56:20 crc kubenswrapper[4954]: I1127 16:56:20.098069 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-8ghg2" podStartSLOduration=3.205776431 podStartE2EDuration="30.098044091s" podCreationTimestamp="2025-11-27 16:55:50 +0000 UTC" firstStartedPulling="2025-11-27 16:55:52.006070843 +0000 UTC m=+1064.023511143" lastFinishedPulling="2025-11-27 16:56:18.898338513 +0000 UTC m=+1090.915778803" observedRunningTime="2025-11-27 16:56:20.060892948 +0000 UTC m=+1092.078333248" watchObservedRunningTime="2025-11-27 16:56:20.098044091 +0000 UTC m=+1092.115484391" Nov 27 16:56:20 crc kubenswrapper[4954]: I1127 16:56:20.927165 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-9pwxb" event={"ID":"770db406-d44c-490f-8409-f5b3e8f66145","Type":"ContainerStarted","Data":"ca8fb686dee147aff18bb61d65a95898265887ac138a5c1e1067622d1fcaf79f"} Nov 27 16:56:20 crc kubenswrapper[4954]: I1127 16:56:20.927866 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-9pwxb" Nov 27 16:56:20 crc kubenswrapper[4954]: I1127 16:56:20.931001 4954 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-d77b94747-mmr72" event={"ID":"376db7a5-650f-4327-8e03-2f2be98969a0","Type":"ContainerStarted","Data":"4f58cb76fb0f48514919f5c3341bc2926fd925d6171f6f7a75b4fbeefbbb8a84"} Nov 27 16:56:20 crc kubenswrapper[4954]: I1127 16:56:20.933869 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-5vqr2" event={"ID":"ff3108ae-4629-448b-80d3-949e631c60d8","Type":"ContainerStarted","Data":"cfcfb8b3f15a555e5ad57a6b32a4fc1d6e63505247a48c690bbe9450ac2cb8a0"} Nov 27 16:56:20 crc kubenswrapper[4954]: I1127 16:56:20.936833 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-955677c94-dzjch" event={"ID":"a09ff3fd-b10f-421c-a3a5-aa7dc4dcff95","Type":"ContainerStarted","Data":"08d5dc6f3721e61009253ffa5322f2d3b676c3e23de0960636531034503f9cd7"} Nov 27 16:56:20 crc kubenswrapper[4954]: I1127 16:56:20.937068 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-955677c94-dzjch" Nov 27 16:56:20 crc kubenswrapper[4954]: I1127 16:56:20.939958 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bqfwjb" event={"ID":"cbeef148-5a6f-4738-83f0-eae93d81bae3","Type":"ContainerStarted","Data":"b35eb7f278c10149e6ab342ee5efaa8abf1f022e47d6d76acec231e67a64d85d"} Nov 27 16:56:20 crc kubenswrapper[4954]: I1127 16:56:20.940124 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bqfwjb" Nov 27 16:56:20 crc kubenswrapper[4954]: I1127 16:56:20.943811 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-nz28b" event={"ID":"50ec526e-d6db-45fa-8b99-bd795b4c3690","Type":"ContainerStarted","Data":"70bab9506b5d3cc080889dca7d1f5b5289dba853ee1b86c0904b43780c7ab0c1"} Nov 27 16:56:20 crc kubenswrapper[4954]: I1127 16:56:20.943943 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-nz28b" Nov 27 16:56:20 crc kubenswrapper[4954]: I1127 16:56:20.947106 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-4rg5t" event={"ID":"11ca1308-8c7a-4a3d-a283-2533abc54c25","Type":"ContainerStarted","Data":"70bc227b8af1bf398e2f85c1201d5c619fcaa9f4da4fbb83a0dbae1b9d93badb"} Nov 27 16:56:20 crc kubenswrapper[4954]: I1127 16:56:20.947247 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-4rg5t" Nov 27 16:56:20 crc kubenswrapper[4954]: I1127 16:56:20.950888 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-p55vw" event={"ID":"b4fb4c16-8870-494e-a075-ee70d251da46","Type":"ContainerStarted","Data":"e7a2ff148f8fc5d17ad59edb83c53979b24c970238689183cd9c8fcb6cc4d871"} Nov 27 16:56:20 crc kubenswrapper[4954]: I1127 16:56:20.951509 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-p55vw" Nov 27 16:56:20 crc kubenswrapper[4954]: I1127 16:56:20.955713 4954 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-9pwxb" podStartSLOduration=3.919641274 podStartE2EDuration="30.955695681s" podCreationTimestamp="2025-11-27 16:55:50 +0000 UTC" firstStartedPulling="2025-11-27 16:55:52.221592585 +0000 UTC m=+1064.239032885" lastFinishedPulling="2025-11-27 16:56:19.257646992 +0000 UTC m=+1091.275087292" observedRunningTime="2025-11-27 16:56:20.954337987 +0000 UTC m=+1092.971778317" watchObservedRunningTime="2025-11-27 16:56:20.955695681 +0000 UTC m=+1092.973135991" Nov 27 16:56:20 crc kubenswrapper[4954]: I1127 16:56:20.957814 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-p55vw" Nov 27 16:56:20 crc kubenswrapper[4954]: I1127 16:56:20.958097 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-4g8kb" event={"ID":"56f35029-dbcb-437a-94ed-3eac63c5145c","Type":"ContainerStarted","Data":"69af64c5476b25cc097178db55fef40349d953ffacb01bfef8f2a9209d8046da"} Nov 27 16:56:20 crc kubenswrapper[4954]: I1127 16:56:20.958704 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-4g8kb" Nov 27 16:56:20 crc kubenswrapper[4954]: I1127 16:56:20.961368 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-wg8x7" event={"ID":"6a00b9f9-d61f-411d-897d-496d8c8b3501","Type":"ContainerStarted","Data":"e7b4a836c86b595cee3042acc07c4466e6cb3631258f4f74981c03de590974ac"} Nov 27 16:56:20 crc kubenswrapper[4954]: I1127 16:56:20.961638 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-wg8x7" Nov 27 16:56:20 crc kubenswrapper[4954]: I1127 16:56:20.965335 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-2jpwm" event={"ID":"6dbcc715-b375-4776-87ff-4c5ecad80975","Type":"ContainerStarted","Data":"d6eee92bf461902c899d35de9567ed13a12b27f51c4f94956effc182acdea47c"} Nov 27 16:56:20 crc kubenswrapper[4954]: I1127 16:56:20.965570 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-2jpwm" Nov 27 16:56:20 crc kubenswrapper[4954]: I1127 16:56:20.968733 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-nv8bz" podStartSLOduration=4.723270209 podStartE2EDuration="30.968714317s" podCreationTimestamp="2025-11-27 16:55:50 +0000 UTC" firstStartedPulling="2025-11-27 16:55:52.538528174 +0000 UTC m=+1064.555968474" lastFinishedPulling="2025-11-27 16:56:18.783972282 +0000 UTC m=+1090.801412582" observedRunningTime="2025-11-27 16:56:20.097452867 +0000 UTC m=+1092.114893167" watchObservedRunningTime="2025-11-27 16:56:20.968714317 +0000 UTC m=+1092.986154627" Nov 27 16:56:20 crc kubenswrapper[4954]: I1127 16:56:20.971929 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-nnj6l" event={"ID":"c7c7b69c-1d63-4d4b-ac0b-ad2be204cf8a","Type":"ContainerStarted","Data":"fb094efb3534db6f8c3f4941fd8678bb0d352d5ac44acee779055b967c1e54e0"} Nov 27 16:56:20 crc kubenswrapper[4954]: I1127 16:56:20.971995 4954 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-nnj6l" Nov 27 16:56:20 crc kubenswrapper[4954]: I1127 16:56:20.973371 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-nnj6l" Nov 27 16:56:20 crc kubenswrapper[4954]: I1127 16:56:20.973658 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-42dmk" Nov 27 16:56:20 crc kubenswrapper[4954]: I1127 16:56:20.981279 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-zlt7m" Nov 27 16:56:20 crc kubenswrapper[4954]: E1127 16:56:20.981302 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:25faa5b0e4801d4d3b01a28b877ed3188eee71f33ad66f3c2e86b7921758e711\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-bw2j9" podUID="cc869191-7d3d-4192-bf48-a48625bff6ff" Nov 27 16:56:20 crc kubenswrapper[4954]: I1127 16:56:20.984239 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-56897c768d-mln9c" Nov 27 16:56:20 crc kubenswrapper[4954]: I1127 16:56:20.988129 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-4rg5t" podStartSLOduration=3.451437336 podStartE2EDuration="30.988095268s" podCreationTimestamp="2025-11-27 16:55:50 +0000 UTC" firstStartedPulling="2025-11-27 16:55:51.809570694 +0000 UTC m=+1063.827010994" lastFinishedPulling="2025-11-27 16:56:19.346228626 +0000 UTC m=+1091.363668926" observedRunningTime="2025-11-27 16:56:20.981731543 +0000 UTC m=+1092.999171853" watchObservedRunningTime="2025-11-27 16:56:20.988095268 +0000 UTC m=+1093.005535598" Nov 27 16:56:21 crc kubenswrapper[4954]: I1127 16:56:21.047938 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-8ghg2" Nov 27 16:56:21 crc kubenswrapper[4954]: I1127 16:56:21.051469 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-955677c94-dzjch" podStartSLOduration=8.451314219 podStartE2EDuration="31.051445789s" podCreationTimestamp="2025-11-27 16:55:50 +0000 UTC" firstStartedPulling="2025-11-27 16:55:52.713008467 +0000 UTC m=+1064.730448767" lastFinishedPulling="2025-11-27 16:56:15.313140037 +0000 UTC m=+1087.330580337" observedRunningTime="2025-11-27 16:56:21.011663462 +0000 UTC m=+1093.029103772" watchObservedRunningTime="2025-11-27 16:56:21.051445789 +0000 UTC m=+1093.068886239" Nov 27 16:56:21 crc kubenswrapper[4954]: I1127 16:56:21.062658 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-nz28b" podStartSLOduration=4.103732681 podStartE2EDuration="31.062638531s" podCreationTimestamp="2025-11-27 16:55:50 +0000 UTC" firstStartedPulling="2025-11-27 16:55:51.969826472 +0000 UTC m=+1063.987266772" lastFinishedPulling="2025-11-27 16:56:18.928732322 +0000 UTC m=+1090.946172622" observedRunningTime="2025-11-27 16:56:21.040016891 +0000 UTC 
m=+1093.057457211" watchObservedRunningTime="2025-11-27 16:56:21.062638531 +0000 UTC m=+1093.080078831" Nov 27 16:56:21 crc kubenswrapper[4954]: I1127 16:56:21.074947 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bqfwjb" podStartSLOduration=27.369325346 podStartE2EDuration="31.07492139s" podCreationTimestamp="2025-11-27 16:55:50 +0000 UTC" firstStartedPulling="2025-11-27 16:56:14.978522479 +0000 UTC m=+1086.995962779" lastFinishedPulling="2025-11-27 16:56:18.684118523 +0000 UTC m=+1090.701558823" observedRunningTime="2025-11-27 16:56:21.070741188 +0000 UTC m=+1093.088181488" watchObservedRunningTime="2025-11-27 16:56:21.07492139 +0000 UTC m=+1093.092361690" Nov 27 16:56:21 crc kubenswrapper[4954]: I1127 16:56:21.100285 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-p55vw" podStartSLOduration=4.265409732 podStartE2EDuration="31.100270416s" podCreationTimestamp="2025-11-27 16:55:50 +0000 UTC" firstStartedPulling="2025-11-27 16:55:52.420926193 +0000 UTC m=+1064.438366513" lastFinishedPulling="2025-11-27 16:56:19.255786897 +0000 UTC m=+1091.273227197" observedRunningTime="2025-11-27 16:56:21.099859677 +0000 UTC m=+1093.117299987" watchObservedRunningTime="2025-11-27 16:56:21.100270416 +0000 UTC m=+1093.117710716" Nov 27 16:56:21 crc kubenswrapper[4954]: I1127 16:56:21.137802 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-nnj6l" podStartSLOduration=4.173454867 podStartE2EDuration="31.137771589s" podCreationTimestamp="2025-11-27 16:55:50 +0000 UTC" firstStartedPulling="2025-11-27 16:55:52.005744275 +0000 UTC m=+1064.023184575" lastFinishedPulling="2025-11-27 16:56:18.970060987 +0000 UTC m=+1090.987501297" observedRunningTime="2025-11-27 16:56:21.136092567 +0000 UTC m=+1093.153532867" watchObservedRunningTime="2025-11-27 16:56:21.137771589 +0000 UTC m=+1093.155211889" Nov 27 16:56:21 crc kubenswrapper[4954]: I1127 16:56:21.257549 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-4g8kb" podStartSLOduration=4.810931231 podStartE2EDuration="31.257531501s" podCreationTimestamp="2025-11-27 16:55:50 +0000 UTC" firstStartedPulling="2025-11-27 16:55:52.374474044 +0000 UTC m=+1064.391914344" lastFinishedPulling="2025-11-27 16:56:18.821074314 +0000 UTC m=+1090.838514614" observedRunningTime="2025-11-27 16:56:21.231738614 +0000 UTC m=+1093.249178914" watchObservedRunningTime="2025-11-27 16:56:21.257531501 +0000 UTC m=+1093.274971801" Nov 27 16:56:21 crc kubenswrapper[4954]: I1127 16:56:21.279731 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-wg8x7" podStartSLOduration=3.994034794 podStartE2EDuration="30.279717541s" podCreationTimestamp="2025-11-27 16:55:51 +0000 UTC" firstStartedPulling="2025-11-27 16:55:52.536091174 +0000 UTC m=+1064.553531474" lastFinishedPulling="2025-11-27 16:56:18.821773921 +0000 UTC m=+1090.839214221" observedRunningTime="2025-11-27 16:56:21.275746894 +0000 UTC m=+1093.293187194" watchObservedRunningTime="2025-11-27 16:56:21.279717541 +0000 UTC m=+1093.297157831" Nov 27 16:56:21 crc kubenswrapper[4954]: I1127 16:56:21.295988 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-2jpwm" podStartSLOduration=4.058421049 podStartE2EDuration="31.295972216s" podCreationTimestamp="2025-11-27 16:55:50 +0000 UTC" firstStartedPulling="2025-11-27 16:55:52.22672133 +0000 UTC m=+1064.244161630" lastFinishedPulling="2025-11-27 16:56:19.464272497 +0000 UTC m=+1091.481712797" observedRunningTime="2025-11-27 16:56:21.294774937 +0000 UTC m=+1093.312215237" watchObservedRunningTime="2025-11-27 16:56:21.295972216 +0000 UTC m=+1093.313412516" Nov 27 16:56:21 crc kubenswrapper[4954]: I1127 16:56:21.370916 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-2jpwm" Nov 27 16:56:21 crc kubenswrapper[4954]: I1127 16:56:21.387155 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-5vqr2" Nov 27 16:56:21 crc kubenswrapper[4954]: I1127 16:56:21.414204 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-4g8kb" Nov 27 16:56:21 crc kubenswrapper[4954]: I1127 16:56:21.459134 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-9pwxb" Nov 27 16:56:21 crc kubenswrapper[4954]: I1127 16:56:21.728041 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-wr8t4" Nov 27 16:56:21 crc kubenswrapper[4954]: I1127 16:56:21.982166 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-nz28b" Nov 27 16:56:21 crc kubenswrapper[4954]: I1127 16:56:21.984816 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-4rg5t" Nov 27 16:56:22 crc kubenswrapper[4954]: I1127 16:56:22.677678 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/736ef0f4-e471-4acd-8569-2a6d6d260f67-cert\") pod \"infra-operator-controller-manager-57548d458d-4vpsc\" (UID: \"736ef0f4-e471-4acd-8569-2a6d6d260f67\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-4vpsc" Nov 27 16:56:22 crc kubenswrapper[4954]: I1127 16:56:22.691995 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/736ef0f4-e471-4acd-8569-2a6d6d260f67-cert\") pod \"infra-operator-controller-manager-57548d458d-4vpsc\" (UID: \"736ef0f4-e471-4acd-8569-2a6d6d260f67\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-4vpsc" Nov 27 16:56:22 crc kubenswrapper[4954]: I1127 16:56:22.908760 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-5bxkf" Nov 27 16:56:22 crc kubenswrapper[4954]: I1127 16:56:22.915517 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-4vpsc" Nov 27 16:56:23 crc kubenswrapper[4954]: I1127 16:56:23.391960 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7eefae7c-fef6-47b3-8f89-4856b6ae1980-metrics-certs\") pod \"openstack-operator-controller-manager-556d4f4767-6wqxx\" (UID: \"7eefae7c-fef6-47b3-8f89-4856b6ae1980\") " pod="openstack-operators/openstack-operator-controller-manager-556d4f4767-6wqxx" Nov 27 16:56:23 crc kubenswrapper[4954]: I1127 16:56:23.409123 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7eefae7c-fef6-47b3-8f89-4856b6ae1980-metrics-certs\") pod \"openstack-operator-controller-manager-556d4f4767-6wqxx\" (UID: \"7eefae7c-fef6-47b3-8f89-4856b6ae1980\") " pod="openstack-operators/openstack-operator-controller-manager-556d4f4767-6wqxx" Nov 27 16:56:23 crc kubenswrapper[4954]: I1127 16:56:23.476378 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-4vpsc"] Nov 27 16:56:23 crc kubenswrapper[4954]: I1127 16:56:23.668201 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-gxwk2" Nov 27 16:56:23 crc kubenswrapper[4954]: I1127 16:56:23.676980 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-556d4f4767-6wqxx" Nov 27 16:56:23 crc kubenswrapper[4954]: I1127 16:56:23.994450 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-556d4f4767-6wqxx"] Nov 27 16:56:24 crc kubenswrapper[4954]: I1127 16:56:24.008783 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-4vpsc" event={"ID":"736ef0f4-e471-4acd-8569-2a6d6d260f67","Type":"ContainerStarted","Data":"c2589de2f5f058f5dc2c7b9c1acbe902b8665453ee38f2ef51afd0f0b0fc75ed"} Nov 27 16:56:25 crc kubenswrapper[4954]: I1127 16:56:25.025653 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-556d4f4767-6wqxx" event={"ID":"7eefae7c-fef6-47b3-8f89-4856b6ae1980","Type":"ContainerStarted","Data":"d9d09ef1b23bd40b31feb31a356554d7c55fb1586c9ed5906f22ad06bd4fd036"} Nov 27 16:56:25 crc kubenswrapper[4954]: I1127 16:56:25.026139 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-556d4f4767-6wqxx" event={"ID":"7eefae7c-fef6-47b3-8f89-4856b6ae1980","Type":"ContainerStarted","Data":"24ff4208393e91afea097b5cd983951cae17342f64ba106c76c05b6a8d83c521"} Nov 27 16:56:25 crc kubenswrapper[4954]: I1127 16:56:25.026192 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-556d4f4767-6wqxx" Nov 27 16:56:25 crc kubenswrapper[4954]: I1127 16:56:25.080366 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-556d4f4767-6wqxx" podStartSLOduration=34.080339265 podStartE2EDuration="34.080339265s" podCreationTimestamp="2025-11-27 16:55:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 
16:56:25.069288968 +0000 UTC m=+1097.086729288" watchObservedRunningTime="2025-11-27 16:56:25.080339265 +0000 UTC m=+1097.097779575" Nov 27 16:56:26 crc kubenswrapper[4954]: I1127 16:56:26.035732 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xvk89" event={"ID":"8fad5f5d-c6a2-497f-8524-1ae501d6a444","Type":"ContainerStarted","Data":"94358c2e68f6161748fbe0d5a1c7b5bb99b33011255fda26eef4d18732d7831e"} Nov 27 16:56:26 crc kubenswrapper[4954]: I1127 16:56:26.039729 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-4vpsc" event={"ID":"736ef0f4-e471-4acd-8569-2a6d6d260f67","Type":"ContainerStarted","Data":"899db4c14e26f7455cdd11ca8f60a8ef2bcdbcbde0a059fa094caf23153f7ea0"} Nov 27 16:56:26 crc kubenswrapper[4954]: I1127 16:56:26.039790 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-4vpsc" event={"ID":"736ef0f4-e471-4acd-8569-2a6d6d260f67","Type":"ContainerStarted","Data":"7e87bc7b3683ab40667dd3d331e18cb9eee5ab51aced95b8fc36bedb5c19f81c"} Nov 27 16:56:26 crc kubenswrapper[4954]: I1127 16:56:26.039869 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-4vpsc" Nov 27 16:56:26 crc kubenswrapper[4954]: I1127 16:56:26.070954 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xvk89" podStartSLOduration=2.293051104 podStartE2EDuration="35.070931968s" podCreationTimestamp="2025-11-27 16:55:51 +0000 UTC" firstStartedPulling="2025-11-27 16:55:52.660748616 +0000 UTC m=+1064.678188916" lastFinishedPulling="2025-11-27 16:56:25.43862948 +0000 UTC m=+1097.456069780" observedRunningTime="2025-11-27 16:56:26.063379025 +0000 UTC m=+1098.080819325" watchObservedRunningTime="2025-11-27 16:56:26.070931968 +0000 UTC m=+1098.088372268" Nov 27 16:56:26 crc kubenswrapper[4954]: I1127 16:56:26.091401 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-4vpsc" podStartSLOduration=34.142500307 podStartE2EDuration="36.091376006s" podCreationTimestamp="2025-11-27 16:55:50 +0000 UTC" firstStartedPulling="2025-11-27 16:56:23.490347866 +0000 UTC m=+1095.507788186" lastFinishedPulling="2025-11-27 16:56:25.439223585 +0000 UTC m=+1097.456663885" observedRunningTime="2025-11-27 16:56:26.088113526 +0000 UTC m=+1098.105553836" watchObservedRunningTime="2025-11-27 16:56:26.091376006 +0000 UTC m=+1098.108816306" Nov 27 16:56:27 crc kubenswrapper[4954]: I1127 16:56:27.247791 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bqfwjb" Nov 27 16:56:31 crc kubenswrapper[4954]: I1127 16:56:31.675192 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-nv8bz" Nov 27 16:56:31 crc kubenswrapper[4954]: I1127 16:56:31.711935 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-d77b94747-mmr72" Nov 27 16:56:31 crc kubenswrapper[4954]: I1127 16:56:31.761737 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-wg8x7" Nov 27 16:56:31 crc kubenswrapper[4954]: I1127 16:56:31.770307 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-7dmz6" Nov 27 16:56:31 crc kubenswrapper[4954]: I1127 16:56:31.990364 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-955677c94-dzjch" Nov 27 16:56:32 crc kubenswrapper[4954]: I1127 16:56:32.931737 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-4vpsc" Nov 27 16:56:33 crc kubenswrapper[4954]: I1127 16:56:33.689697 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-556d4f4767-6wqxx" Nov 27 16:56:54 crc kubenswrapper[4954]: I1127 16:56:54.324997 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-bw2j9" event={"ID":"cc869191-7d3d-4192-bf48-a48625bff6ff","Type":"ContainerStarted","Data":"956bb45767b432530a640edbeff46381d55fc2ba255aed6bed7790cd9043bbf8"} Nov 27 16:56:56 crc kubenswrapper[4954]: I1127 16:56:56.348202 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-bw2j9" Nov 27 16:56:56 crc kubenswrapper[4954]: I1127 16:56:56.404032 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-bw2j9" podStartSLOduration=5.484881189 podStartE2EDuration="1m6.403998823s" podCreationTimestamp="2025-11-27 16:55:50 +0000 UTC" firstStartedPulling="2025-11-27 16:55:52.232320356 +0000 UTC m=+1064.249760646" lastFinishedPulling="2025-11-27 16:56:53.15143797 +0000 UTC m=+1125.168878280" observedRunningTime="2025-11-27 16:56:56.369284039 +0000 UTC m=+1128.386724339" watchObservedRunningTime="2025-11-27 16:56:56.403998823 +0000 UTC m=+1128.421439133" Nov 27 16:57:01 crc kubenswrapper[4954]: I1127 16:57:01.339572 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-bw2j9" Nov 27 16:57:17 crc kubenswrapper[4954]: I1127 16:57:17.952329 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bspkf"] Nov 27 16:57:17 crc kubenswrapper[4954]: I1127 16:57:17.956988 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-bspkf" Nov 27 16:57:17 crc kubenswrapper[4954]: I1127 16:57:17.959987 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Nov 27 16:57:17 crc kubenswrapper[4954]: I1127 16:57:17.960082 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Nov 27 16:57:17 crc kubenswrapper[4954]: I1127 16:57:17.960112 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-lqx99" Nov 27 16:57:17 crc kubenswrapper[4954]: I1127 16:57:17.960153 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Nov 27 16:57:17 crc kubenswrapper[4954]: I1127 16:57:17.971378 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bspkf"] Nov 27 16:57:18 crc kubenswrapper[4954]: I1127 16:57:18.006616 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a2adbaa-5cd1-4563-a5d2-25fe130e90a2-config\") pod \"dnsmasq-dns-675f4bcbfc-bspkf\" (UID: \"8a2adbaa-5cd1-4563-a5d2-25fe130e90a2\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bspkf" Nov 27 16:57:18 crc kubenswrapper[4954]: I1127 16:57:18.006731 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngxm9\" (UniqueName: \"kubernetes.io/projected/8a2adbaa-5cd1-4563-a5d2-25fe130e90a2-kube-api-access-ngxm9\") pod \"dnsmasq-dns-675f4bcbfc-bspkf\" (UID: \"8a2adbaa-5cd1-4563-a5d2-25fe130e90a2\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bspkf" Nov 27 16:57:18 crc kubenswrapper[4954]: I1127 16:57:18.037064 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-9lr8q"] Nov 27 16:57:18 crc kubenswrapper[4954]: I1127 16:57:18.038858 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-9lr8q" Nov 27 16:57:18 crc kubenswrapper[4954]: I1127 16:57:18.056274 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Nov 27 16:57:18 crc kubenswrapper[4954]: I1127 16:57:18.057446 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-9lr8q"] Nov 27 16:57:18 crc kubenswrapper[4954]: I1127 16:57:18.108002 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngxm9\" (UniqueName: \"kubernetes.io/projected/8a2adbaa-5cd1-4563-a5d2-25fe130e90a2-kube-api-access-ngxm9\") pod \"dnsmasq-dns-675f4bcbfc-bspkf\" (UID: \"8a2adbaa-5cd1-4563-a5d2-25fe130e90a2\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bspkf" Nov 27 16:57:18 crc kubenswrapper[4954]: I1127 16:57:18.108078 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3bab8ba-ab5f-42d5-86cc-0c89c8b74c6d-config\") pod \"dnsmasq-dns-78dd6ddcc-9lr8q\" (UID: \"e3bab8ba-ab5f-42d5-86cc-0c89c8b74c6d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9lr8q" Nov 27 16:57:18 crc kubenswrapper[4954]: I1127 16:57:18.108136 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3bab8ba-ab5f-42d5-86cc-0c89c8b74c6d-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-9lr8q\" (UID: \"e3bab8ba-ab5f-42d5-86cc-0c89c8b74c6d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9lr8q" Nov 27 16:57:18 crc kubenswrapper[4954]: I1127 16:57:18.108157 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4z4l\" (UniqueName: \"kubernetes.io/projected/e3bab8ba-ab5f-42d5-86cc-0c89c8b74c6d-kube-api-access-d4z4l\") pod \"dnsmasq-dns-78dd6ddcc-9lr8q\" (UID: \"e3bab8ba-ab5f-42d5-86cc-0c89c8b74c6d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9lr8q" Nov 27 16:57:18 crc kubenswrapper[4954]: I1127 16:57:18.108216 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a2adbaa-5cd1-4563-a5d2-25fe130e90a2-config\") pod \"dnsmasq-dns-675f4bcbfc-bspkf\" (UID: \"8a2adbaa-5cd1-4563-a5d2-25fe130e90a2\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bspkf" Nov 27 16:57:18 crc kubenswrapper[4954]: I1127 16:57:18.109294 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a2adbaa-5cd1-4563-a5d2-25fe130e90a2-config\") pod \"dnsmasq-dns-675f4bcbfc-bspkf\" (UID: \"8a2adbaa-5cd1-4563-a5d2-25fe130e90a2\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bspkf" Nov 27 16:57:18 crc kubenswrapper[4954]: I1127 16:57:18.151612 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngxm9\" (UniqueName: \"kubernetes.io/projected/8a2adbaa-5cd1-4563-a5d2-25fe130e90a2-kube-api-access-ngxm9\") pod \"dnsmasq-dns-675f4bcbfc-bspkf\" (UID: \"8a2adbaa-5cd1-4563-a5d2-25fe130e90a2\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bspkf" Nov 27 16:57:18 crc kubenswrapper[4954]: I1127 16:57:18.209587 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3bab8ba-ab5f-42d5-86cc-0c89c8b74c6d-config\") pod \"dnsmasq-dns-78dd6ddcc-9lr8q\" (UID: \"e3bab8ba-ab5f-42d5-86cc-0c89c8b74c6d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9lr8q" Nov 27 16:57:18 crc kubenswrapper[4954]: I1127 
16:57:18.209690 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3bab8ba-ab5f-42d5-86cc-0c89c8b74c6d-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-9lr8q\" (UID: \"e3bab8ba-ab5f-42d5-86cc-0c89c8b74c6d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9lr8q"
Nov 27 16:57:18 crc kubenswrapper[4954]: I1127 16:57:18.209727 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4z4l\" (UniqueName: \"kubernetes.io/projected/e3bab8ba-ab5f-42d5-86cc-0c89c8b74c6d-kube-api-access-d4z4l\") pod \"dnsmasq-dns-78dd6ddcc-9lr8q\" (UID: \"e3bab8ba-ab5f-42d5-86cc-0c89c8b74c6d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9lr8q"
Nov 27 16:57:18 crc kubenswrapper[4954]: I1127 16:57:18.210510 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3bab8ba-ab5f-42d5-86cc-0c89c8b74c6d-config\") pod \"dnsmasq-dns-78dd6ddcc-9lr8q\" (UID: \"e3bab8ba-ab5f-42d5-86cc-0c89c8b74c6d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9lr8q"
Nov 27 16:57:18 crc kubenswrapper[4954]: I1127 16:57:18.210851 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3bab8ba-ab5f-42d5-86cc-0c89c8b74c6d-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-9lr8q\" (UID: \"e3bab8ba-ab5f-42d5-86cc-0c89c8b74c6d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9lr8q"
Nov 27 16:57:18 crc kubenswrapper[4954]: I1127 16:57:18.237857 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4z4l\" (UniqueName: \"kubernetes.io/projected/e3bab8ba-ab5f-42d5-86cc-0c89c8b74c6d-kube-api-access-d4z4l\") pod \"dnsmasq-dns-78dd6ddcc-9lr8q\" (UID: \"e3bab8ba-ab5f-42d5-86cc-0c89c8b74c6d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9lr8q"
Nov 27 16:57:18 crc kubenswrapper[4954]: I1127 16:57:18.274194 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-bspkf"
Nov 27 16:57:18 crc kubenswrapper[4954]: I1127 16:57:18.356543 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-9lr8q"
Nov 27 16:57:18 crc kubenswrapper[4954]: I1127 16:57:18.819204 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bspkf"]
Nov 27 16:57:18 crc kubenswrapper[4954]: I1127 16:57:18.827643 4954 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Nov 27 16:57:18 crc kubenswrapper[4954]: I1127 16:57:18.900896 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-9lr8q"]
Nov 27 16:57:19 crc kubenswrapper[4954]: I1127 16:57:19.613378 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-9lr8q" event={"ID":"e3bab8ba-ab5f-42d5-86cc-0c89c8b74c6d","Type":"ContainerStarted","Data":"0b12164c42c89ec06e54f46382f37650c8924e845508a5544593ff255e364155"}
Nov 27 16:57:19 crc kubenswrapper[4954]: I1127 16:57:19.616623 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-bspkf" event={"ID":"8a2adbaa-5cd1-4563-a5d2-25fe130e90a2","Type":"ContainerStarted","Data":"a372427ed285e86879adcaa4cd7cf5c65a4513a483fd99c4b29a7576e0cc9162"}
Nov 27 16:57:21 crc kubenswrapper[4954]: I1127 16:57:21.053607 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bspkf"]
Nov 27 16:57:21 crc kubenswrapper[4954]: I1127 16:57:21.094030 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-z47sq"]
Nov 27 16:57:21 crc kubenswrapper[4954]: I1127 16:57:21.095631 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-z47sq"
Nov 27 16:57:21 crc kubenswrapper[4954]: I1127 16:57:21.116742 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-z47sq"]
Nov 27 16:57:21 crc kubenswrapper[4954]: I1127 16:57:21.155926 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c06d93e5-61ab-4c83-8371-6f0bb226349f-dns-svc\") pod \"dnsmasq-dns-666b6646f7-z47sq\" (UID: \"c06d93e5-61ab-4c83-8371-6f0bb226349f\") " pod="openstack/dnsmasq-dns-666b6646f7-z47sq"
Nov 27 16:57:21 crc kubenswrapper[4954]: I1127 16:57:21.155990 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm46q\" (UniqueName: \"kubernetes.io/projected/c06d93e5-61ab-4c83-8371-6f0bb226349f-kube-api-access-pm46q\") pod \"dnsmasq-dns-666b6646f7-z47sq\" (UID: \"c06d93e5-61ab-4c83-8371-6f0bb226349f\") " pod="openstack/dnsmasq-dns-666b6646f7-z47sq"
Nov 27 16:57:21 crc kubenswrapper[4954]: I1127 16:57:21.156047 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c06d93e5-61ab-4c83-8371-6f0bb226349f-config\") pod \"dnsmasq-dns-666b6646f7-z47sq\" (UID: \"c06d93e5-61ab-4c83-8371-6f0bb226349f\") " pod="openstack/dnsmasq-dns-666b6646f7-z47sq"
Nov 27 16:57:21 crc kubenswrapper[4954]: I1127 16:57:21.267346 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c06d93e5-61ab-4c83-8371-6f0bb226349f-dns-svc\") pod \"dnsmasq-dns-666b6646f7-z47sq\" (UID: \"c06d93e5-61ab-4c83-8371-6f0bb226349f\") " pod="openstack/dnsmasq-dns-666b6646f7-z47sq"
Nov 27 16:57:21 crc kubenswrapper[4954]: I1127 16:57:21.267391 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm46q\" (UniqueName: \"kubernetes.io/projected/c06d93e5-61ab-4c83-8371-6f0bb226349f-kube-api-access-pm46q\") pod \"dnsmasq-dns-666b6646f7-z47sq\" (UID: \"c06d93e5-61ab-4c83-8371-6f0bb226349f\") " pod="openstack/dnsmasq-dns-666b6646f7-z47sq"
Nov 27 16:57:21 crc kubenswrapper[4954]: I1127 16:57:21.267428 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c06d93e5-61ab-4c83-8371-6f0bb226349f-config\") pod \"dnsmasq-dns-666b6646f7-z47sq\" (UID: \"c06d93e5-61ab-4c83-8371-6f0bb226349f\") " pod="openstack/dnsmasq-dns-666b6646f7-z47sq"
Nov 27 16:57:21 crc kubenswrapper[4954]: I1127 16:57:21.268459 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c06d93e5-61ab-4c83-8371-6f0bb226349f-config\") pod \"dnsmasq-dns-666b6646f7-z47sq\" (UID: \"c06d93e5-61ab-4c83-8371-6f0bb226349f\") " pod="openstack/dnsmasq-dns-666b6646f7-z47sq"
Nov 27 16:57:21 crc kubenswrapper[4954]: I1127 16:57:21.268892 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c06d93e5-61ab-4c83-8371-6f0bb226349f-dns-svc\") pod \"dnsmasq-dns-666b6646f7-z47sq\" (UID: \"c06d93e5-61ab-4c83-8371-6f0bb226349f\") " pod="openstack/dnsmasq-dns-666b6646f7-z47sq"
Nov 27 16:57:21 crc kubenswrapper[4954]: I1127 16:57:21.322997 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm46q\" (UniqueName: \"kubernetes.io/projected/c06d93e5-61ab-4c83-8371-6f0bb226349f-kube-api-access-pm46q\") pod \"dnsmasq-dns-666b6646f7-z47sq\" (UID: \"c06d93e5-61ab-4c83-8371-6f0bb226349f\") " pod="openstack/dnsmasq-dns-666b6646f7-z47sq"
Nov 27 16:57:21 crc kubenswrapper[4954]: I1127 16:57:21.393435 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-9lr8q"]
Nov 27 16:57:21 crc kubenswrapper[4954]: I1127 16:57:21.438646 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-nnk5z"]
Nov 27 16:57:21 crc kubenswrapper[4954]: I1127 16:57:21.440540 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-nnk5z"
Nov 27 16:57:21 crc kubenswrapper[4954]: I1127 16:57:21.441044 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-z47sq"
Nov 27 16:57:21 crc kubenswrapper[4954]: I1127 16:57:21.450606 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-nnk5z"]
Nov 27 16:57:21 crc kubenswrapper[4954]: I1127 16:57:21.475956 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/644e4b3e-4237-4179-8775-63cde7f94338-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-nnk5z\" (UID: \"644e4b3e-4237-4179-8775-63cde7f94338\") " pod="openstack/dnsmasq-dns-57d769cc4f-nnk5z"
Nov 27 16:57:21 crc kubenswrapper[4954]: I1127 16:57:21.476162 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/644e4b3e-4237-4179-8775-63cde7f94338-config\") pod \"dnsmasq-dns-57d769cc4f-nnk5z\" (UID: \"644e4b3e-4237-4179-8775-63cde7f94338\") " pod="openstack/dnsmasq-dns-57d769cc4f-nnk5z"
Nov 27 16:57:21 crc kubenswrapper[4954]: I1127 16:57:21.476268 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkblm\" (UniqueName: \"kubernetes.io/projected/644e4b3e-4237-4179-8775-63cde7f94338-kube-api-access-rkblm\") pod \"dnsmasq-dns-57d769cc4f-nnk5z\" (UID: \"644e4b3e-4237-4179-8775-63cde7f94338\") " pod="openstack/dnsmasq-dns-57d769cc4f-nnk5z"
Nov 27 16:57:21 crc kubenswrapper[4954]: I1127 16:57:21.578134 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/644e4b3e-4237-4179-8775-63cde7f94338-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-nnk5z\" (UID: \"644e4b3e-4237-4179-8775-63cde7f94338\") " pod="openstack/dnsmasq-dns-57d769cc4f-nnk5z"
Nov 27 16:57:21 crc kubenswrapper[4954]: I1127 16:57:21.578263 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/644e4b3e-4237-4179-8775-63cde7f94338-config\") pod \"dnsmasq-dns-57d769cc4f-nnk5z\" (UID: \"644e4b3e-4237-4179-8775-63cde7f94338\") " pod="openstack/dnsmasq-dns-57d769cc4f-nnk5z"
Nov 27 16:57:21 crc kubenswrapper[4954]: I1127 16:57:21.578325 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkblm\" (UniqueName: \"kubernetes.io/projected/644e4b3e-4237-4179-8775-63cde7f94338-kube-api-access-rkblm\") pod \"dnsmasq-dns-57d769cc4f-nnk5z\" (UID: \"644e4b3e-4237-4179-8775-63cde7f94338\") " pod="openstack/dnsmasq-dns-57d769cc4f-nnk5z"
Nov 27 16:57:21 crc kubenswrapper[4954]: I1127 16:57:21.579158 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/644e4b3e-4237-4179-8775-63cde7f94338-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-nnk5z\" (UID: \"644e4b3e-4237-4179-8775-63cde7f94338\") " pod="openstack/dnsmasq-dns-57d769cc4f-nnk5z"
Nov 27 16:57:21 crc kubenswrapper[4954]: I1127 16:57:21.579421 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/644e4b3e-4237-4179-8775-63cde7f94338-config\") pod \"dnsmasq-dns-57d769cc4f-nnk5z\" (UID: \"644e4b3e-4237-4179-8775-63cde7f94338\") " pod="openstack/dnsmasq-dns-57d769cc4f-nnk5z"
Nov 27 16:57:21 crc kubenswrapper[4954]: I1127 16:57:21.600896 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkblm\" (UniqueName: \"kubernetes.io/projected/644e4b3e-4237-4179-8775-63cde7f94338-kube-api-access-rkblm\") pod \"dnsmasq-dns-57d769cc4f-nnk5z\" (UID: \"644e4b3e-4237-4179-8775-63cde7f94338\") " pod="openstack/dnsmasq-dns-57d769cc4f-nnk5z"
Nov 27 16:57:21 crc kubenswrapper[4954]: I1127 16:57:21.772449 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-nnk5z"
Nov 27 16:57:21 crc kubenswrapper[4954]: I1127 16:57:21.849885 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-z47sq"]
Nov 27 16:57:21 crc kubenswrapper[4954]: W1127 16:57:21.860509 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc06d93e5_61ab_4c83_8371_6f0bb226349f.slice/crio-26e19c4d882e01c9cafba02ccf82c35076346bffba238e1eeb4d173fe8cbc546 WatchSource:0}: Error finding container 26e19c4d882e01c9cafba02ccf82c35076346bffba238e1eeb4d173fe8cbc546: Status 404 returned error can't find the container with id 26e19c4d882e01c9cafba02ccf82c35076346bffba238e1eeb4d173fe8cbc546
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.240407 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.241641 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.254290 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.264367 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.255881 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-sg67g"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.256052 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.256091 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.256636 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.258212 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.260722 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.296890 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/37b16922-ac4b-4c0f-bf9c-444474fe1e08-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.296937 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/37b16922-ac4b-4c0f-bf9c-444474fe1e08-pod-info\") pod \"rabbitmq-server-0\" (UID: \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.296989 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.297007 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/37b16922-ac4b-4c0f-bf9c-444474fe1e08-config-data\") pod \"rabbitmq-server-0\" (UID: \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.297101 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/37b16922-ac4b-4c0f-bf9c-444474fe1e08-server-conf\") pod \"rabbitmq-server-0\" (UID: \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.297120 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/37b16922-ac4b-4c0f-bf9c-444474fe1e08-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.297135 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/37b16922-ac4b-4c0f-bf9c-444474fe1e08-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.297158 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/37b16922-ac4b-4c0f-bf9c-444474fe1e08-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.297199 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxkkx\" (UniqueName: \"kubernetes.io/projected/37b16922-ac4b-4c0f-bf9c-444474fe1e08-kube-api-access-sxkkx\") pod \"rabbitmq-server-0\" (UID: \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.298513 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/37b16922-ac4b-4c0f-bf9c-444474fe1e08-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.298586 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/37b16922-ac4b-4c0f-bf9c-444474fe1e08-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.332653 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-nnk5z"]
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.400004 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/37b16922-ac4b-4c0f-bf9c-444474fe1e08-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.400078 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/37b16922-ac4b-4c0f-bf9c-444474fe1e08-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.400109 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/37b16922-ac4b-4c0f-bf9c-444474fe1e08-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.400134 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/37b16922-ac4b-4c0f-bf9c-444474fe1e08-pod-info\") pod \"rabbitmq-server-0\" (UID: \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.400167 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.400188 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/37b16922-ac4b-4c0f-bf9c-444474fe1e08-config-data\") pod \"rabbitmq-server-0\" (UID: \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.400225 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/37b16922-ac4b-4c0f-bf9c-444474fe1e08-server-conf\") pod \"rabbitmq-server-0\" (UID: \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.400243 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/37b16922-ac4b-4c0f-bf9c-444474fe1e08-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.400261 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/37b16922-ac4b-4c0f-bf9c-444474fe1e08-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.400283 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/37b16922-ac4b-4c0f-bf9c-444474fe1e08-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.400303 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxkkx\" (UniqueName: \"kubernetes.io/projected/37b16922-ac4b-4c0f-bf9c-444474fe1e08-kube-api-access-sxkkx\") pod \"rabbitmq-server-0\" (UID: \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.402526 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/37b16922-ac4b-4c0f-bf9c-444474fe1e08-config-data\") pod \"rabbitmq-server-0\" (UID: \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.402611 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/37b16922-ac4b-4c0f-bf9c-444474fe1e08-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.402642 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/37b16922-ac4b-4c0f-bf9c-444474fe1e08-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.402820 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-server-0"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.402962 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/37b16922-ac4b-4c0f-bf9c-444474fe1e08-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.404731 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/37b16922-ac4b-4c0f-bf9c-444474fe1e08-server-conf\") pod \"rabbitmq-server-0\" (UID: \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.408558 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/37b16922-ac4b-4c0f-bf9c-444474fe1e08-pod-info\") pod \"rabbitmq-server-0\" (UID: \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.409318 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/37b16922-ac4b-4c0f-bf9c-444474fe1e08-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.409939 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/37b16922-ac4b-4c0f-bf9c-444474fe1e08-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.411889 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/37b16922-ac4b-4c0f-bf9c-444474fe1e08-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.418416 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxkkx\" (UniqueName: \"kubernetes.io/projected/37b16922-ac4b-4c0f-bf9c-444474fe1e08-kube-api-access-sxkkx\") pod \"rabbitmq-server-0\" (UID: \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.441160 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.557454 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.558717 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.560312 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.561078 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.563594 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.565137 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.565313 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.565473 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.567691 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-fpcb4"
Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.573881 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.603182 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qsx6\" (UniqueName: \"kubernetes.io/projected/70949f64-380f-4947-a55a-8780126c7ba4-kube-api-access-7qsx6\") pod \"rabbitmq-cell1-server-0\" (UID: \"70949f64-380f-4947-a55a-8780126c7ba4\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.603243 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70949f64-380f-4947-a55a-8780126c7ba4-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"70949f64-380f-4947-a55a-8780126c7ba4\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.603277 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/70949f64-380f-4947-a55a-8780126c7ba4-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"70949f64-380f-4947-a55a-8780126c7ba4\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.603309 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/70949f64-380f-4947-a55a-8780126c7ba4-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"70949f64-380f-4947-a55a-8780126c7ba4\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.603829 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/70949f64-380f-4947-a55a-8780126c7ba4-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"70949f64-380f-4947-a55a-8780126c7ba4\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.604057 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/70949f64-380f-4947-a55a-8780126c7ba4-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"70949f64-380f-4947-a55a-8780126c7ba4\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.604122 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/70949f64-380f-4947-a55a-8780126c7ba4-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"70949f64-380f-4947-a55a-8780126c7ba4\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.604166 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"70949f64-380f-4947-a55a-8780126c7ba4\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.604208 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/70949f64-380f-4947-a55a-8780126c7ba4-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"70949f64-380f-4947-a55a-8780126c7ba4\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.604261 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/70949f64-380f-4947-a55a-8780126c7ba4-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"70949f64-380f-4947-a55a-8780126c7ba4\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.609641 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/70949f64-380f-4947-a55a-8780126c7ba4-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"70949f64-380f-4947-a55a-8780126c7ba4\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.689826 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-z47sq" event={"ID":"c06d93e5-61ab-4c83-8371-6f0bb226349f","Type":"ContainerStarted","Data":"26e19c4d882e01c9cafba02ccf82c35076346bffba238e1eeb4d173fe8cbc546"} Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.710787 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/70949f64-380f-4947-a55a-8780126c7ba4-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"70949f64-380f-4947-a55a-8780126c7ba4\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.710873 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/70949f64-380f-4947-a55a-8780126c7ba4-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"70949f64-380f-4947-a55a-8780126c7ba4\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.710912 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/70949f64-380f-4947-a55a-8780126c7ba4-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"70949f64-380f-4947-a55a-8780126c7ba4\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.710940 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"70949f64-380f-4947-a55a-8780126c7ba4\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.710963 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/70949f64-380f-4947-a55a-8780126c7ba4-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"70949f64-380f-4947-a55a-8780126c7ba4\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.710990 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/70949f64-380f-4947-a55a-8780126c7ba4-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"70949f64-380f-4947-a55a-8780126c7ba4\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.711012 4954 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/70949f64-380f-4947-a55a-8780126c7ba4-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"70949f64-380f-4947-a55a-8780126c7ba4\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.711055 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qsx6\" (UniqueName: \"kubernetes.io/projected/70949f64-380f-4947-a55a-8780126c7ba4-kube-api-access-7qsx6\") pod \"rabbitmq-cell1-server-0\" (UID: \"70949f64-380f-4947-a55a-8780126c7ba4\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.711072 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70949f64-380f-4947-a55a-8780126c7ba4-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"70949f64-380f-4947-a55a-8780126c7ba4\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.711094 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/70949f64-380f-4947-a55a-8780126c7ba4-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"70949f64-380f-4947-a55a-8780126c7ba4\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.711115 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/70949f64-380f-4947-a55a-8780126c7ba4-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"70949f64-380f-4947-a55a-8780126c7ba4\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.714787 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/70949f64-380f-4947-a55a-8780126c7ba4-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"70949f64-380f-4947-a55a-8780126c7ba4\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.715059 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70949f64-380f-4947-a55a-8780126c7ba4-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"70949f64-380f-4947-a55a-8780126c7ba4\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.715065 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"70949f64-380f-4947-a55a-8780126c7ba4\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.715459 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/70949f64-380f-4947-a55a-8780126c7ba4-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"70949f64-380f-4947-a55a-8780126c7ba4\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.718426 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/70949f64-380f-4947-a55a-8780126c7ba4-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"70949f64-380f-4947-a55a-8780126c7ba4\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.718432 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/70949f64-380f-4947-a55a-8780126c7ba4-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"70949f64-380f-4947-a55a-8780126c7ba4\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.718880 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/70949f64-380f-4947-a55a-8780126c7ba4-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"70949f64-380f-4947-a55a-8780126c7ba4\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.729282 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/70949f64-380f-4947-a55a-8780126c7ba4-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"70949f64-380f-4947-a55a-8780126c7ba4\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.729765 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/70949f64-380f-4947-a55a-8780126c7ba4-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"70949f64-380f-4947-a55a-8780126c7ba4\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.731771 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qsx6\" (UniqueName: \"kubernetes.io/projected/70949f64-380f-4947-a55a-8780126c7ba4-kube-api-access-7qsx6\") pod \"rabbitmq-cell1-server-0\" (UID: \"70949f64-380f-4947-a55a-8780126c7ba4\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.737947 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/70949f64-380f-4947-a55a-8780126c7ba4-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"70949f64-380f-4947-a55a-8780126c7ba4\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.768063 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"70949f64-380f-4947-a55a-8780126c7ba4\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:57:22 crc kubenswrapper[4954]: I1127 16:57:22.904245 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:57:23 crc kubenswrapper[4954]: I1127 16:57:23.790028 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Nov 27 16:57:23 crc kubenswrapper[4954]: I1127 16:57:23.791385 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Nov 27 16:57:23 crc kubenswrapper[4954]: I1127 16:57:23.795321 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Nov 27 16:57:23 crc kubenswrapper[4954]: I1127 16:57:23.797612 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-nlspx" Nov 27 16:57:23 crc kubenswrapper[4954]: I1127 16:57:23.797904 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Nov 27 16:57:23 crc kubenswrapper[4954]: I1127 16:57:23.798485 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Nov 27 16:57:23 crc kubenswrapper[4954]: I1127 16:57:23.802290 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Nov 27 16:57:23 crc kubenswrapper[4954]: I1127 16:57:23.803148 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Nov 27 16:57:23 crc kubenswrapper[4954]: I1127 16:57:23.933138 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snx9l\" (UniqueName: \"kubernetes.io/projected/591d8033-08c2-4048-b24e-34508babfbad-kube-api-access-snx9l\") pod \"openstack-galera-0\" (UID: \"591d8033-08c2-4048-b24e-34508babfbad\") " pod="openstack/openstack-galera-0" Nov 27 16:57:23 crc kubenswrapper[4954]: I1127 16:57:23.933204 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/591d8033-08c2-4048-b24e-34508babfbad-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"591d8033-08c2-4048-b24e-34508babfbad\") " pod="openstack/openstack-galera-0" Nov 27 16:57:23 crc kubenswrapper[4954]: I1127 16:57:23.933297 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/591d8033-08c2-4048-b24e-34508babfbad-operator-scripts\") pod \"openstack-galera-0\" (UID: \"591d8033-08c2-4048-b24e-34508babfbad\") " pod="openstack/openstack-galera-0" Nov 27 16:57:23 crc kubenswrapper[4954]: I1127 16:57:23.933339 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/591d8033-08c2-4048-b24e-34508babfbad-config-data-generated\") pod \"openstack-galera-0\" (UID: \"591d8033-08c2-4048-b24e-34508babfbad\") " pod="openstack/openstack-galera-0" Nov 27 16:57:23 crc kubenswrapper[4954]: I1127 16:57:23.933361 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/591d8033-08c2-4048-b24e-34508babfbad-kolla-config\") pod \"openstack-galera-0\" (UID: \"591d8033-08c2-4048-b24e-34508babfbad\") " pod="openstack/openstack-galera-0" Nov 27 16:57:23 crc kubenswrapper[4954]: I1127 16:57:23.933381 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"591d8033-08c2-4048-b24e-34508babfbad\") " pod="openstack/openstack-galera-0" Nov 27 16:57:23 crc kubenswrapper[4954]: I1127 16:57:23.933401 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/591d8033-08c2-4048-b24e-34508babfbad-config-data-default\") pod \"openstack-galera-0\" (UID: \"591d8033-08c2-4048-b24e-34508babfbad\") " pod="openstack/openstack-galera-0" Nov 27 16:57:23 crc kubenswrapper[4954]: I1127 16:57:23.933449 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/591d8033-08c2-4048-b24e-34508babfbad-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"591d8033-08c2-4048-b24e-34508babfbad\") " pod="openstack/openstack-galera-0" Nov 27 16:57:24 crc kubenswrapper[4954]: I1127 16:57:24.038476 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snx9l\" (UniqueName: \"kubernetes.io/projected/591d8033-08c2-4048-b24e-34508babfbad-kube-api-access-snx9l\") pod \"openstack-galera-0\" (UID: \"591d8033-08c2-4048-b24e-34508babfbad\") " pod="openstack/openstack-galera-0" Nov 27 16:57:24 crc kubenswrapper[4954]: I1127 16:57:24.038559 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/591d8033-08c2-4048-b24e-34508babfbad-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"591d8033-08c2-4048-b24e-34508babfbad\") " pod="openstack/openstack-galera-0" Nov 27 16:57:24 crc kubenswrapper[4954]: I1127 16:57:24.038630 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/591d8033-08c2-4048-b24e-34508babfbad-operator-scripts\") pod \"openstack-galera-0\" (UID: \"591d8033-08c2-4048-b24e-34508babfbad\") " pod="openstack/openstack-galera-0" Nov 27 16:57:24 crc kubenswrapper[4954]: I1127 16:57:24.038676 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/591d8033-08c2-4048-b24e-34508babfbad-config-data-generated\") pod \"openstack-galera-0\" (UID: \"591d8033-08c2-4048-b24e-34508babfbad\") " pod="openstack/openstack-galera-0" Nov 27 16:57:24 crc kubenswrapper[4954]: I1127 16:57:24.038702 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/591d8033-08c2-4048-b24e-34508babfbad-kolla-config\") pod \"openstack-galera-0\" (UID: \"591d8033-08c2-4048-b24e-34508babfbad\") " pod="openstack/openstack-galera-0" Nov 27 16:57:24 crc kubenswrapper[4954]: I1127 16:57:24.038725 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"591d8033-08c2-4048-b24e-34508babfbad\") " pod="openstack/openstack-galera-0" Nov 27 16:57:24 crc kubenswrapper[4954]: I1127 16:57:24.038745 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/591d8033-08c2-4048-b24e-34508babfbad-config-data-default\") pod \"openstack-galera-0\" (UID: \"591d8033-08c2-4048-b24e-34508babfbad\") " pod="openstack/openstack-galera-0" Nov 27 16:57:24 crc kubenswrapper[4954]: I1127 16:57:24.038821 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/591d8033-08c2-4048-b24e-34508babfbad-galera-tls-certs\") pod \"openstack-galera-0\" (UID: 
\"591d8033-08c2-4048-b24e-34508babfbad\") " pod="openstack/openstack-galera-0" Nov 27 16:57:24 crc kubenswrapper[4954]: I1127 16:57:24.039449 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"591d8033-08c2-4048-b24e-34508babfbad\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/openstack-galera-0" Nov 27 16:57:24 crc kubenswrapper[4954]: I1127 16:57:24.041417 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/591d8033-08c2-4048-b24e-34508babfbad-kolla-config\") pod \"openstack-galera-0\" (UID: \"591d8033-08c2-4048-b24e-34508babfbad\") " pod="openstack/openstack-galera-0" Nov 27 16:57:24 crc kubenswrapper[4954]: I1127 16:57:24.041753 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/591d8033-08c2-4048-b24e-34508babfbad-config-data-default\") pod \"openstack-galera-0\" (UID: \"591d8033-08c2-4048-b24e-34508babfbad\") " pod="openstack/openstack-galera-0" Nov 27 16:57:24 crc kubenswrapper[4954]: I1127 16:57:24.042657 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/591d8033-08c2-4048-b24e-34508babfbad-operator-scripts\") pod \"openstack-galera-0\" (UID: \"591d8033-08c2-4048-b24e-34508babfbad\") " pod="openstack/openstack-galera-0" Nov 27 16:57:24 crc kubenswrapper[4954]: I1127 16:57:24.045489 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/591d8033-08c2-4048-b24e-34508babfbad-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"591d8033-08c2-4048-b24e-34508babfbad\") " pod="openstack/openstack-galera-0" Nov 27 16:57:24 crc kubenswrapper[4954]: I1127 16:57:24.052395 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/591d8033-08c2-4048-b24e-34508babfbad-config-data-generated\") pod \"openstack-galera-0\" (UID: \"591d8033-08c2-4048-b24e-34508babfbad\") " pod="openstack/openstack-galera-0" Nov 27 16:57:24 crc kubenswrapper[4954]: I1127 16:57:24.065779 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/591d8033-08c2-4048-b24e-34508babfbad-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"591d8033-08c2-4048-b24e-34508babfbad\") " pod="openstack/openstack-galera-0" Nov 27 16:57:24 crc kubenswrapper[4954]: I1127 16:57:24.082494 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snx9l\" (UniqueName: \"kubernetes.io/projected/591d8033-08c2-4048-b24e-34508babfbad-kube-api-access-snx9l\") pod \"openstack-galera-0\" (UID: \"591d8033-08c2-4048-b24e-34508babfbad\") " pod="openstack/openstack-galera-0" Nov 27 16:57:24 crc kubenswrapper[4954]: I1127 16:57:24.086767 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"591d8033-08c2-4048-b24e-34508babfbad\") " pod="openstack/openstack-galera-0" Nov 27 16:57:24 crc kubenswrapper[4954]: I1127 16:57:24.114632 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.272961 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.274775 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.277347 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.277404 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.277569 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-wvpmg" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.278059 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.290795 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.470727 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.470830 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.470871 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2vgc\" (UniqueName: \"kubernetes.io/projected/3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac-kube-api-access-f2vgc\") pod \"openstack-cell1-galera-0\" (UID: \"3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.470910 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.471009 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.471050 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.471133 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.471239 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.572353 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.572902 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.572925 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2vgc\" (UniqueName: \"kubernetes.io/projected/3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac-kube-api-access-f2vgc\") pod \"openstack-cell1-galera-0\" (UID: \"3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.572944 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.572966 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.572988 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.573020 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.573082 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.573288 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.573723 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-cell1-galera-0" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.573749 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.576981 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.580724 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.581380 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.592539 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.598478 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2vgc\" (UniqueName: \"kubernetes.io/projected/3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac-kube-api-access-f2vgc\") pod \"openstack-cell1-galera-0\" (UID: 
\"3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.601832 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.617926 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.619160 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.623062 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-vmghf" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.623287 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.626142 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.638649 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.681487 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.776383 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/808630a2-42dd-48c9-a004-749515cb771b-kolla-config\") pod \"memcached-0\" (UID: \"808630a2-42dd-48c9-a004-749515cb771b\") " pod="openstack/memcached-0" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.776437 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/808630a2-42dd-48c9-a004-749515cb771b-combined-ca-bundle\") pod \"memcached-0\" (UID: \"808630a2-42dd-48c9-a004-749515cb771b\") " pod="openstack/memcached-0" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.776471 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p82q4\" (UniqueName: \"kubernetes.io/projected/808630a2-42dd-48c9-a004-749515cb771b-kube-api-access-p82q4\") pod \"memcached-0\" (UID: \"808630a2-42dd-48c9-a004-749515cb771b\") " pod="openstack/memcached-0" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.776520 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/808630a2-42dd-48c9-a004-749515cb771b-config-data\") pod \"memcached-0\" (UID: \"808630a2-42dd-48c9-a004-749515cb771b\") " pod="openstack/memcached-0" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.776542 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/808630a2-42dd-48c9-a004-749515cb771b-memcached-tls-certs\") pod \"memcached-0\" (UID: \"808630a2-42dd-48c9-a004-749515cb771b\") " pod="openstack/memcached-0" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.878439 4954 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/808630a2-42dd-48c9-a004-749515cb771b-kolla-config\") pod \"memcached-0\" (UID: \"808630a2-42dd-48c9-a004-749515cb771b\") " pod="openstack/memcached-0" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.878512 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/808630a2-42dd-48c9-a004-749515cb771b-combined-ca-bundle\") pod \"memcached-0\" (UID: \"808630a2-42dd-48c9-a004-749515cb771b\") " pod="openstack/memcached-0" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.878557 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p82q4\" (UniqueName: \"kubernetes.io/projected/808630a2-42dd-48c9-a004-749515cb771b-kube-api-access-p82q4\") pod \"memcached-0\" (UID: \"808630a2-42dd-48c9-a004-749515cb771b\") " pod="openstack/memcached-0" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.878629 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/808630a2-42dd-48c9-a004-749515cb771b-config-data\") pod \"memcached-0\" (UID: \"808630a2-42dd-48c9-a004-749515cb771b\") " pod="openstack/memcached-0" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.878649 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/808630a2-42dd-48c9-a004-749515cb771b-memcached-tls-certs\") pod \"memcached-0\" (UID: \"808630a2-42dd-48c9-a004-749515cb771b\") " pod="openstack/memcached-0" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.884944 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/808630a2-42dd-48c9-a004-749515cb771b-config-data\") pod \"memcached-0\" (UID: \"808630a2-42dd-48c9-a004-749515cb771b\") " pod="openstack/memcached-0" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.885283 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/808630a2-42dd-48c9-a004-749515cb771b-kolla-config\") pod \"memcached-0\" (UID: \"808630a2-42dd-48c9-a004-749515cb771b\") " pod="openstack/memcached-0" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.886225 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/808630a2-42dd-48c9-a004-749515cb771b-memcached-tls-certs\") pod \"memcached-0\" (UID: \"808630a2-42dd-48c9-a004-749515cb771b\") " pod="openstack/memcached-0" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.887979 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/808630a2-42dd-48c9-a004-749515cb771b-combined-ca-bundle\") pod \"memcached-0\" (UID: \"808630a2-42dd-48c9-a004-749515cb771b\") " pod="openstack/memcached-0" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.906934 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p82q4\" (UniqueName: \"kubernetes.io/projected/808630a2-42dd-48c9-a004-749515cb771b-kube-api-access-p82q4\") pod \"memcached-0\" (UID: \"808630a2-42dd-48c9-a004-749515cb771b\") " pod="openstack/memcached-0" Nov 27 16:57:25 crc kubenswrapper[4954]: I1127 16:57:25.966102 4954 util.go:30] 
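"No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"

The openstack-cell1-galera-0 and memcached-0 entries above trace the kubelet volume manager's reconcile loop: each volume is first verified as attached (reconciler_common.go:245), a mount is then started (reconciler_common.go:218), and operation_generator.go finally reports MountVolume.SetUp success. Below is a minimal Go sketch of that desired-state/actual-state pattern; all type and field names are illustrative assumptions, not kubelet source.

package main

import "fmt"

// volume is a pod-scoped volume requirement, keyed the way the log keys it.
type volume struct {
	uniqueName string // e.g. "kubernetes.io/configmap/808630a2-...-kolla-config"
	pod        string // e.g. "openstack/memcached-0"
}

// reconciler compares what pods require against what is already mounted.
type reconciler struct {
	desired []volume        // from pod specs (desired state of the world)
	actual  map[string]bool // uniqueName -> mounted (actual state of the world)
}

func (r *reconciler) reconcile() {
	for _, v := range r.desired {
		if r.actual[v.uniqueName] {
			continue // already mounted; nothing to do
		}
		// Phase 1: confirm the volume is attached to this node.
		fmt.Printf("VerifyControllerAttachedVolume started for volume %q pod=%q\n", v.uniqueName, v.pod)
		// Phase 2: mount it into the pod's volumes directory.
		fmt.Printf("MountVolume started for volume %q pod=%q\n", v.uniqueName, v.pod)
		r.actual[v.uniqueName] = true
		fmt.Printf("MountVolume.SetUp succeeded for volume %q pod=%q\n", v.uniqueName, v.pod)
	}
}

func main() {
	r := &reconciler{
		desired: []volume{
			{"kubernetes.io/configmap/808630a2-...-kolla-config", "openstack/memcached-0"},
			{"kubernetes.io/secret/808630a2-...-memcached-tls-certs", "openstack/memcached-0"},
		},
		actual: map[string]bool{},
	}
	r.reconcile() // the kubelet runs this periodically, not once
}
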
"No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Nov 27 16:57:26 crc kubenswrapper[4954]: W1127 16:57:26.737925 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod644e4b3e_4237_4179_8775_63cde7f94338.slice/crio-cdb9dee9a04e9a22ad068a436703b681fbfd09db74cd7b15910a868b28145507 WatchSource:0}: Error finding container cdb9dee9a04e9a22ad068a436703b681fbfd09db74cd7b15910a868b28145507: Status 404 returned error can't find the container with id cdb9dee9a04e9a22ad068a436703b681fbfd09db74cd7b15910a868b28145507 Nov 27 16:57:27 crc kubenswrapper[4954]: I1127 16:57:27.491368 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Nov 27 16:57:27 crc kubenswrapper[4954]: I1127 16:57:27.493049 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 27 16:57:27 crc kubenswrapper[4954]: I1127 16:57:27.495251 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-4vhqb" Nov 27 16:57:27 crc kubenswrapper[4954]: I1127 16:57:27.541131 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 27 16:57:27 crc kubenswrapper[4954]: I1127 16:57:27.613025 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whxr8\" (UniqueName: \"kubernetes.io/projected/c6a94711-3e04-42e3-9ec3-6487f0dd3a3f-kube-api-access-whxr8\") pod \"kube-state-metrics-0\" (UID: \"c6a94711-3e04-42e3-9ec3-6487f0dd3a3f\") " pod="openstack/kube-state-metrics-0" Nov 27 16:57:27 crc kubenswrapper[4954]: I1127 16:57:27.714391 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whxr8\" (UniqueName: \"kubernetes.io/projected/c6a94711-3e04-42e3-9ec3-6487f0dd3a3f-kube-api-access-whxr8\") pod \"kube-state-metrics-0\" (UID: \"c6a94711-3e04-42e3-9ec3-6487f0dd3a3f\") " pod="openstack/kube-state-metrics-0" Nov 27 16:57:27 crc kubenswrapper[4954]: I1127 16:57:27.740766 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-nnk5z" event={"ID":"644e4b3e-4237-4179-8775-63cde7f94338","Type":"ContainerStarted","Data":"cdb9dee9a04e9a22ad068a436703b681fbfd09db74cd7b15910a868b28145507"} Nov 27 16:57:27 crc kubenswrapper[4954]: I1127 16:57:27.751913 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whxr8\" (UniqueName: \"kubernetes.io/projected/c6a94711-3e04-42e3-9ec3-6487f0dd3a3f-kube-api-access-whxr8\") pod \"kube-state-metrics-0\" (UID: \"c6a94711-3e04-42e3-9ec3-6487f0dd3a3f\") " pod="openstack/kube-state-metrics-0" Nov 27 16:57:27 crc kubenswrapper[4954]: I1127 16:57:27.860193 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.371083 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-s7sc8"] Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.372521 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-s7sc8" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.375130 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.376253 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-z8jfw" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.377025 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.392322 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-s7sc8"] Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.455318 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-btgpk"] Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.456932 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-btgpk" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.469895 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-btgpk"] Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.484297 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2a98905f-a2dd-4eb2-9a4f-437eb3626871-var-run\") pod \"ovn-controller-s7sc8\" (UID: \"2a98905f-a2dd-4eb2-9a4f-437eb3626871\") " pod="openstack/ovn-controller-s7sc8" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.484434 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a98905f-a2dd-4eb2-9a4f-437eb3626871-combined-ca-bundle\") pod \"ovn-controller-s7sc8\" (UID: \"2a98905f-a2dd-4eb2-9a4f-437eb3626871\") " pod="openstack/ovn-controller-s7sc8" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.484569 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2a98905f-a2dd-4eb2-9a4f-437eb3626871-var-log-ovn\") pod \"ovn-controller-s7sc8\" (UID: \"2a98905f-a2dd-4eb2-9a4f-437eb3626871\") " pod="openstack/ovn-controller-s7sc8" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.484792 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtc2j\" (UniqueName: \"kubernetes.io/projected/2a98905f-a2dd-4eb2-9a4f-437eb3626871-kube-api-access-mtc2j\") pod \"ovn-controller-s7sc8\" (UID: \"2a98905f-a2dd-4eb2-9a4f-437eb3626871\") " pod="openstack/ovn-controller-s7sc8" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.484873 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a98905f-a2dd-4eb2-9a4f-437eb3626871-ovn-controller-tls-certs\") pod \"ovn-controller-s7sc8\" (UID: \"2a98905f-a2dd-4eb2-9a4f-437eb3626871\") " pod="openstack/ovn-controller-s7sc8" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.484946 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a98905f-a2dd-4eb2-9a4f-437eb3626871-scripts\") pod \"ovn-controller-s7sc8\" (UID: 
\"2a98905f-a2dd-4eb2-9a4f-437eb3626871\") " pod="openstack/ovn-controller-s7sc8" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.484998 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2a98905f-a2dd-4eb2-9a4f-437eb3626871-var-run-ovn\") pod \"ovn-controller-s7sc8\" (UID: \"2a98905f-a2dd-4eb2-9a4f-437eb3626871\") " pod="openstack/ovn-controller-s7sc8" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.586194 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55jkc\" (UniqueName: \"kubernetes.io/projected/abad518f-43af-457b-add5-c0291513ad71-kube-api-access-55jkc\") pod \"ovn-controller-ovs-btgpk\" (UID: \"abad518f-43af-457b-add5-c0291513ad71\") " pod="openstack/ovn-controller-ovs-btgpk" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.586253 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/abad518f-43af-457b-add5-c0291513ad71-var-lib\") pod \"ovn-controller-ovs-btgpk\" (UID: \"abad518f-43af-457b-add5-c0291513ad71\") " pod="openstack/ovn-controller-ovs-btgpk" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.586332 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2a98905f-a2dd-4eb2-9a4f-437eb3626871-var-run\") pod \"ovn-controller-s7sc8\" (UID: \"2a98905f-a2dd-4eb2-9a4f-437eb3626871\") " pod="openstack/ovn-controller-s7sc8" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.586356 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a98905f-a2dd-4eb2-9a4f-437eb3626871-combined-ca-bundle\") pod \"ovn-controller-s7sc8\" (UID: \"2a98905f-a2dd-4eb2-9a4f-437eb3626871\") " pod="openstack/ovn-controller-s7sc8" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.586380 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/abad518f-43af-457b-add5-c0291513ad71-var-log\") pod \"ovn-controller-ovs-btgpk\" (UID: \"abad518f-43af-457b-add5-c0291513ad71\") " pod="openstack/ovn-controller-ovs-btgpk" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.586401 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2a98905f-a2dd-4eb2-9a4f-437eb3626871-var-log-ovn\") pod \"ovn-controller-s7sc8\" (UID: \"2a98905f-a2dd-4eb2-9a4f-437eb3626871\") " pod="openstack/ovn-controller-s7sc8" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.586423 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtc2j\" (UniqueName: \"kubernetes.io/projected/2a98905f-a2dd-4eb2-9a4f-437eb3626871-kube-api-access-mtc2j\") pod \"ovn-controller-s7sc8\" (UID: \"2a98905f-a2dd-4eb2-9a4f-437eb3626871\") " pod="openstack/ovn-controller-s7sc8" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.586441 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a98905f-a2dd-4eb2-9a4f-437eb3626871-ovn-controller-tls-certs\") pod \"ovn-controller-s7sc8\" (UID: \"2a98905f-a2dd-4eb2-9a4f-437eb3626871\") " pod="openstack/ovn-controller-s7sc8" Nov 27 16:57:31 crc 
kubenswrapper[4954]: I1127 16:57:31.586467 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/abad518f-43af-457b-add5-c0291513ad71-var-run\") pod \"ovn-controller-ovs-btgpk\" (UID: \"abad518f-43af-457b-add5-c0291513ad71\") " pod="openstack/ovn-controller-ovs-btgpk" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.586487 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a98905f-a2dd-4eb2-9a4f-437eb3626871-scripts\") pod \"ovn-controller-s7sc8\" (UID: \"2a98905f-a2dd-4eb2-9a4f-437eb3626871\") " pod="openstack/ovn-controller-s7sc8" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.586506 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2a98905f-a2dd-4eb2-9a4f-437eb3626871-var-run-ovn\") pod \"ovn-controller-s7sc8\" (UID: \"2a98905f-a2dd-4eb2-9a4f-437eb3626871\") " pod="openstack/ovn-controller-s7sc8" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.586531 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/abad518f-43af-457b-add5-c0291513ad71-etc-ovs\") pod \"ovn-controller-ovs-btgpk\" (UID: \"abad518f-43af-457b-add5-c0291513ad71\") " pod="openstack/ovn-controller-ovs-btgpk" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.586553 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/abad518f-43af-457b-add5-c0291513ad71-scripts\") pod \"ovn-controller-ovs-btgpk\" (UID: \"abad518f-43af-457b-add5-c0291513ad71\") " pod="openstack/ovn-controller-ovs-btgpk" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.586950 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2a98905f-a2dd-4eb2-9a4f-437eb3626871-var-run\") pod \"ovn-controller-s7sc8\" (UID: \"2a98905f-a2dd-4eb2-9a4f-437eb3626871\") " pod="openstack/ovn-controller-s7sc8" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.587052 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2a98905f-a2dd-4eb2-9a4f-437eb3626871-var-log-ovn\") pod \"ovn-controller-s7sc8\" (UID: \"2a98905f-a2dd-4eb2-9a4f-437eb3626871\") " pod="openstack/ovn-controller-s7sc8" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.587081 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2a98905f-a2dd-4eb2-9a4f-437eb3626871-var-run-ovn\") pod \"ovn-controller-s7sc8\" (UID: \"2a98905f-a2dd-4eb2-9a4f-437eb3626871\") " pod="openstack/ovn-controller-s7sc8" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.588412 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a98905f-a2dd-4eb2-9a4f-437eb3626871-scripts\") pod \"ovn-controller-s7sc8\" (UID: \"2a98905f-a2dd-4eb2-9a4f-437eb3626871\") " pod="openstack/ovn-controller-s7sc8" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.591742 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a98905f-a2dd-4eb2-9a4f-437eb3626871-ovn-controller-tls-certs\") pod 
\"ovn-controller-s7sc8\" (UID: \"2a98905f-a2dd-4eb2-9a4f-437eb3626871\") " pod="openstack/ovn-controller-s7sc8" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.595565 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a98905f-a2dd-4eb2-9a4f-437eb3626871-combined-ca-bundle\") pod \"ovn-controller-s7sc8\" (UID: \"2a98905f-a2dd-4eb2-9a4f-437eb3626871\") " pod="openstack/ovn-controller-s7sc8" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.604168 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtc2j\" (UniqueName: \"kubernetes.io/projected/2a98905f-a2dd-4eb2-9a4f-437eb3626871-kube-api-access-mtc2j\") pod \"ovn-controller-s7sc8\" (UID: \"2a98905f-a2dd-4eb2-9a4f-437eb3626871\") " pod="openstack/ovn-controller-s7sc8" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.687266 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.688502 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55jkc\" (UniqueName: \"kubernetes.io/projected/abad518f-43af-457b-add5-c0291513ad71-kube-api-access-55jkc\") pod \"ovn-controller-ovs-btgpk\" (UID: \"abad518f-43af-457b-add5-c0291513ad71\") " pod="openstack/ovn-controller-ovs-btgpk" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.688628 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/abad518f-43af-457b-add5-c0291513ad71-var-lib\") pod \"ovn-controller-ovs-btgpk\" (UID: \"abad518f-43af-457b-add5-c0291513ad71\") " pod="openstack/ovn-controller-ovs-btgpk" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.688717 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.688724 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/abad518f-43af-457b-add5-c0291513ad71-var-log\") pod \"ovn-controller-ovs-btgpk\" (UID: \"abad518f-43af-457b-add5-c0291513ad71\") " pod="openstack/ovn-controller-ovs-btgpk" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.688779 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/abad518f-43af-457b-add5-c0291513ad71-var-run\") pod \"ovn-controller-ovs-btgpk\" (UID: \"abad518f-43af-457b-add5-c0291513ad71\") " pod="openstack/ovn-controller-ovs-btgpk" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.688829 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/abad518f-43af-457b-add5-c0291513ad71-etc-ovs\") pod \"ovn-controller-ovs-btgpk\" (UID: \"abad518f-43af-457b-add5-c0291513ad71\") " pod="openstack/ovn-controller-ovs-btgpk" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.688859 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/abad518f-43af-457b-add5-c0291513ad71-scripts\") pod \"ovn-controller-ovs-btgpk\" (UID: \"abad518f-43af-457b-add5-c0291513ad71\") " pod="openstack/ovn-controller-ovs-btgpk" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.688948 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/abad518f-43af-457b-add5-c0291513ad71-var-run\") pod \"ovn-controller-ovs-btgpk\" (UID: \"abad518f-43af-457b-add5-c0291513ad71\") " pod="openstack/ovn-controller-ovs-btgpk" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.689094 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/abad518f-43af-457b-add5-c0291513ad71-var-log\") pod \"ovn-controller-ovs-btgpk\" (UID: \"abad518f-43af-457b-add5-c0291513ad71\") " pod="openstack/ovn-controller-ovs-btgpk" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.689160 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/abad518f-43af-457b-add5-c0291513ad71-etc-ovs\") pod \"ovn-controller-ovs-btgpk\" (UID: \"abad518f-43af-457b-add5-c0291513ad71\") " pod="openstack/ovn-controller-ovs-btgpk" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.689298 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/abad518f-43af-457b-add5-c0291513ad71-var-lib\") pod \"ovn-controller-ovs-btgpk\" (UID: \"abad518f-43af-457b-add5-c0291513ad71\") " pod="openstack/ovn-controller-ovs-btgpk" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.691130 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/abad518f-43af-457b-add5-c0291513ad71-scripts\") pod \"ovn-controller-ovs-btgpk\" (UID: \"abad518f-43af-457b-add5-c0291513ad71\") " pod="openstack/ovn-controller-ovs-btgpk" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.691217 4954 util.go:30] "No sandbox for pod can be found. 
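Need to start a new one" pod="openstack/ovn-controller-s7sc8"

The ovn-controller-s7sc8 mounts above succeed in bursts: the host-path volumes (var-run, var-log-ovn, var-run-ovn) complete within the same millisecond, the configmap/secret volumes shortly after, and the projected kube-api-access-mtc2j token volume last. A toy Go dispatch over those plugin kinds sketches why the costs differ; the real kubelet plugin interface is more involved, and the stub bodies here are assumptions.

package main

import (
	"fmt"
	"strings"
)

// setUp dispatches on the volume plugin kind embedded in the unique name,
// mirroring the kinds visible in the log; the bodies are stubs.
func setUp(uniqueName string) error {
	switch {
	case strings.Contains(uniqueName, "kubernetes.io/host-path/"):
		return nil // near-instant: validate the host directory and bind it in
	case strings.Contains(uniqueName, "kubernetes.io/configmap/"),
		strings.Contains(uniqueName, "kubernetes.io/secret/"):
		return nil // materialize keys as files on a tmpfs
	case strings.Contains(uniqueName, "kubernetes.io/projected/"):
		return nil // mint a bound service-account token, then write it out
	default:
		return fmt.Errorf("no plugin for %s", uniqueName)
	}
}

func main() {
	for _, v := range []string{
		"kubernetes.io/host-path/2a98905f-...-var-run",
		"kubernetes.io/configmap/2a98905f-...-scripts",
		"kubernetes.io/projected/2a98905f-...-kube-api-access-mtc2j",
	} {
		if err := setUp(v); err != nil {
			fmt.Println("MountVolume.SetUp failed:", err)
			continue
		}
		fmt.Printf("MountVolume.SetUp succeeded for volume %q\n", v)
	}
}
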
Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.693458 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.693535 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.693852 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.693906 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.694300 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-mr9bm" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.700233 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.724447 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55jkc\" (UniqueName: \"kubernetes.io/projected/abad518f-43af-457b-add5-c0291513ad71-kube-api-access-55jkc\") pod \"ovn-controller-ovs-btgpk\" (UID: \"abad518f-43af-457b-add5-c0291513ad71\") " pod="openstack/ovn-controller-ovs-btgpk" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.770328 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-btgpk" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.790366 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/585214bf-1a7b-426d-b1a6-d26e69e0116f-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"585214bf-1a7b-426d-b1a6-d26e69e0116f\") " pod="openstack/ovsdbserver-nb-0" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.790570 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"585214bf-1a7b-426d-b1a6-d26e69e0116f\") " pod="openstack/ovsdbserver-nb-0" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.790730 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gsn2\" (UniqueName: \"kubernetes.io/projected/585214bf-1a7b-426d-b1a6-d26e69e0116f-kube-api-access-7gsn2\") pod \"ovsdbserver-nb-0\" (UID: \"585214bf-1a7b-426d-b1a6-d26e69e0116f\") " pod="openstack/ovsdbserver-nb-0" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.790857 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/585214bf-1a7b-426d-b1a6-d26e69e0116f-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"585214bf-1a7b-426d-b1a6-d26e69e0116f\") " pod="openstack/ovsdbserver-nb-0" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.790963 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/585214bf-1a7b-426d-b1a6-d26e69e0116f-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"585214bf-1a7b-426d-b1a6-d26e69e0116f\") "
pod="openstack/ovsdbserver-nb-0" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.791068 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/585214bf-1a7b-426d-b1a6-d26e69e0116f-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"585214bf-1a7b-426d-b1a6-d26e69e0116f\") " pod="openstack/ovsdbserver-nb-0" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.791200 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/585214bf-1a7b-426d-b1a6-d26e69e0116f-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"585214bf-1a7b-426d-b1a6-d26e69e0116f\") " pod="openstack/ovsdbserver-nb-0" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.791325 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/585214bf-1a7b-426d-b1a6-d26e69e0116f-config\") pod \"ovsdbserver-nb-0\" (UID: \"585214bf-1a7b-426d-b1a6-d26e69e0116f\") " pod="openstack/ovsdbserver-nb-0" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.893308 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/585214bf-1a7b-426d-b1a6-d26e69e0116f-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"585214bf-1a7b-426d-b1a6-d26e69e0116f\") " pod="openstack/ovsdbserver-nb-0" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.893811 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"585214bf-1a7b-426d-b1a6-d26e69e0116f\") " pod="openstack/ovsdbserver-nb-0" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.893970 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gsn2\" (UniqueName: \"kubernetes.io/projected/585214bf-1a7b-426d-b1a6-d26e69e0116f-kube-api-access-7gsn2\") pod \"ovsdbserver-nb-0\" (UID: \"585214bf-1a7b-426d-b1a6-d26e69e0116f\") " pod="openstack/ovsdbserver-nb-0" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.894103 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/585214bf-1a7b-426d-b1a6-d26e69e0116f-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"585214bf-1a7b-426d-b1a6-d26e69e0116f\") " pod="openstack/ovsdbserver-nb-0" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.894221 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/585214bf-1a7b-426d-b1a6-d26e69e0116f-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"585214bf-1a7b-426d-b1a6-d26e69e0116f\") " pod="openstack/ovsdbserver-nb-0" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.894340 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/585214bf-1a7b-426d-b1a6-d26e69e0116f-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"585214bf-1a7b-426d-b1a6-d26e69e0116f\") " pod="openstack/ovsdbserver-nb-0" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.894516 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/585214bf-1a7b-426d-b1a6-d26e69e0116f-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"585214bf-1a7b-426d-b1a6-d26e69e0116f\") " pod="openstack/ovsdbserver-nb-0" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.894699 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/585214bf-1a7b-426d-b1a6-d26e69e0116f-config\") pod \"ovsdbserver-nb-0\" (UID: \"585214bf-1a7b-426d-b1a6-d26e69e0116f\") " pod="openstack/ovsdbserver-nb-0" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.894400 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"585214bf-1a7b-426d-b1a6-d26e69e0116f\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/ovsdbserver-nb-0" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.894871 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/585214bf-1a7b-426d-b1a6-d26e69e0116f-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"585214bf-1a7b-426d-b1a6-d26e69e0116f\") " pod="openstack/ovsdbserver-nb-0" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.895038 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/585214bf-1a7b-426d-b1a6-d26e69e0116f-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"585214bf-1a7b-426d-b1a6-d26e69e0116f\") " pod="openstack/ovsdbserver-nb-0" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.896032 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/585214bf-1a7b-426d-b1a6-d26e69e0116f-config\") pod \"ovsdbserver-nb-0\" (UID: \"585214bf-1a7b-426d-b1a6-d26e69e0116f\") " pod="openstack/ovsdbserver-nb-0" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.898797 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/585214bf-1a7b-426d-b1a6-d26e69e0116f-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"585214bf-1a7b-426d-b1a6-d26e69e0116f\") " pod="openstack/ovsdbserver-nb-0" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.899340 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/585214bf-1a7b-426d-b1a6-d26e69e0116f-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"585214bf-1a7b-426d-b1a6-d26e69e0116f\") " pod="openstack/ovsdbserver-nb-0" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.906900 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/585214bf-1a7b-426d-b1a6-d26e69e0116f-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"585214bf-1a7b-426d-b1a6-d26e69e0116f\") " pod="openstack/ovsdbserver-nb-0" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.918366 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gsn2\" (UniqueName: \"kubernetes.io/projected/585214bf-1a7b-426d-b1a6-d26e69e0116f-kube-api-access-7gsn2\") pod \"ovsdbserver-nb-0\" (UID: \"585214bf-1a7b-426d-b1a6-d26e69e0116f\") " pod="openstack/ovsdbserver-nb-0" Nov 27 16:57:31 crc kubenswrapper[4954]: I1127 16:57:31.923336 4954 operation_generator.go:637] "MountVolume.SetUp 
Nov 27 16:57:32 crc kubenswrapper[4954]: I1127 16:57:32.011568 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 27 16:57:34 crc kubenswrapper[4954]: I1127 16:57:34.481971 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 27 16:57:34 crc kubenswrapper[4954]: I1127 16:57:34.483713 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 27 16:57:34 crc kubenswrapper[4954]: I1127 16:57:34.485638 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Nov 27 16:57:34 crc kubenswrapper[4954]: I1127 16:57:34.485872 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-q7v6m" Nov 27 16:57:34 crc kubenswrapper[4954]: I1127 16:57:34.486727 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Nov 27 16:57:34 crc kubenswrapper[4954]: I1127 16:57:34.500214 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Nov 27 16:57:34 crc kubenswrapper[4954]: I1127 16:57:34.503496 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 27 16:57:34 crc kubenswrapper[4954]: I1127 16:57:34.648022 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f8f5ac1-9978-4d8a-b12d-f902e9cb316c-config\") pod \"ovsdbserver-sb-0\" (UID: \"6f8f5ac1-9978-4d8a-b12d-f902e9cb316c\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:57:34 crc kubenswrapper[4954]: I1127 16:57:34.648073 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f8f5ac1-9978-4d8a-b12d-f902e9cb316c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6f8f5ac1-9978-4d8a-b12d-f902e9cb316c\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:57:34 crc kubenswrapper[4954]: I1127 16:57:34.648292 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f8f5ac1-9978-4d8a-b12d-f902e9cb316c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"6f8f5ac1-9978-4d8a-b12d-f902e9cb316c\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:57:34 crc kubenswrapper[4954]: I1127 16:57:34.648359 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6f8f5ac1-9978-4d8a-b12d-f902e9cb316c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"6f8f5ac1-9978-4d8a-b12d-f902e9cb316c\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:57:34 crc kubenswrapper[4954]: I1127 16:57:34.648444 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"6f8f5ac1-9978-4d8a-b12d-f902e9cb316c\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:57:34 crc kubenswrapper[4954]: I1127 16:57:34.648647 4954 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qmgm\" (UniqueName: \"kubernetes.io/projected/6f8f5ac1-9978-4d8a-b12d-f902e9cb316c-kube-api-access-5qmgm\") pod \"ovsdbserver-sb-0\" (UID: \"6f8f5ac1-9978-4d8a-b12d-f902e9cb316c\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:57:34 crc kubenswrapper[4954]: I1127 16:57:34.648699 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f8f5ac1-9978-4d8a-b12d-f902e9cb316c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6f8f5ac1-9978-4d8a-b12d-f902e9cb316c\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:57:34 crc kubenswrapper[4954]: I1127 16:57:34.648742 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6f8f5ac1-9978-4d8a-b12d-f902e9cb316c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"6f8f5ac1-9978-4d8a-b12d-f902e9cb316c\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:57:34 crc kubenswrapper[4954]: I1127 16:57:34.749827 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"6f8f5ac1-9978-4d8a-b12d-f902e9cb316c\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:57:34 crc kubenswrapper[4954]: I1127 16:57:34.749910 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qmgm\" (UniqueName: \"kubernetes.io/projected/6f8f5ac1-9978-4d8a-b12d-f902e9cb316c-kube-api-access-5qmgm\") pod \"ovsdbserver-sb-0\" (UID: \"6f8f5ac1-9978-4d8a-b12d-f902e9cb316c\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:57:34 crc kubenswrapper[4954]: I1127 16:57:34.749942 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f8f5ac1-9978-4d8a-b12d-f902e9cb316c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6f8f5ac1-9978-4d8a-b12d-f902e9cb316c\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:57:34 crc kubenswrapper[4954]: I1127 16:57:34.749966 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6f8f5ac1-9978-4d8a-b12d-f902e9cb316c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"6f8f5ac1-9978-4d8a-b12d-f902e9cb316c\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:57:34 crc kubenswrapper[4954]: I1127 16:57:34.750014 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f8f5ac1-9978-4d8a-b12d-f902e9cb316c-config\") pod \"ovsdbserver-sb-0\" (UID: \"6f8f5ac1-9978-4d8a-b12d-f902e9cb316c\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:57:34 crc kubenswrapper[4954]: I1127 16:57:34.750032 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f8f5ac1-9978-4d8a-b12d-f902e9cb316c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6f8f5ac1-9978-4d8a-b12d-f902e9cb316c\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:57:34 crc kubenswrapper[4954]: I1127 16:57:34.750057 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f8f5ac1-9978-4d8a-b12d-f902e9cb316c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: 
\"6f8f5ac1-9978-4d8a-b12d-f902e9cb316c\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:57:34 crc kubenswrapper[4954]: I1127 16:57:34.750077 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6f8f5ac1-9978-4d8a-b12d-f902e9cb316c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"6f8f5ac1-9978-4d8a-b12d-f902e9cb316c\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:57:34 crc kubenswrapper[4954]: I1127 16:57:34.750812 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6f8f5ac1-9978-4d8a-b12d-f902e9cb316c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"6f8f5ac1-9978-4d8a-b12d-f902e9cb316c\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:57:34 crc kubenswrapper[4954]: I1127 16:57:34.750823 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"6f8f5ac1-9978-4d8a-b12d-f902e9cb316c\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/ovsdbserver-sb-0" Nov 27 16:57:34 crc kubenswrapper[4954]: I1127 16:57:34.751172 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f8f5ac1-9978-4d8a-b12d-f902e9cb316c-config\") pod \"ovsdbserver-sb-0\" (UID: \"6f8f5ac1-9978-4d8a-b12d-f902e9cb316c\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:57:34 crc kubenswrapper[4954]: I1127 16:57:34.751346 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6f8f5ac1-9978-4d8a-b12d-f902e9cb316c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"6f8f5ac1-9978-4d8a-b12d-f902e9cb316c\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:57:34 crc kubenswrapper[4954]: I1127 16:57:34.763316 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f8f5ac1-9978-4d8a-b12d-f902e9cb316c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6f8f5ac1-9978-4d8a-b12d-f902e9cb316c\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:57:34 crc kubenswrapper[4954]: I1127 16:57:34.764690 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f8f5ac1-9978-4d8a-b12d-f902e9cb316c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"6f8f5ac1-9978-4d8a-b12d-f902e9cb316c\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:57:34 crc kubenswrapper[4954]: I1127 16:57:34.766088 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f8f5ac1-9978-4d8a-b12d-f902e9cb316c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6f8f5ac1-9978-4d8a-b12d-f902e9cb316c\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:57:34 crc kubenswrapper[4954]: I1127 16:57:34.773563 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qmgm\" (UniqueName: \"kubernetes.io/projected/6f8f5ac1-9978-4d8a-b12d-f902e9cb316c-kube-api-access-5qmgm\") pod \"ovsdbserver-sb-0\" (UID: \"6f8f5ac1-9978-4d8a-b12d-f902e9cb316c\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:57:34 crc kubenswrapper[4954]: I1127 16:57:34.774546 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"6f8f5ac1-9978-4d8a-b12d-f902e9cb316c\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:57:34 crc kubenswrapper[4954]: I1127 16:57:34.852740 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 27 16:57:39 crc kubenswrapper[4954]: E1127 16:57:39.700621 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Nov 27 16:57:39 crc kubenswrapper[4954]: E1127 16:57:39.701755 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ngxm9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-bspkf_openstack(8a2adbaa-5cd1-4563-a5d2-25fe130e90a2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 27 16:57:39 crc kubenswrapper[4954]: E1127 16:57:39.702892 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-bspkf" podUID="8a2adbaa-5cd1-4563-a5d2-25fe130e90a2" Nov 27 16:57:39 crc kubenswrapper[4954]: E1127 16:57:39.709969 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Nov 27 16:57:39 crc kubenswrapper[4954]: E1127 16:57:39.710229 
4954 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d4z4l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-9lr8q_openstack(e3bab8ba-ab5f-42d5-86cc-0c89c8b74c6d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 27 16:57:39 crc kubenswrapper[4954]: E1127 16:57:39.711534 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-9lr8q" podUID="e3bab8ba-ab5f-42d5-86cc-0c89c8b74c6d" Nov 27 16:57:39 crc kubenswrapper[4954]: E1127 16:57:39.748085 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Nov 27 16:57:39 crc kubenswrapper[4954]: E1127 16:57:39.748271 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pm46q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-z47sq_openstack(c06d93e5-61ab-4c83-8371-6f0bb226349f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 27 16:57:39 crc kubenswrapper[4954]: E1127 16:57:39.749478 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-z47sq" podUID="c06d93e5-61ab-4c83-8371-6f0bb226349f" Nov 27 16:57:39 crc kubenswrapper[4954]: E1127 16:57:39.861382 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-z47sq" podUID="c06d93e5-61ab-4c83-8371-6f0bb226349f" Nov 27 16:57:40 crc kubenswrapper[4954]: I1127 16:57:40.123206 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Nov 27 16:57:40 crc kubenswrapper[4954]: W1127 16:57:40.145907 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod808630a2_42dd_48c9_a004_749515cb771b.slice/crio-92031f0a5f87dd8e2d395e33fd58cc8e46153d98ed98b80c45dbdd3e6bdc3e3d WatchSource:0}: Error finding container 92031f0a5f87dd8e2d395e33fd58cc8e46153d98ed98b80c45dbdd3e6bdc3e3d: Status 404 returned error can't find the container with id 92031f0a5f87dd8e2d395e33fd58cc8e46153d98ed98b80c45dbdd3e6bdc3e3d Nov 27 16:57:40 crc kubenswrapper[4954]: I1127 16:57:40.352215 4954 util.go:48] "No ready sandbox for pod can be found. 
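Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-9lr8q"

The E1127 entries above show the image-pull failure path for the dnsmasq init containers: the CRI pull of openstack-neutron-server:current-podified is canceled mid-copy, kuberuntime surfaces ErrImagePull, pod_workers skips the sync, and the retry is throttled as ImagePullBackOff. A toy Go retry loop in that spirit; the backoff constants only approximate the kubelet's defaults and pullImage is a stand-in, not a real CRI call.

package main

import (
	"errors"
	"fmt"
	"time"
)

// pullImage stands in for the CRI ImageService call that failed above.
func pullImage(image string) error {
	return errors.New("rpc error: code = Canceled desc = copying config: context canceled")
}

func main() {
	image := "quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified"
	backoff := 10 * time.Second // grows toward a cap, as with ImagePullBackOff
	const max = 5 * time.Minute
	for attempt := 1; attempt <= 4; attempt++ {
		err := pullImage(image)
		if err == nil {
			fmt.Println("pulled", image)
			return
		}
		fmt.Printf("attempt %d: ErrImagePull: %v; backing off %s\n", attempt, err, backoff)
		// a real loop would time.Sleep(backoff) here before retrying
		backoff *= 2
		if backoff > max {
			backoff = max
		}
	}
	fmt.Println("giving up; pod stays in ImagePullBackOff until the image becomes pullable")
}
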
Nov 27 16:57:40 crc kubenswrapper[4954]: I1127 16:57:40.471006 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4z4l\" (UniqueName: \"kubernetes.io/projected/e3bab8ba-ab5f-42d5-86cc-0c89c8b74c6d-kube-api-access-d4z4l\") pod \"e3bab8ba-ab5f-42d5-86cc-0c89c8b74c6d\" (UID: \"e3bab8ba-ab5f-42d5-86cc-0c89c8b74c6d\") "
Nov 27 16:57:40 crc kubenswrapper[4954]: I1127 16:57:40.471145 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3bab8ba-ab5f-42d5-86cc-0c89c8b74c6d-config\") pod \"e3bab8ba-ab5f-42d5-86cc-0c89c8b74c6d\" (UID: \"e3bab8ba-ab5f-42d5-86cc-0c89c8b74c6d\") "
Nov 27 16:57:40 crc kubenswrapper[4954]: I1127 16:57:40.471224 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3bab8ba-ab5f-42d5-86cc-0c89c8b74c6d-dns-svc\") pod \"e3bab8ba-ab5f-42d5-86cc-0c89c8b74c6d\" (UID: \"e3bab8ba-ab5f-42d5-86cc-0c89c8b74c6d\") "
Nov 27 16:57:40 crc kubenswrapper[4954]: I1127 16:57:40.472106 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3bab8ba-ab5f-42d5-86cc-0c89c8b74c6d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e3bab8ba-ab5f-42d5-86cc-0c89c8b74c6d" (UID: "e3bab8ba-ab5f-42d5-86cc-0c89c8b74c6d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 27 16:57:40 crc kubenswrapper[4954]: I1127 16:57:40.473378 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3bab8ba-ab5f-42d5-86cc-0c89c8b74c6d-config" (OuterVolumeSpecName: "config") pod "e3bab8ba-ab5f-42d5-86cc-0c89c8b74c6d" (UID: "e3bab8ba-ab5f-42d5-86cc-0c89c8b74c6d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 27 16:57:40 crc kubenswrapper[4954]: I1127 16:57:40.520842 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3bab8ba-ab5f-42d5-86cc-0c89c8b74c6d-kube-api-access-d4z4l" (OuterVolumeSpecName: "kube-api-access-d4z4l") pod "e3bab8ba-ab5f-42d5-86cc-0c89c8b74c6d" (UID: "e3bab8ba-ab5f-42d5-86cc-0c89c8b74c6d"). InnerVolumeSpecName "kube-api-access-d4z4l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 27 16:57:40 crc kubenswrapper[4954]: I1127 16:57:40.550772 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-bspkf"
Nov 27 16:57:40 crc kubenswrapper[4954]: I1127 16:57:40.573368 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4z4l\" (UniqueName: \"kubernetes.io/projected/e3bab8ba-ab5f-42d5-86cc-0c89c8b74c6d-kube-api-access-d4z4l\") on node \"crc\" DevicePath \"\""
Nov 27 16:57:40 crc kubenswrapper[4954]: I1127 16:57:40.573410 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3bab8ba-ab5f-42d5-86cc-0c89c8b74c6d-config\") on node \"crc\" DevicePath \"\""
Nov 27 16:57:40 crc kubenswrapper[4954]: I1127 16:57:40.573420 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3bab8ba-ab5f-42d5-86cc-0c89c8b74c6d-dns-svc\") on node \"crc\" DevicePath \"\""
Nov 27 16:57:40 crc kubenswrapper[4954]: I1127 16:57:40.602974 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Nov 27 16:57:40 crc kubenswrapper[4954]: I1127 16:57:40.674306 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngxm9\" (UniqueName: \"kubernetes.io/projected/8a2adbaa-5cd1-4563-a5d2-25fe130e90a2-kube-api-access-ngxm9\") pod \"8a2adbaa-5cd1-4563-a5d2-25fe130e90a2\" (UID: \"8a2adbaa-5cd1-4563-a5d2-25fe130e90a2\") "
Nov 27 16:57:40 crc kubenswrapper[4954]: I1127 16:57:40.674386 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a2adbaa-5cd1-4563-a5d2-25fe130e90a2-config\") pod \"8a2adbaa-5cd1-4563-a5d2-25fe130e90a2\" (UID: \"8a2adbaa-5cd1-4563-a5d2-25fe130e90a2\") "
Nov 27 16:57:40 crc kubenswrapper[4954]: I1127 16:57:40.675365 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a2adbaa-5cd1-4563-a5d2-25fe130e90a2-config" (OuterVolumeSpecName: "config") pod "8a2adbaa-5cd1-4563-a5d2-25fe130e90a2" (UID: "8a2adbaa-5cd1-4563-a5d2-25fe130e90a2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 27 16:57:40 crc kubenswrapper[4954]: I1127 16:57:40.679793 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a2adbaa-5cd1-4563-a5d2-25fe130e90a2-kube-api-access-ngxm9" (OuterVolumeSpecName: "kube-api-access-ngxm9") pod "8a2adbaa-5cd1-4563-a5d2-25fe130e90a2" (UID: "8a2adbaa-5cd1-4563-a5d2-25fe130e90a2"). InnerVolumeSpecName "kube-api-access-ngxm9". PluginName "kubernetes.io/projected", VolumeGidValue ""
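
Note: each volume of the deleted pods walks through the same three phases above: the reconciler logs "UnmountVolume started", the operation generator confirms "UnmountVolume.TearDown succeeded" for the plugin (kubernetes.io/configmap or kubernetes.io/projected here), and the reconciler finally reports "Volume detached ... DevicePath \"\"" (the empty device path is expected: configmap and projected volumes are not block devices). A toy model of the observed ordering, not kubelet source:

    // illustrative phase model of the per-volume unmount sequence visible above.
    package main

    import "fmt"

    type unmountPhase int

    const (
        unmountStarted    unmountPhase = iota // "operationExecutor.UnmountVolume started"
        tearDownSucceeded                     // "UnmountVolume.TearDown succeeded"
        volumeDetached                        // "Volume detached ... DevicePath \"\""
    )

    func main() {
        // phases only ever advance in this order in the entries above
        for p := unmountStarted; ; p++ {
            fmt.Println(p)
            if p == volumeDetached {
                break
            }
        }
    }
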
Nov 27 16:57:40 crc kubenswrapper[4954]: I1127 16:57:40.777544 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngxm9\" (UniqueName: \"kubernetes.io/projected/8a2adbaa-5cd1-4563-a5d2-25fe130e90a2-kube-api-access-ngxm9\") on node \"crc\" DevicePath \"\""
Nov 27 16:57:40 crc kubenswrapper[4954]: I1127 16:57:40.777616 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a2adbaa-5cd1-4563-a5d2-25fe130e90a2-config\") on node \"crc\" DevicePath \"\""
Nov 27 16:57:40 crc kubenswrapper[4954]: I1127 16:57:40.798096 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-btgpk"]
Nov 27 16:57:40 crc kubenswrapper[4954]: W1127 16:57:40.800102 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podabad518f_43af_457b_add5_c0291513ad71.slice/crio-75819c2b346af7a5ed983ca8545a5ab3dcbe38eb731407b25002057d5f8929c2 WatchSource:0}: Error finding container 75819c2b346af7a5ed983ca8545a5ab3dcbe38eb731407b25002057d5f8929c2: Status 404 returned error can't find the container with id 75819c2b346af7a5ed983ca8545a5ab3dcbe38eb731407b25002057d5f8929c2
Nov 27 16:57:40 crc kubenswrapper[4954]: I1127 16:57:40.865272 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-btgpk" event={"ID":"abad518f-43af-457b-add5-c0291513ad71","Type":"ContainerStarted","Data":"75819c2b346af7a5ed983ca8545a5ab3dcbe38eb731407b25002057d5f8929c2"}
Nov 27 16:57:40 crc kubenswrapper[4954]: I1127 16:57:40.867984 4954 generic.go:334] "Generic (PLEG): container finished" podID="644e4b3e-4237-4179-8775-63cde7f94338" containerID="b1f2eaab9a5ed1abaf1286b1d6949f8f8e96735d163d7d6922e3cdaa7fabe3a4" exitCode=0
Nov 27 16:57:40 crc kubenswrapper[4954]: I1127 16:57:40.868024 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-nnk5z" event={"ID":"644e4b3e-4237-4179-8775-63cde7f94338","Type":"ContainerDied","Data":"b1f2eaab9a5ed1abaf1286b1d6949f8f8e96735d163d7d6922e3cdaa7fabe3a4"}
Nov 27 16:57:40 crc kubenswrapper[4954]: I1127 16:57:40.870287 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"808630a2-42dd-48c9-a004-749515cb771b","Type":"ContainerStarted","Data":"92031f0a5f87dd8e2d395e33fd58cc8e46153d98ed98b80c45dbdd3e6bdc3e3d"}
Nov 27 16:57:40 crc kubenswrapper[4954]: I1127 16:57:40.871880 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-9lr8q" event={"ID":"e3bab8ba-ab5f-42d5-86cc-0c89c8b74c6d","Type":"ContainerDied","Data":"0b12164c42c89ec06e54f46382f37650c8924e845508a5544593ff255e364155"}
Nov 27 16:57:40 crc kubenswrapper[4954]: I1127 16:57:40.871908 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-9lr8q"
Nov 27 16:57:40 crc kubenswrapper[4954]: I1127 16:57:40.881382 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-bspkf" event={"ID":"8a2adbaa-5cd1-4563-a5d2-25fe130e90a2","Type":"ContainerDied","Data":"a372427ed285e86879adcaa4cd7cf5c65a4513a483fd99c4b29a7576e0cc9162"}
Nov 27 16:57:40 crc kubenswrapper[4954]: I1127 16:57:40.881522 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-bspkf"
Nov 27 16:57:40 crc kubenswrapper[4954]: I1127 16:57:40.908741 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"70949f64-380f-4947-a55a-8780126c7ba4","Type":"ContainerStarted","Data":"2da83980add47da06eafdbe9ac1cfdf9a05941ca38fef6bbf81346394596ac87"}
Nov 27 16:57:40 crc kubenswrapper[4954]: I1127 16:57:40.923989 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Nov 27 16:57:41 crc kubenswrapper[4954]: I1127 16:57:41.029735 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-9lr8q"]
Nov 27 16:57:41 crc kubenswrapper[4954]: I1127 16:57:41.058684 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-9lr8q"]
Nov 27 16:57:41 crc kubenswrapper[4954]: I1127 16:57:41.096733 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Nov 27 16:57:41 crc kubenswrapper[4954]: I1127 16:57:41.105342 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Nov 27 16:57:41 crc kubenswrapper[4954]: I1127 16:57:41.150881 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bspkf"]
Nov 27 16:57:41 crc kubenswrapper[4954]: I1127 16:57:41.161833 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bspkf"]
Nov 27 16:57:41 crc kubenswrapper[4954]: I1127 16:57:41.175042 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Nov 27 16:57:41 crc kubenswrapper[4954]: I1127 16:57:41.182380 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-s7sc8"]
Nov 27 16:57:41 crc kubenswrapper[4954]: I1127 16:57:41.187430 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Nov 27 16:57:41 crc kubenswrapper[4954]: I1127 16:57:41.796241 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Nov 27 16:57:41 crc kubenswrapper[4954]: I1127 16:57:41.924073 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"591d8033-08c2-4048-b24e-34508babfbad","Type":"ContainerStarted","Data":"fe380ac22f70eecf9b0bd2828c5598c424e0ddded124769767bb6cea757bf4bb"}
Nov 27 16:57:41 crc kubenswrapper[4954]: I1127 16:57:41.926403 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"37b16922-ac4b-4c0f-bf9c-444474fe1e08","Type":"ContainerStarted","Data":"38a9b5751e9900ed4baa2ab57a7c849281afa825b2eac1b101b46d798f857244"}
Nov 27 16:57:41 crc kubenswrapper[4954]: I1127 16:57:41.930482 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c6a94711-3e04-42e3-9ec3-6487f0dd3a3f","Type":"ContainerStarted","Data":"a4ba405ace0565e8e77cb3955b4a2802005675ec9645d80dee7c6897b58bd55e"}
Nov 27 16:57:41 crc kubenswrapper[4954]: I1127 16:57:41.933036 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"6f8f5ac1-9978-4d8a-b12d-f902e9cb316c","Type":"ContainerStarted","Data":"1ee6f94d09df47ef6bc410ad501764225def446cfe7b12100a90b0e0a3dc4825"}
Nov 27 16:57:41 crc kubenswrapper[4954]: I1127 16:57:41.934179 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s7sc8" event={"ID":"2a98905f-a2dd-4eb2-9a4f-437eb3626871","Type":"ContainerStarted","Data":"9d0ed3e78d9b8fd32dd204ec39127d8be507ee6d5f6fbbc007f1ba2d05e257c3"}
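
Note: the "SyncLoop" sources above are the kubelet's config stream: ADD/UPDATE arrive from the API watch, DELETE is the graceful-deletion update (deletionTimestamp set), and REMOVE fires once the pod object is actually gone, which is why each replaced dnsmasq pod shows DELETE followed by REMOVE only after its containers and volumes are cleaned up; the "(PLEG): event for pod" lines are the runtime-side view of the same pods, relisted from CRI. A minimal sketch of watching that same API event stream from outside (reuses cs from the earlier sketch; illustrative, not the kubelet's own code path):

    // sketch: observe pod ADDED/MODIFIED/DELETED events for the namespace.
    w, err := cs.CoreV1().Pods("openstack").Watch(context.TODO(), metav1.ListOptions{})
    if err != nil {
        panic(err)
    }
    defer w.Stop()
    for ev := range w.ResultChan() {
        fmt.Println(ev.Type) // ADDED, MODIFIED, DELETED
    }
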
event={"ID":"2a98905f-a2dd-4eb2-9a4f-437eb3626871","Type":"ContainerStarted","Data":"9d0ed3e78d9b8fd32dd204ec39127d8be507ee6d5f6fbbc007f1ba2d05e257c3"} Nov 27 16:57:41 crc kubenswrapper[4954]: I1127 16:57:41.935388 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac","Type":"ContainerStarted","Data":"6f19961442f524a9f74135b21405ee7fadd302c4ba0190548ae60a4305882b83"} Nov 27 16:57:41 crc kubenswrapper[4954]: I1127 16:57:41.949902 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-nnk5z" event={"ID":"644e4b3e-4237-4179-8775-63cde7f94338","Type":"ContainerStarted","Data":"3ad4afc8238191c70f11816ed77f9229615fb78f43db6c58aa1b400f777eb469"} Nov 27 16:57:41 crc kubenswrapper[4954]: I1127 16:57:41.950085 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-nnk5z" Nov 27 16:57:41 crc kubenswrapper[4954]: I1127 16:57:41.970721 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-nnk5z" podStartSLOduration=7.800755741 podStartE2EDuration="20.970699065s" podCreationTimestamp="2025-11-27 16:57:21 +0000 UTC" firstStartedPulling="2025-11-27 16:57:26.74365964 +0000 UTC m=+1158.761099960" lastFinishedPulling="2025-11-27 16:57:39.913602984 +0000 UTC m=+1171.931043284" observedRunningTime="2025-11-27 16:57:41.964710959 +0000 UTC m=+1173.982151259" watchObservedRunningTime="2025-11-27 16:57:41.970699065 +0000 UTC m=+1173.988139365" Nov 27 16:57:42 crc kubenswrapper[4954]: I1127 16:57:42.671673 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a2adbaa-5cd1-4563-a5d2-25fe130e90a2" path="/var/lib/kubelet/pods/8a2adbaa-5cd1-4563-a5d2-25fe130e90a2/volumes" Nov 27 16:57:42 crc kubenswrapper[4954]: I1127 16:57:42.672034 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3bab8ba-ab5f-42d5-86cc-0c89c8b74c6d" path="/var/lib/kubelet/pods/e3bab8ba-ab5f-42d5-86cc-0c89c8b74c6d/volumes" Nov 27 16:57:42 crc kubenswrapper[4954]: I1127 16:57:42.958561 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"585214bf-1a7b-426d-b1a6-d26e69e0116f","Type":"ContainerStarted","Data":"d9a9900a304c2a3a5c46d8e783a8505601a17d99989b05e14cc1cef906b29f8a"} Nov 27 16:57:46 crc kubenswrapper[4954]: I1127 16:57:46.776407 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-nnk5z" Nov 27 16:57:46 crc kubenswrapper[4954]: I1127 16:57:46.841617 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-z47sq"] Nov 27 16:57:48 crc kubenswrapper[4954]: I1127 16:57:48.365166 4954 util.go:48] "No ready sandbox for pod can be found. 
Nov 27 16:57:48 crc kubenswrapper[4954]: I1127 16:57:48.522607 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c06d93e5-61ab-4c83-8371-6f0bb226349f-config\") pod \"c06d93e5-61ab-4c83-8371-6f0bb226349f\" (UID: \"c06d93e5-61ab-4c83-8371-6f0bb226349f\") "
Nov 27 16:57:48 crc kubenswrapper[4954]: I1127 16:57:48.522710 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c06d93e5-61ab-4c83-8371-6f0bb226349f-dns-svc\") pod \"c06d93e5-61ab-4c83-8371-6f0bb226349f\" (UID: \"c06d93e5-61ab-4c83-8371-6f0bb226349f\") "
Nov 27 16:57:48 crc kubenswrapper[4954]: I1127 16:57:48.522760 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pm46q\" (UniqueName: \"kubernetes.io/projected/c06d93e5-61ab-4c83-8371-6f0bb226349f-kube-api-access-pm46q\") pod \"c06d93e5-61ab-4c83-8371-6f0bb226349f\" (UID: \"c06d93e5-61ab-4c83-8371-6f0bb226349f\") "
Nov 27 16:57:48 crc kubenswrapper[4954]: I1127 16:57:48.523458 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c06d93e5-61ab-4c83-8371-6f0bb226349f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c06d93e5-61ab-4c83-8371-6f0bb226349f" (UID: "c06d93e5-61ab-4c83-8371-6f0bb226349f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 27 16:57:48 crc kubenswrapper[4954]: I1127 16:57:48.524197 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c06d93e5-61ab-4c83-8371-6f0bb226349f-config" (OuterVolumeSpecName: "config") pod "c06d93e5-61ab-4c83-8371-6f0bb226349f" (UID: "c06d93e5-61ab-4c83-8371-6f0bb226349f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 27 16:57:48 crc kubenswrapper[4954]: I1127 16:57:48.529549 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c06d93e5-61ab-4c83-8371-6f0bb226349f-kube-api-access-pm46q" (OuterVolumeSpecName: "kube-api-access-pm46q") pod "c06d93e5-61ab-4c83-8371-6f0bb226349f" (UID: "c06d93e5-61ab-4c83-8371-6f0bb226349f"). InnerVolumeSpecName "kube-api-access-pm46q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 27 16:57:48 crc kubenswrapper[4954]: I1127 16:57:48.624587 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c06d93e5-61ab-4c83-8371-6f0bb226349f-config\") on node \"crc\" DevicePath \"\""
Nov 27 16:57:48 crc kubenswrapper[4954]: I1127 16:57:48.624629 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c06d93e5-61ab-4c83-8371-6f0bb226349f-dns-svc\") on node \"crc\" DevicePath \"\""
Nov 27 16:57:48 crc kubenswrapper[4954]: I1127 16:57:48.624643 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pm46q\" (UniqueName: \"kubernetes.io/projected/c06d93e5-61ab-4c83-8371-6f0bb226349f-kube-api-access-pm46q\") on node \"crc\" DevicePath \"\""
Nov 27 16:57:49 crc kubenswrapper[4954]: I1127 16:57:49.003302 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-z47sq" event={"ID":"c06d93e5-61ab-4c83-8371-6f0bb226349f","Type":"ContainerDied","Data":"26e19c4d882e01c9cafba02ccf82c35076346bffba238e1eeb4d173fe8cbc546"}
Nov 27 16:57:49 crc kubenswrapper[4954]: I1127 16:57:49.003395 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-z47sq"
Nov 27 16:57:49 crc kubenswrapper[4954]: I1127 16:57:49.054815 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-z47sq"]
Nov 27 16:57:49 crc kubenswrapper[4954]: I1127 16:57:49.062709 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-z47sq"]
Nov 27 16:57:50 crc kubenswrapper[4954]: I1127 16:57:50.671876 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c06d93e5-61ab-4c83-8371-6f0bb226349f" path="/var/lib/kubelet/pods/c06d93e5-61ab-4c83-8371-6f0bb226349f/volumes"
Nov 27 16:57:51 crc kubenswrapper[4954]: I1127 16:57:51.022380 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c6a94711-3e04-42e3-9ec3-6487f0dd3a3f","Type":"ContainerStarted","Data":"e2ad6e0434745a4a771fb2b29503ecf1330028f7e41696c028b955ac584e2a23"}
Nov 27 16:57:51 crc kubenswrapper[4954]: I1127 16:57:51.022919 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Nov 27 16:57:51 crc kubenswrapper[4954]: I1127 16:57:51.024894 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"6f8f5ac1-9978-4d8a-b12d-f902e9cb316c","Type":"ContainerStarted","Data":"192123b846f06188765ac07c048cde90acf390422b43d2b5d0259d013846d2be"}
Nov 27 16:57:51 crc kubenswrapper[4954]: I1127 16:57:51.027208 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s7sc8" event={"ID":"2a98905f-a2dd-4eb2-9a4f-437eb3626871","Type":"ContainerStarted","Data":"cb12410cc06f4479fb5a52ab0c72f26949389bb706ddd5f1112c1e4e69151604"}
Nov 27 16:57:51 crc kubenswrapper[4954]: I1127 16:57:51.027396 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-s7sc8"
Nov 27 16:57:51 crc kubenswrapper[4954]: I1127 16:57:51.029109 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac","Type":"ContainerStarted","Data":"3195b5ede1259bd7a96be20b42de3f24bb065c4fc6ee3b64ff33503591d2ddb7"}
Nov 27 16:57:51 crc kubenswrapper[4954]: I1127 16:57:51.031836 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"585214bf-1a7b-426d-b1a6-d26e69e0116f","Type":"ContainerStarted","Data":"a7f2c3f6a10ac2f7a2081cb215a1acc7b93f81e816acb27afcc8ddfb32377b24"}
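
Note: "Cleaned up orphaned pod volumes dir" is the kubelet's housekeeping removing /var/lib/kubelet/pods/<podUID>/volumes once every volume of a deleted pod has been torn down; it trails the "Volume detached" entries above by a second or two. A trivial node-side check that the directory is really gone (path format taken from the log; purely illustrative):

    package main

    import (
        "errors"
        "fmt"
        "os"
    )

    func main() {
        dir := "/var/lib/kubelet/pods/c06d93e5-61ab-4c83-8371-6f0bb226349f/volumes"
        if _, err := os.Stat(dir); errors.Is(err, os.ErrNotExist) {
            fmt.Println("cleaned up:", dir)
        } else {
            fmt.Println("still present (or other error):", dir)
        }
    }
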
Nov 27 16:57:51 crc kubenswrapper[4954]: I1127 16:57:51.034182 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-btgpk" event={"ID":"abad518f-43af-457b-add5-c0291513ad71","Type":"ContainerStarted","Data":"c4a69f67e9cbcf06dfc89d6ec0d90e54761f9ead1b7eed621ff0e9b53f10a8bd"}
Nov 27 16:57:51 crc kubenswrapper[4954]: I1127 16:57:51.037446 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"591d8033-08c2-4048-b24e-34508babfbad","Type":"ContainerStarted","Data":"d0a1ac2512cc5b5f4ce3c345ca41bd7b4f3c93e9837775c80ee564afa57012a0"}
Nov 27 16:57:51 crc kubenswrapper[4954]: I1127 16:57:51.047141 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"808630a2-42dd-48c9-a004-749515cb771b","Type":"ContainerStarted","Data":"002c55881c13289c2ff855f4cd4d8a1c28700a84cf2a51331ee1dd84336163e3"}
Nov 27 16:57:51 crc kubenswrapper[4954]: I1127 16:57:51.047309 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Nov 27 16:57:51 crc kubenswrapper[4954]: I1127 16:57:51.052534 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=14.706458431 podStartE2EDuration="24.052506193s" podCreationTimestamp="2025-11-27 16:57:27 +0000 UTC" firstStartedPulling="2025-11-27 16:57:41.100080529 +0000 UTC m=+1173.117520829" lastFinishedPulling="2025-11-27 16:57:50.446128291 +0000 UTC m=+1182.463568591" observedRunningTime="2025-11-27 16:57:51.039447196 +0000 UTC m=+1183.056887506" watchObservedRunningTime="2025-11-27 16:57:51.052506193 +0000 UTC m=+1183.069946503"
Nov 27 16:57:51 crc kubenswrapper[4954]: I1127 16:57:51.162028 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-s7sc8" podStartSLOduration=11.845630246 podStartE2EDuration="20.161555394s" podCreationTimestamp="2025-11-27 16:57:31 +0000 UTC" firstStartedPulling="2025-11-27 16:57:41.151637273 +0000 UTC m=+1173.169077573" lastFinishedPulling="2025-11-27 16:57:49.467562421 +0000 UTC m=+1181.485002721" observedRunningTime="2025-11-27 16:57:51.157076575 +0000 UTC m=+1183.174516885" watchObservedRunningTime="2025-11-27 16:57:51.161555394 +0000 UTC m=+1183.178995694"
Nov 27 16:57:51 crc kubenswrapper[4954]: I1127 16:57:51.176272 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=17.571801568 podStartE2EDuration="26.176245541s" podCreationTimestamp="2025-11-27 16:57:25 +0000 UTC" firstStartedPulling="2025-11-27 16:57:40.156054379 +0000 UTC m=+1172.173494679" lastFinishedPulling="2025-11-27 16:57:48.760498352 +0000 UTC m=+1180.777938652" observedRunningTime="2025-11-27 16:57:51.17334539 +0000 UTC m=+1183.190785710" watchObservedRunningTime="2025-11-27 16:57:51.176245541 +0000 UTC m=+1183.193685841"
Nov 27 16:57:52 crc kubenswrapper[4954]: I1127 16:57:52.057842 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"70949f64-380f-4947-a55a-8780126c7ba4","Type":"ContainerStarted","Data":"36df7c7eb591cb47cfc65798bc7acff77ecfcbcf7991a576639fa7c256680166"}
Nov 27 16:57:52 crc kubenswrapper[4954]: I1127 16:57:52.061683 4954 generic.go:334] "Generic (PLEG): container finished" podID="abad518f-43af-457b-add5-c0291513ad71" containerID="c4a69f67e9cbcf06dfc89d6ec0d90e54761f9ead1b7eed621ff0e9b53f10a8bd" exitCode=0
Nov 27 16:57:52 crc kubenswrapper[4954]: I1127 16:57:52.061756 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-btgpk" event={"ID":"abad518f-43af-457b-add5-c0291513ad71","Type":"ContainerDied","Data":"c4a69f67e9cbcf06dfc89d6ec0d90e54761f9ead1b7eed621ff0e9b53f10a8bd"}
Nov 27 16:57:52 crc kubenswrapper[4954]: I1127 16:57:52.063723 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"37b16922-ac4b-4c0f-bf9c-444474fe1e08","Type":"ContainerStarted","Data":"445f8d4ba8edbf32d835aee9867360a8ad19116e7317ee02107d314c316b88c3"}
Nov 27 16:57:53 crc kubenswrapper[4954]: I1127 16:57:53.076853 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-btgpk" event={"ID":"abad518f-43af-457b-add5-c0291513ad71","Type":"ContainerStarted","Data":"ecbb0727c71fff308fbfb2a0d1b42be8c09824bd7619af8de87eba48bcbf2f6a"}
Nov 27 16:57:53 crc kubenswrapper[4954]: I1127 16:57:53.077395 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-btgpk" event={"ID":"abad518f-43af-457b-add5-c0291513ad71","Type":"ContainerStarted","Data":"e0b7429268a5918af96d399a1f7fd5e852f63fc146410642526289333d29077f"}
Nov 27 16:57:53 crc kubenswrapper[4954]: I1127 16:57:53.111464 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-btgpk" podStartSLOduration=13.608973615 podStartE2EDuration="22.111434788s" podCreationTimestamp="2025-11-27 16:57:31 +0000 UTC" firstStartedPulling="2025-11-27 16:57:40.804305609 +0000 UTC m=+1172.821745909" lastFinishedPulling="2025-11-27 16:57:49.306766772 +0000 UTC m=+1181.324207082" observedRunningTime="2025-11-27 16:57:53.098138624 +0000 UTC m=+1185.115578934" watchObservedRunningTime="2025-11-27 16:57:53.111434788 +0000 UTC m=+1185.128875088"
Nov 27 16:57:53 crc kubenswrapper[4954]: I1127 16:57:53.687421 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 27 16:57:53 crc kubenswrapper[4954]: I1127 16:57:53.687511 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 27 16:57:54 crc kubenswrapper[4954]: I1127 16:57:54.084108 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-btgpk"
Nov 27 16:57:54 crc kubenswrapper[4954]: I1127 16:57:54.084489 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-btgpk"
Nov 27 16:57:55 crc kubenswrapper[4954]: I1127 16:57:55.094927 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"6f8f5ac1-9978-4d8a-b12d-f902e9cb316c","Type":"ContainerStarted","Data":"2ac108968283e9803ca32632074cd9af0989412447947ae03b2356e5aa46c195"}
Nov 27 16:57:55 crc kubenswrapper[4954]: I1127 16:57:55.098313 4954 generic.go:334] "Generic (PLEG): container finished" podID="3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac" containerID="3195b5ede1259bd7a96be20b42de3f24bb065c4fc6ee3b64ff33503591d2ddb7" exitCode=0
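
Note: the machine-config-daemon liveness failure above is a plain HTTP probe: the kubelet GETs http://127.0.0.1:8798/health (127.0.0.1 because the daemon shares the host network) and gets "connection refused", i.e. nothing was listening at that instant; a single failure only restarts the container once it repeats failureThreshold times in a row. Roughly what the prober does, as a standalone snippet (URL from the log; the timeout value is illustrative):

    package main

    import (
        "fmt"
        "net/http"
        "time"
    )

    func main() {
        client := &http.Client{Timeout: time.Second}
        resp, err := client.Get("http://127.0.0.1:8798/health")
        if err != nil {
            fmt.Println("probe failure:", err) // e.g. "connect: connection refused"
            return
        }
        defer resp.Body.Close()
        fmt.Println("probe status:", resp.Status) // a 2xx/3xx status counts as success
    }
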
finished" podID="3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac" containerID="3195b5ede1259bd7a96be20b42de3f24bb065c4fc6ee3b64ff33503591d2ddb7" exitCode=0 Nov 27 16:57:55 crc kubenswrapper[4954]: I1127 16:57:55.098407 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac","Type":"ContainerDied","Data":"3195b5ede1259bd7a96be20b42de3f24bb065c4fc6ee3b64ff33503591d2ddb7"} Nov 27 16:57:55 crc kubenswrapper[4954]: I1127 16:57:55.100231 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"585214bf-1a7b-426d-b1a6-d26e69e0116f","Type":"ContainerStarted","Data":"ec181ce05c540ae383739705637b389f933c93490149f6a329311d0c69e119d0"} Nov 27 16:57:55 crc kubenswrapper[4954]: I1127 16:57:55.101424 4954 generic.go:334] "Generic (PLEG): container finished" podID="591d8033-08c2-4048-b24e-34508babfbad" containerID="d0a1ac2512cc5b5f4ce3c345ca41bd7b4f3c93e9837775c80ee564afa57012a0" exitCode=0 Nov 27 16:57:55 crc kubenswrapper[4954]: I1127 16:57:55.101650 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"591d8033-08c2-4048-b24e-34508babfbad","Type":"ContainerDied","Data":"d0a1ac2512cc5b5f4ce3c345ca41bd7b4f3c93e9837775c80ee564afa57012a0"} Nov 27 16:57:55 crc kubenswrapper[4954]: I1127 16:57:55.116111 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=8.869986397 podStartE2EDuration="22.116093692s" podCreationTimestamp="2025-11-27 16:57:33 +0000 UTC" firstStartedPulling="2025-11-27 16:57:41.169675722 +0000 UTC m=+1173.187116022" lastFinishedPulling="2025-11-27 16:57:54.415783017 +0000 UTC m=+1186.433223317" observedRunningTime="2025-11-27 16:57:55.115369515 +0000 UTC m=+1187.132809815" watchObservedRunningTime="2025-11-27 16:57:55.116093692 +0000 UTC m=+1187.133533992" Nov 27 16:57:55 crc kubenswrapper[4954]: I1127 16:57:55.169959 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=13.565758213 podStartE2EDuration="25.169941672s" podCreationTimestamp="2025-11-27 16:57:30 +0000 UTC" firstStartedPulling="2025-11-27 16:57:42.821050358 +0000 UTC m=+1174.838490658" lastFinishedPulling="2025-11-27 16:57:54.425233817 +0000 UTC m=+1186.442674117" observedRunningTime="2025-11-27 16:57:55.163640638 +0000 UTC m=+1187.181080948" watchObservedRunningTime="2025-11-27 16:57:55.169941672 +0000 UTC m=+1187.187381972" Nov 27 16:57:55 crc kubenswrapper[4954]: I1127 16:57:55.853982 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Nov 27 16:57:55 crc kubenswrapper[4954]: I1127 16:57:55.916543 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Nov 27 16:57:55 crc kubenswrapper[4954]: I1127 16:57:55.969883 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.012237 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.067040 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.111665 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac","Type":"ContainerStarted","Data":"7b519fa1a32f823e8018107d1dbc00250acc863ff6c5068d8d6c1346e809b659"} Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.114279 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"591d8033-08c2-4048-b24e-34508babfbad","Type":"ContainerStarted","Data":"1d7ae19d496126029c39d951e707539009bd085d15866c697295de76542ea19e"} Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.114364 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.114910 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.137434 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=23.860629025 podStartE2EDuration="32.137417442s" podCreationTimestamp="2025-11-27 16:57:24 +0000 UTC" firstStartedPulling="2025-11-27 16:57:41.102516989 +0000 UTC m=+1173.119957289" lastFinishedPulling="2025-11-27 16:57:49.379305396 +0000 UTC m=+1181.396745706" observedRunningTime="2025-11-27 16:57:56.13320947 +0000 UTC m=+1188.150649770" watchObservedRunningTime="2025-11-27 16:57:56.137417442 +0000 UTC m=+1188.154857742" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.162806 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=25.736040036 podStartE2EDuration="34.162787579s" podCreationTimestamp="2025-11-27 16:57:22 +0000 UTC" firstStartedPulling="2025-11-27 16:57:41.040420559 +0000 UTC m=+1173.057860859" lastFinishedPulling="2025-11-27 16:57:49.467168092 +0000 UTC m=+1181.484608402" observedRunningTime="2025-11-27 16:57:56.156919007 +0000 UTC m=+1188.174359307" watchObservedRunningTime="2025-11-27 16:57:56.162787579 +0000 UTC m=+1188.180227879" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.174222 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.175438 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.477214 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-bgvc6"] Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.478485 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-bgvc6" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.481312 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.510837 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-bgvc6"] Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.550422 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-wnm29"] Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.551453 4954 util.go:30] "No sandbox for pod can be found. 
Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.557868 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.587097 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-wnm29"]
Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.615445 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb30983b-ac35-4d76-b97e-2aaa70860a13-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-bgvc6\" (UID: \"eb30983b-ac35-4d76-b97e-2aaa70860a13\") " pod="openstack/dnsmasq-dns-5bf47b49b7-bgvc6"
Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.615510 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb30983b-ac35-4d76-b97e-2aaa70860a13-config\") pod \"dnsmasq-dns-5bf47b49b7-bgvc6\" (UID: \"eb30983b-ac35-4d76-b97e-2aaa70860a13\") " pod="openstack/dnsmasq-dns-5bf47b49b7-bgvc6"
Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.615594 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb30983b-ac35-4d76-b97e-2aaa70860a13-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-bgvc6\" (UID: \"eb30983b-ac35-4d76-b97e-2aaa70860a13\") " pod="openstack/dnsmasq-dns-5bf47b49b7-bgvc6"
Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.615635 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v62hf\" (UniqueName: \"kubernetes.io/projected/eb30983b-ac35-4d76-b97e-2aaa70860a13-kube-api-access-v62hf\") pod \"dnsmasq-dns-5bf47b49b7-bgvc6\" (UID: \"eb30983b-ac35-4d76-b97e-2aaa70860a13\") " pod="openstack/dnsmasq-dns-5bf47b49b7-bgvc6"
Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.665027 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-bgvc6"]
Nov 27 16:57:56 crc kubenswrapper[4954]: E1127 16:57:56.667275 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-v62hf ovsdbserver-nb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-5bf47b49b7-bgvc6" podUID="eb30983b-ac35-4d76-b97e-2aaa70860a13"
Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.701318 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.702388 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.703342 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
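
Note: dnsmasq-dns-5bf47b49b7-bgvc6 above is ADDed at 16:57:56.477 and DELETEd about 190ms later, before its volumes finish mounting, so the per-pod worker's context is cancelled and the sync fails with "unmounted volumes=[...]: context canceled"; that is benign churn, consistent with the dnsmasq Deployment being re-templated (the 8554648995-s4c94 replacement appears within the same second below). In Go terms the worker simply observes a cancelled context:

    // illustrative only: syncPod stands in for the kubelet's per-pod sync,
    // which is not shown in this log.
    package main

    import (
        "context"
        "errors"
        "fmt"
    )

    func syncPod(ctx context.Context) error { return ctx.Err() }

    func main() {
        ctx, cancel := context.WithCancel(context.Background())
        cancel() // the pod was deleted mid-sync
        if err := syncPod(ctx); errors.Is(err, context.Canceled) {
            fmt.Println("pod deleted mid-sync; dropping work item")
        }
    }
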
Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.706730 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.707358 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.707558 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.707745 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-546vg"
Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.709682 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-s4c94"]
Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.715919 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-s4c94"
Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.717092 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49863f24-1603-49e2-835c-31ced01d9f7f-combined-ca-bundle\") pod \"ovn-controller-metrics-wnm29\" (UID: \"49863f24-1603-49e2-835c-31ced01d9f7f\") " pod="openstack/ovn-controller-metrics-wnm29"
Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.717135 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/49863f24-1603-49e2-835c-31ced01d9f7f-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-wnm29\" (UID: \"49863f24-1603-49e2-835c-31ced01d9f7f\") " pod="openstack/ovn-controller-metrics-wnm29"
Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.717170 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb30983b-ac35-4d76-b97e-2aaa70860a13-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-bgvc6\" (UID: \"eb30983b-ac35-4d76-b97e-2aaa70860a13\") " pod="openstack/dnsmasq-dns-5bf47b49b7-bgvc6"
Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.717188 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wbvr\" (UniqueName: \"kubernetes.io/projected/49863f24-1603-49e2-835c-31ced01d9f7f-kube-api-access-7wbvr\") pod \"ovn-controller-metrics-wnm29\" (UID: \"49863f24-1603-49e2-835c-31ced01d9f7f\") " pod="openstack/ovn-controller-metrics-wnm29"
Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.717211 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/49863f24-1603-49e2-835c-31ced01d9f7f-ovs-rundir\") pod \"ovn-controller-metrics-wnm29\" (UID: \"49863f24-1603-49e2-835c-31ced01d9f7f\") " pod="openstack/ovn-controller-metrics-wnm29"
Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.717239 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v62hf\" (UniqueName: \"kubernetes.io/projected/eb30983b-ac35-4d76-b97e-2aaa70860a13-kube-api-access-v62hf\") pod \"dnsmasq-dns-5bf47b49b7-bgvc6\" (UID: \"eb30983b-ac35-4d76-b97e-2aaa70860a13\") " pod="openstack/dnsmasq-dns-5bf47b49b7-bgvc6"
Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.717257 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/49863f24-1603-49e2-835c-31ced01d9f7f-ovn-rundir\") pod \"ovn-controller-metrics-wnm29\" (UID: \"49863f24-1603-49e2-835c-31ced01d9f7f\") " pod="openstack/ovn-controller-metrics-wnm29"
Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.717300 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb30983b-ac35-4d76-b97e-2aaa70860a13-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-bgvc6\" (UID: \"eb30983b-ac35-4d76-b97e-2aaa70860a13\") " pod="openstack/dnsmasq-dns-5bf47b49b7-bgvc6"
Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.717320 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49863f24-1603-49e2-835c-31ced01d9f7f-config\") pod \"ovn-controller-metrics-wnm29\" (UID: \"49863f24-1603-49e2-835c-31ced01d9f7f\") " pod="openstack/ovn-controller-metrics-wnm29"
Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.717343 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb30983b-ac35-4d76-b97e-2aaa70860a13-config\") pod \"dnsmasq-dns-5bf47b49b7-bgvc6\" (UID: \"eb30983b-ac35-4d76-b97e-2aaa70860a13\") " pod="openstack/dnsmasq-dns-5bf47b49b7-bgvc6"
Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.718289 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb30983b-ac35-4d76-b97e-2aaa70860a13-config\") pod \"dnsmasq-dns-5bf47b49b7-bgvc6\" (UID: \"eb30983b-ac35-4d76-b97e-2aaa70860a13\") " pod="openstack/dnsmasq-dns-5bf47b49b7-bgvc6"
Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.718664 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb30983b-ac35-4d76-b97e-2aaa70860a13-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-bgvc6\" (UID: \"eb30983b-ac35-4d76-b97e-2aaa70860a13\") " pod="openstack/dnsmasq-dns-5bf47b49b7-bgvc6"
Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.718913 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb30983b-ac35-4d76-b97e-2aaa70860a13-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-bgvc6\" (UID: \"eb30983b-ac35-4d76-b97e-2aaa70860a13\") " pod="openstack/dnsmasq-dns-5bf47b49b7-bgvc6"
Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.729117 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.754657 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v62hf\" (UniqueName: \"kubernetes.io/projected/eb30983b-ac35-4d76-b97e-2aaa70860a13-kube-api-access-v62hf\") pod \"dnsmasq-dns-5bf47b49b7-bgvc6\" (UID: \"eb30983b-ac35-4d76-b97e-2aaa70860a13\") " pod="openstack/dnsmasq-dns-5bf47b49b7-bgvc6"
Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.782192 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-s4c94"]
Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.819194 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/49863f24-1603-49e2-835c-31ced01d9f7f-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-wnm29\" (UID: \"49863f24-1603-49e2-835c-31ced01d9f7f\") " pod="openstack/ovn-controller-metrics-wnm29"
\"kubernetes.io/secret/49863f24-1603-49e2-835c-31ced01d9f7f-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-wnm29\" (UID: \"49863f24-1603-49e2-835c-31ced01d9f7f\") " pod="openstack/ovn-controller-metrics-wnm29" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.819242 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0f43cc8-5d12-4705-9ef8-06e5e03f7147-dns-svc\") pod \"dnsmasq-dns-8554648995-s4c94\" (UID: \"b0f43cc8-5d12-4705-9ef8-06e5e03f7147\") " pod="openstack/dnsmasq-dns-8554648995-s4c94" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.819284 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wbvr\" (UniqueName: \"kubernetes.io/projected/49863f24-1603-49e2-835c-31ced01d9f7f-kube-api-access-7wbvr\") pod \"ovn-controller-metrics-wnm29\" (UID: \"49863f24-1603-49e2-835c-31ced01d9f7f\") " pod="openstack/ovn-controller-metrics-wnm29" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.819308 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0f43cc8-5d12-4705-9ef8-06e5e03f7147-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-s4c94\" (UID: \"b0f43cc8-5d12-4705-9ef8-06e5e03f7147\") " pod="openstack/dnsmasq-dns-8554648995-s4c94" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.819331 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/49863f24-1603-49e2-835c-31ced01d9f7f-ovs-rundir\") pod \"ovn-controller-metrics-wnm29\" (UID: \"49863f24-1603-49e2-835c-31ced01d9f7f\") " pod="openstack/ovn-controller-metrics-wnm29" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.819357 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/49863f24-1603-49e2-835c-31ced01d9f7f-ovn-rundir\") pod \"ovn-controller-metrics-wnm29\" (UID: \"49863f24-1603-49e2-835c-31ced01d9f7f\") " pod="openstack/ovn-controller-metrics-wnm29" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.819377 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0f43cc8-5d12-4705-9ef8-06e5e03f7147-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-s4c94\" (UID: \"b0f43cc8-5d12-4705-9ef8-06e5e03f7147\") " pod="openstack/dnsmasq-dns-8554648995-s4c94" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.819403 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0f43cc8-5d12-4705-9ef8-06e5e03f7147-config\") pod \"dnsmasq-dns-8554648995-s4c94\" (UID: \"b0f43cc8-5d12-4705-9ef8-06e5e03f7147\") " pod="openstack/dnsmasq-dns-8554648995-s4c94" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.819436 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11a38a9-30c1-44d2-81ca-965f0dfbde96-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d11a38a9-30c1-44d2-81ca-965f0dfbde96\") " pod="openstack/ovn-northd-0" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.819468 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/49863f24-1603-49e2-835c-31ced01d9f7f-config\") pod \"ovn-controller-metrics-wnm29\" (UID: \"49863f24-1603-49e2-835c-31ced01d9f7f\") " pod="openstack/ovn-controller-metrics-wnm29" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.819497 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzlfn\" (UniqueName: \"kubernetes.io/projected/b0f43cc8-5d12-4705-9ef8-06e5e03f7147-kube-api-access-lzlfn\") pod \"dnsmasq-dns-8554648995-s4c94\" (UID: \"b0f43cc8-5d12-4705-9ef8-06e5e03f7147\") " pod="openstack/dnsmasq-dns-8554648995-s4c94" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.819526 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4twg\" (UniqueName: \"kubernetes.io/projected/d11a38a9-30c1-44d2-81ca-965f0dfbde96-kube-api-access-b4twg\") pod \"ovn-northd-0\" (UID: \"d11a38a9-30c1-44d2-81ca-965f0dfbde96\") " pod="openstack/ovn-northd-0" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.819545 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d11a38a9-30c1-44d2-81ca-965f0dfbde96-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d11a38a9-30c1-44d2-81ca-965f0dfbde96\") " pod="openstack/ovn-northd-0" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.819565 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d11a38a9-30c1-44d2-81ca-965f0dfbde96-scripts\") pod \"ovn-northd-0\" (UID: \"d11a38a9-30c1-44d2-81ca-965f0dfbde96\") " pod="openstack/ovn-northd-0" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.819599 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d11a38a9-30c1-44d2-81ca-965f0dfbde96-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d11a38a9-30c1-44d2-81ca-965f0dfbde96\") " pod="openstack/ovn-northd-0" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.819619 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d11a38a9-30c1-44d2-81ca-965f0dfbde96-config\") pod \"ovn-northd-0\" (UID: \"d11a38a9-30c1-44d2-81ca-965f0dfbde96\") " pod="openstack/ovn-northd-0" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.819640 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49863f24-1603-49e2-835c-31ced01d9f7f-combined-ca-bundle\") pod \"ovn-controller-metrics-wnm29\" (UID: \"49863f24-1603-49e2-835c-31ced01d9f7f\") " pod="openstack/ovn-controller-metrics-wnm29" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.819662 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d11a38a9-30c1-44d2-81ca-965f0dfbde96-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d11a38a9-30c1-44d2-81ca-965f0dfbde96\") " pod="openstack/ovn-northd-0" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.820657 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49863f24-1603-49e2-835c-31ced01d9f7f-config\") pod \"ovn-controller-metrics-wnm29\" (UID: 
\"49863f24-1603-49e2-835c-31ced01d9f7f\") " pod="openstack/ovn-controller-metrics-wnm29" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.821434 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/49863f24-1603-49e2-835c-31ced01d9f7f-ovs-rundir\") pod \"ovn-controller-metrics-wnm29\" (UID: \"49863f24-1603-49e2-835c-31ced01d9f7f\") " pod="openstack/ovn-controller-metrics-wnm29" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.821793 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/49863f24-1603-49e2-835c-31ced01d9f7f-ovn-rundir\") pod \"ovn-controller-metrics-wnm29\" (UID: \"49863f24-1603-49e2-835c-31ced01d9f7f\") " pod="openstack/ovn-controller-metrics-wnm29" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.826917 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/49863f24-1603-49e2-835c-31ced01d9f7f-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-wnm29\" (UID: \"49863f24-1603-49e2-835c-31ced01d9f7f\") " pod="openstack/ovn-controller-metrics-wnm29" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.828559 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49863f24-1603-49e2-835c-31ced01d9f7f-combined-ca-bundle\") pod \"ovn-controller-metrics-wnm29\" (UID: \"49863f24-1603-49e2-835c-31ced01d9f7f\") " pod="openstack/ovn-controller-metrics-wnm29" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.837910 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wbvr\" (UniqueName: \"kubernetes.io/projected/49863f24-1603-49e2-835c-31ced01d9f7f-kube-api-access-7wbvr\") pod \"ovn-controller-metrics-wnm29\" (UID: \"49863f24-1603-49e2-835c-31ced01d9f7f\") " pod="openstack/ovn-controller-metrics-wnm29" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.884936 4954 util.go:30] "No sandbox for pod can be found. 
Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.921512 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0f43cc8-5d12-4705-9ef8-06e5e03f7147-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-s4c94\" (UID: \"b0f43cc8-5d12-4705-9ef8-06e5e03f7147\") " pod="openstack/dnsmasq-dns-8554648995-s4c94"
Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.921679 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0f43cc8-5d12-4705-9ef8-06e5e03f7147-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-s4c94\" (UID: \"b0f43cc8-5d12-4705-9ef8-06e5e03f7147\") " pod="openstack/dnsmasq-dns-8554648995-s4c94"
Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.921746 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0f43cc8-5d12-4705-9ef8-06e5e03f7147-config\") pod \"dnsmasq-dns-8554648995-s4c94\" (UID: \"b0f43cc8-5d12-4705-9ef8-06e5e03f7147\") " pod="openstack/dnsmasq-dns-8554648995-s4c94"
Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.921815 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11a38a9-30c1-44d2-81ca-965f0dfbde96-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d11a38a9-30c1-44d2-81ca-965f0dfbde96\") " pod="openstack/ovn-northd-0"
Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.921879 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzlfn\" (UniqueName: \"kubernetes.io/projected/b0f43cc8-5d12-4705-9ef8-06e5e03f7147-kube-api-access-lzlfn\") pod \"dnsmasq-dns-8554648995-s4c94\" (UID: \"b0f43cc8-5d12-4705-9ef8-06e5e03f7147\") " pod="openstack/dnsmasq-dns-8554648995-s4c94"
Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.921920 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4twg\" (UniqueName: \"kubernetes.io/projected/d11a38a9-30c1-44d2-81ca-965f0dfbde96-kube-api-access-b4twg\") pod \"ovn-northd-0\" (UID: \"d11a38a9-30c1-44d2-81ca-965f0dfbde96\") " pod="openstack/ovn-northd-0"
Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.921952 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d11a38a9-30c1-44d2-81ca-965f0dfbde96-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d11a38a9-30c1-44d2-81ca-965f0dfbde96\") " pod="openstack/ovn-northd-0"
Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.921993 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d11a38a9-30c1-44d2-81ca-965f0dfbde96-scripts\") pod \"ovn-northd-0\" (UID: \"d11a38a9-30c1-44d2-81ca-965f0dfbde96\") " pod="openstack/ovn-northd-0"
Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.922033 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d11a38a9-30c1-44d2-81ca-965f0dfbde96-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d11a38a9-30c1-44d2-81ca-965f0dfbde96\") " pod="openstack/ovn-northd-0"
Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.922066 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d11a38a9-30c1-44d2-81ca-965f0dfbde96-config\") pod \"ovn-northd-0\" (UID: \"d11a38a9-30c1-44d2-81ca-965f0dfbde96\") " pod="openstack/ovn-northd-0"
volume \"config\" (UniqueName: \"kubernetes.io/configmap/d11a38a9-30c1-44d2-81ca-965f0dfbde96-config\") pod \"ovn-northd-0\" (UID: \"d11a38a9-30c1-44d2-81ca-965f0dfbde96\") " pod="openstack/ovn-northd-0" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.922117 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d11a38a9-30c1-44d2-81ca-965f0dfbde96-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d11a38a9-30c1-44d2-81ca-965f0dfbde96\") " pod="openstack/ovn-northd-0" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.922164 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0f43cc8-5d12-4705-9ef8-06e5e03f7147-dns-svc\") pod \"dnsmasq-dns-8554648995-s4c94\" (UID: \"b0f43cc8-5d12-4705-9ef8-06e5e03f7147\") " pod="openstack/dnsmasq-dns-8554648995-s4c94" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.923449 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0f43cc8-5d12-4705-9ef8-06e5e03f7147-dns-svc\") pod \"dnsmasq-dns-8554648995-s4c94\" (UID: \"b0f43cc8-5d12-4705-9ef8-06e5e03f7147\") " pod="openstack/dnsmasq-dns-8554648995-s4c94" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.924600 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0f43cc8-5d12-4705-9ef8-06e5e03f7147-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-s4c94\" (UID: \"b0f43cc8-5d12-4705-9ef8-06e5e03f7147\") " pod="openstack/dnsmasq-dns-8554648995-s4c94" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.925388 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0f43cc8-5d12-4705-9ef8-06e5e03f7147-config\") pod \"dnsmasq-dns-8554648995-s4c94\" (UID: \"b0f43cc8-5d12-4705-9ef8-06e5e03f7147\") " pod="openstack/dnsmasq-dns-8554648995-s4c94" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.925738 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d11a38a9-30c1-44d2-81ca-965f0dfbde96-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d11a38a9-30c1-44d2-81ca-965f0dfbde96\") " pod="openstack/ovn-northd-0" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.925762 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0f43cc8-5d12-4705-9ef8-06e5e03f7147-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-s4c94\" (UID: \"b0f43cc8-5d12-4705-9ef8-06e5e03f7147\") " pod="openstack/dnsmasq-dns-8554648995-s4c94" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.926303 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d11a38a9-30c1-44d2-81ca-965f0dfbde96-scripts\") pod \"ovn-northd-0\" (UID: \"d11a38a9-30c1-44d2-81ca-965f0dfbde96\") " pod="openstack/ovn-northd-0" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.929163 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d11a38a9-30c1-44d2-81ca-965f0dfbde96-config\") pod \"ovn-northd-0\" (UID: \"d11a38a9-30c1-44d2-81ca-965f0dfbde96\") " pod="openstack/ovn-northd-0" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.931669 4954 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11a38a9-30c1-44d2-81ca-965f0dfbde96-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d11a38a9-30c1-44d2-81ca-965f0dfbde96\") " pod="openstack/ovn-northd-0" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.933796 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d11a38a9-30c1-44d2-81ca-965f0dfbde96-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d11a38a9-30c1-44d2-81ca-965f0dfbde96\") " pod="openstack/ovn-northd-0" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.935797 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d11a38a9-30c1-44d2-81ca-965f0dfbde96-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d11a38a9-30c1-44d2-81ca-965f0dfbde96\") " pod="openstack/ovn-northd-0" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.943367 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzlfn\" (UniqueName: \"kubernetes.io/projected/b0f43cc8-5d12-4705-9ef8-06e5e03f7147-kube-api-access-lzlfn\") pod \"dnsmasq-dns-8554648995-s4c94\" (UID: \"b0f43cc8-5d12-4705-9ef8-06e5e03f7147\") " pod="openstack/dnsmasq-dns-8554648995-s4c94" Nov 27 16:57:56 crc kubenswrapper[4954]: I1127 16:57:56.954932 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4twg\" (UniqueName: \"kubernetes.io/projected/d11a38a9-30c1-44d2-81ca-965f0dfbde96-kube-api-access-b4twg\") pod \"ovn-northd-0\" (UID: \"d11a38a9-30c1-44d2-81ca-965f0dfbde96\") " pod="openstack/ovn-northd-0" Nov 27 16:57:57 crc kubenswrapper[4954]: I1127 16:57:57.040335 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Nov 27 16:57:57 crc kubenswrapper[4954]: I1127 16:57:57.047218 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-s4c94" Nov 27 16:57:57 crc kubenswrapper[4954]: I1127 16:57:57.145994 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-bgvc6" Nov 27 16:57:57 crc kubenswrapper[4954]: I1127 16:57:57.163097 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-bgvc6" Nov 27 16:57:57 crc kubenswrapper[4954]: I1127 16:57:57.328856 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb30983b-ac35-4d76-b97e-2aaa70860a13-config\") pod \"eb30983b-ac35-4d76-b97e-2aaa70860a13\" (UID: \"eb30983b-ac35-4d76-b97e-2aaa70860a13\") " Nov 27 16:57:57 crc kubenswrapper[4954]: I1127 16:57:57.328930 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb30983b-ac35-4d76-b97e-2aaa70860a13-ovsdbserver-nb\") pod \"eb30983b-ac35-4d76-b97e-2aaa70860a13\" (UID: \"eb30983b-ac35-4d76-b97e-2aaa70860a13\") " Nov 27 16:57:57 crc kubenswrapper[4954]: I1127 16:57:57.329016 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v62hf\" (UniqueName: \"kubernetes.io/projected/eb30983b-ac35-4d76-b97e-2aaa70860a13-kube-api-access-v62hf\") pod \"eb30983b-ac35-4d76-b97e-2aaa70860a13\" (UID: \"eb30983b-ac35-4d76-b97e-2aaa70860a13\") " Nov 27 16:57:57 crc kubenswrapper[4954]: I1127 16:57:57.329067 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb30983b-ac35-4d76-b97e-2aaa70860a13-dns-svc\") pod \"eb30983b-ac35-4d76-b97e-2aaa70860a13\" (UID: \"eb30983b-ac35-4d76-b97e-2aaa70860a13\") " Nov 27 16:57:57 crc kubenswrapper[4954]: I1127 16:57:57.329569 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb30983b-ac35-4d76-b97e-2aaa70860a13-config" (OuterVolumeSpecName: "config") pod "eb30983b-ac35-4d76-b97e-2aaa70860a13" (UID: "eb30983b-ac35-4d76-b97e-2aaa70860a13"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:57:57 crc kubenswrapper[4954]: I1127 16:57:57.329874 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb30983b-ac35-4d76-b97e-2aaa70860a13-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "eb30983b-ac35-4d76-b97e-2aaa70860a13" (UID: "eb30983b-ac35-4d76-b97e-2aaa70860a13"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:57:57 crc kubenswrapper[4954]: I1127 16:57:57.330624 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb30983b-ac35-4d76-b97e-2aaa70860a13-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "eb30983b-ac35-4d76-b97e-2aaa70860a13" (UID: "eb30983b-ac35-4d76-b97e-2aaa70860a13"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:57:57 crc kubenswrapper[4954]: I1127 16:57:57.335543 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb30983b-ac35-4d76-b97e-2aaa70860a13-kube-api-access-v62hf" (OuterVolumeSpecName: "kube-api-access-v62hf") pod "eb30983b-ac35-4d76-b97e-2aaa70860a13" (UID: "eb30983b-ac35-4d76-b97e-2aaa70860a13"). InnerVolumeSpecName "kube-api-access-v62hf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:57:57 crc kubenswrapper[4954]: I1127 16:57:57.408505 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-wnm29"] Nov 27 16:57:57 crc kubenswrapper[4954]: W1127 16:57:57.414929 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49863f24_1603_49e2_835c_31ced01d9f7f.slice/crio-ce6f6cc21a410f53414a4c553468474c84fe7633032b7800c2eae0f0e458ac8e WatchSource:0}: Error finding container ce6f6cc21a410f53414a4c553468474c84fe7633032b7800c2eae0f0e458ac8e: Status 404 returned error can't find the container with id ce6f6cc21a410f53414a4c553468474c84fe7633032b7800c2eae0f0e458ac8e Nov 27 16:57:57 crc kubenswrapper[4954]: I1127 16:57:57.430755 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb30983b-ac35-4d76-b97e-2aaa70860a13-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:57:57 crc kubenswrapper[4954]: I1127 16:57:57.430782 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb30983b-ac35-4d76-b97e-2aaa70860a13-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 27 16:57:57 crc kubenswrapper[4954]: I1127 16:57:57.430791 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v62hf\" (UniqueName: \"kubernetes.io/projected/eb30983b-ac35-4d76-b97e-2aaa70860a13-kube-api-access-v62hf\") on node \"crc\" DevicePath \"\"" Nov 27 16:57:57 crc kubenswrapper[4954]: I1127 16:57:57.430870 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb30983b-ac35-4d76-b97e-2aaa70860a13-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 27 16:57:57 crc kubenswrapper[4954]: I1127 16:57:57.562670 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-s4c94"] Nov 27 16:57:57 crc kubenswrapper[4954]: I1127 16:57:57.571308 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Nov 27 16:57:57 crc kubenswrapper[4954]: I1127 16:57:57.876595 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Nov 27 16:57:57 crc kubenswrapper[4954]: I1127 16:57:57.943622 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-s4c94"] Nov 27 16:57:57 crc kubenswrapper[4954]: I1127 16:57:57.987975 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-g7xsr"] Nov 27 16:57:57 crc kubenswrapper[4954]: I1127 16:57:57.989186 4954 util.go:30] "No sandbox for pod can be found. 
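
[Editor's note] Two things happen in this stretch as the old dnsmasq-dns-5bf47b49b7 pod (eb30983b-...) is torn down. First, the reconciler runs the mount flow in reverse: "UnmountVolume started", then "UnmountVolume.TearDown succeeded", then "Volume detached ... DevicePath \"\"" once the volume leaves the actual state of the world. Second, cAdvisor's watch (manager.go:1169) 404s on container ce6f6cc2... because the cgroup appeared before CRI-O finished registering the container; that warning is transient and benign. The unmount side, as a sketch symmetric to the mount loop shown earlier (illustrative shapes only):

    package sketch

    // unmountOrphaned tears down volumes still present in the actual world but no longer
    // desired: "UnmountVolume started" -> "TearDown succeeded" -> "Volume detached".
    func unmountOrphaned(desired, actual map[string]bool, tearDown func(string) error) {
        for vol := range actual {
            if desired[vol] {
                continue
            }
            if err := tearDown(vol); err == nil {
                delete(actual, vol) // reported as: Volume detached ... DevicePath ""
            }
        }
    }
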
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-g7xsr" Nov 27 16:57:58 crc kubenswrapper[4954]: I1127 16:57:58.013085 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-g7xsr"] Nov 27 16:57:58 crc kubenswrapper[4954]: I1127 16:57:58.147531 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/957f20f3-9f5d-4342-a3db-9c5b726bdb5d-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-g7xsr\" (UID: \"957f20f3-9f5d-4342-a3db-9c5b726bdb5d\") " pod="openstack/dnsmasq-dns-b8fbc5445-g7xsr" Nov 27 16:57:58 crc kubenswrapper[4954]: I1127 16:57:58.147658 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/957f20f3-9f5d-4342-a3db-9c5b726bdb5d-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-g7xsr\" (UID: \"957f20f3-9f5d-4342-a3db-9c5b726bdb5d\") " pod="openstack/dnsmasq-dns-b8fbc5445-g7xsr" Nov 27 16:57:58 crc kubenswrapper[4954]: I1127 16:57:58.147722 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/957f20f3-9f5d-4342-a3db-9c5b726bdb5d-config\") pod \"dnsmasq-dns-b8fbc5445-g7xsr\" (UID: \"957f20f3-9f5d-4342-a3db-9c5b726bdb5d\") " pod="openstack/dnsmasq-dns-b8fbc5445-g7xsr" Nov 27 16:57:58 crc kubenswrapper[4954]: I1127 16:57:58.147817 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl2nt\" (UniqueName: \"kubernetes.io/projected/957f20f3-9f5d-4342-a3db-9c5b726bdb5d-kube-api-access-tl2nt\") pod \"dnsmasq-dns-b8fbc5445-g7xsr\" (UID: \"957f20f3-9f5d-4342-a3db-9c5b726bdb5d\") " pod="openstack/dnsmasq-dns-b8fbc5445-g7xsr" Nov 27 16:57:58 crc kubenswrapper[4954]: I1127 16:57:58.147865 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/957f20f3-9f5d-4342-a3db-9c5b726bdb5d-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-g7xsr\" (UID: \"957f20f3-9f5d-4342-a3db-9c5b726bdb5d\") " pod="openstack/dnsmasq-dns-b8fbc5445-g7xsr" Nov 27 16:57:58 crc kubenswrapper[4954]: I1127 16:57:58.158472 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-wnm29" event={"ID":"49863f24-1603-49e2-835c-31ced01d9f7f","Type":"ContainerStarted","Data":"3e85eada3882e23589a32f804addb593dab6172a2e2f2876155d390ae658d6ff"} Nov 27 16:57:58 crc kubenswrapper[4954]: I1127 16:57:58.158548 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-wnm29" event={"ID":"49863f24-1603-49e2-835c-31ced01d9f7f","Type":"ContainerStarted","Data":"ce6f6cc21a410f53414a4c553468474c84fe7633032b7800c2eae0f0e458ac8e"} Nov 27 16:57:58 crc kubenswrapper[4954]: I1127 16:57:58.161274 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d11a38a9-30c1-44d2-81ca-965f0dfbde96","Type":"ContainerStarted","Data":"dc818cbb4eea2302b3c33443fa8c4f0cb731c9f7d23d43f15cc3921ea430a45a"} Nov 27 16:57:58 crc kubenswrapper[4954]: I1127 16:57:58.165310 4954 generic.go:334] "Generic (PLEG): container finished" podID="b0f43cc8-5d12-4705-9ef8-06e5e03f7147" containerID="b593431ae1b0897eb8dae35a0c727ad7d7f28b11b73da1a89b0bc6dd8a18fb1c" exitCode=0 Nov 27 16:57:58 crc kubenswrapper[4954]: I1127 16:57:58.165421 4954 util.go:30] "No sandbox for pod can be found. 
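
[Editor's note] The "SyncLoop (PLEG)" lines come from the pod lifecycle event generator: it periodically relists containers from CRI-O and turns state transitions into events, here ContainerStarted for the two ovn-controller-metrics IDs and, via generic.go:334 "container finished" with exitCode=0, a ContainerDied for the dnsmasq init container b593431a.... The relist diff amounts to the following; this is a hedged sketch, the real generic PLEG also tracks sandboxes and more states:

    package sketch

    type state int

    const (
        running state = iota
        exited
    )

    // relistEvents compares two container snapshots and emits the PLEG events seen here.
    func relistEvents(old, cur map[string]state) []string {
        var events []string
        for id, s := range cur {
            prev, seen := old[id]
            switch {
            case !seen && s == running:
                events = append(events, "ContainerStarted "+id)
            case seen && prev == running && s == exited:
                events = append(events, "ContainerDied "+id)
            }
        }
        return events
    }
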
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-bgvc6" Nov 27 16:57:58 crc kubenswrapper[4954]: I1127 16:57:58.165629 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-s4c94" event={"ID":"b0f43cc8-5d12-4705-9ef8-06e5e03f7147","Type":"ContainerDied","Data":"b593431ae1b0897eb8dae35a0c727ad7d7f28b11b73da1a89b0bc6dd8a18fb1c"} Nov 27 16:57:58 crc kubenswrapper[4954]: I1127 16:57:58.165706 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-s4c94" event={"ID":"b0f43cc8-5d12-4705-9ef8-06e5e03f7147","Type":"ContainerStarted","Data":"db9395ed8b5fc1fa78a31d3cc4fca09eb81aca4384389e2a60acb473a5b22e14"} Nov 27 16:57:58 crc kubenswrapper[4954]: I1127 16:57:58.198894 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-wnm29" podStartSLOduration=2.198870058 podStartE2EDuration="2.198870058s" podCreationTimestamp="2025-11-27 16:57:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:57:58.178912433 +0000 UTC m=+1190.196352743" watchObservedRunningTime="2025-11-27 16:57:58.198870058 +0000 UTC m=+1190.216310348" Nov 27 16:57:58 crc kubenswrapper[4954]: I1127 16:57:58.250067 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl2nt\" (UniqueName: \"kubernetes.io/projected/957f20f3-9f5d-4342-a3db-9c5b726bdb5d-kube-api-access-tl2nt\") pod \"dnsmasq-dns-b8fbc5445-g7xsr\" (UID: \"957f20f3-9f5d-4342-a3db-9c5b726bdb5d\") " pod="openstack/dnsmasq-dns-b8fbc5445-g7xsr" Nov 27 16:57:58 crc kubenswrapper[4954]: I1127 16:57:58.250197 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/957f20f3-9f5d-4342-a3db-9c5b726bdb5d-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-g7xsr\" (UID: \"957f20f3-9f5d-4342-a3db-9c5b726bdb5d\") " pod="openstack/dnsmasq-dns-b8fbc5445-g7xsr" Nov 27 16:57:58 crc kubenswrapper[4954]: I1127 16:57:58.250241 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/957f20f3-9f5d-4342-a3db-9c5b726bdb5d-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-g7xsr\" (UID: \"957f20f3-9f5d-4342-a3db-9c5b726bdb5d\") " pod="openstack/dnsmasq-dns-b8fbc5445-g7xsr" Nov 27 16:57:58 crc kubenswrapper[4954]: I1127 16:57:58.250338 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/957f20f3-9f5d-4342-a3db-9c5b726bdb5d-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-g7xsr\" (UID: \"957f20f3-9f5d-4342-a3db-9c5b726bdb5d\") " pod="openstack/dnsmasq-dns-b8fbc5445-g7xsr" Nov 27 16:57:58 crc kubenswrapper[4954]: I1127 16:57:58.250444 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/957f20f3-9f5d-4342-a3db-9c5b726bdb5d-config\") pod \"dnsmasq-dns-b8fbc5445-g7xsr\" (UID: \"957f20f3-9f5d-4342-a3db-9c5b726bdb5d\") " pod="openstack/dnsmasq-dns-b8fbc5445-g7xsr" Nov 27 16:57:58 crc kubenswrapper[4954]: I1127 16:57:58.253566 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/957f20f3-9f5d-4342-a3db-9c5b726bdb5d-config\") pod \"dnsmasq-dns-b8fbc5445-g7xsr\" (UID: \"957f20f3-9f5d-4342-a3db-9c5b726bdb5d\") " pod="openstack/dnsmasq-dns-b8fbc5445-g7xsr" Nov 
27 16:57:58 crc kubenswrapper[4954]: I1127 16:57:58.254402 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/957f20f3-9f5d-4342-a3db-9c5b726bdb5d-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-g7xsr\" (UID: \"957f20f3-9f5d-4342-a3db-9c5b726bdb5d\") " pod="openstack/dnsmasq-dns-b8fbc5445-g7xsr" Nov 27 16:57:58 crc kubenswrapper[4954]: I1127 16:57:58.254828 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/957f20f3-9f5d-4342-a3db-9c5b726bdb5d-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-g7xsr\" (UID: \"957f20f3-9f5d-4342-a3db-9c5b726bdb5d\") " pod="openstack/dnsmasq-dns-b8fbc5445-g7xsr" Nov 27 16:57:58 crc kubenswrapper[4954]: I1127 16:57:58.256275 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/957f20f3-9f5d-4342-a3db-9c5b726bdb5d-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-g7xsr\" (UID: \"957f20f3-9f5d-4342-a3db-9c5b726bdb5d\") " pod="openstack/dnsmasq-dns-b8fbc5445-g7xsr" Nov 27 16:57:58 crc kubenswrapper[4954]: I1127 16:57:58.264939 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-bgvc6"] Nov 27 16:57:58 crc kubenswrapper[4954]: I1127 16:57:58.275471 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl2nt\" (UniqueName: \"kubernetes.io/projected/957f20f3-9f5d-4342-a3db-9c5b726bdb5d-kube-api-access-tl2nt\") pod \"dnsmasq-dns-b8fbc5445-g7xsr\" (UID: \"957f20f3-9f5d-4342-a3db-9c5b726bdb5d\") " pod="openstack/dnsmasq-dns-b8fbc5445-g7xsr" Nov 27 16:57:58 crc kubenswrapper[4954]: I1127 16:57:58.280398 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-bgvc6"] Nov 27 16:57:58 crc kubenswrapper[4954]: I1127 16:57:58.316047 4954 util.go:30] "No sandbox for pod can be found. 
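
[Editor's note] Around the mounts, the sync loop lines trace the dnsmasq rollout churn: each config change produces a new ReplicaSet hash, so dnsmasq-dns-5bf47b49b7-bgvc6 is DELETEd and then REMOVEd above while its replacement b8fbc5445-g7xsr was ADDed and UPDATEd earlier. The four verbs are the kubelet pod-update operations; the summaries below are hedged descriptions of their usual semantics, not kubelet code:

    package sketch

    // handlePodUpdate sketches how the sync loop dispatches the SyncLoop verbs in this log.
    func handlePodUpdate(op, pod string) string {
        switch op {
        case "ADD":
            return "pod newly bound to this node; admit and start syncing " + pod
        case "UPDATE":
            return "pod object changed (spec, labels, status) for " + pod
        case "DELETE":
            return "graceful deletion requested (deletionTimestamp set); kill containers of " + pod
        case "REMOVE":
            return "pod gone from the API source; clean up remaining state for " + pod
        default:
            return "RECONCILE/other: no action sketched here"
        }
    }
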
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-g7xsr"
Nov 27 16:57:58 crc kubenswrapper[4954]: E1127 16:57:58.631789 4954 log.go:32] "CreateContainer in sandbox from runtime service failed" err=<
Nov 27 16:57:58 crc kubenswrapper[4954]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/b0f43cc8-5d12-4705-9ef8-06e5e03f7147/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory
Nov 27 16:57:58 crc kubenswrapper[4954]: > podSandboxID="db9395ed8b5fc1fa78a31d3cc4fca09eb81aca4384389e2a60acb473a5b22e14"
Nov 27 16:57:58 crc kubenswrapper[4954]: E1127 16:57:58.632450 4954 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Nov 27 16:57:58 crc kubenswrapper[4954]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n654h99h64ch5dbh6dh555h587h64bh5cfh647h5fdh57ch679h9h597h5f5hbch59bh54fh575h566h667h586h5f5h65ch5bch57h68h65ch58bh694h5cfq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lzlfn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-8554648995-s4c94_openstack(b0f43cc8-5d12-4705-9ef8-06e5e03f7147): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/b0f43cc8-5d12-4705-9ef8-06e5e03f7147/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory
Nov 27 16:57:58 crc kubenswrapper[4954]: > logger="UnhandledError"
Nov 27 16:57:58 crc kubenswrapper[4954]: E1127 16:57:58.633639 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/b0f43cc8-5d12-4705-9ef8-06e5e03f7147/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-8554648995-s4c94" podUID="b0f43cc8-5d12-4705-9ef8-06e5e03f7147"
Nov 27 16:57:58 crc kubenswrapper[4954]: I1127 16:57:58.683136 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb30983b-ac35-4d76-b97e-2aaa70860a13" path="/var/lib/kubelet/pods/eb30983b-ac35-4d76-b97e-2aaa70860a13/volumes"
Nov 27 16:57:58 crc kubenswrapper[4954]: I1127 16:57:58.962893 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-g7xsr"]
Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.026834 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"]
Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.033220 4954 util.go:30] "No sandbox for pod can be found.
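
[Editor's note] The CreateContainerError above is the interesting failure in this window. dnsmasq-dns-8554648995-s4c94 mounts its hosts files with subPath, so kubelet materializes each subPath as a bind-mount source under the pod's volume-subpaths directory and CRI-O bind-mounts that source into the container. The pod was marked for deletion at 16:57:57.943622, between sandbox creation and container start, so by the time CRI-O resolved the source it was already gone, hence "No such file or directory"; pod_workers.go then skips the pod, and teardown follows below. This reading is inferred from the timestamps, not stated by the log. The path CRI-O tried follows this layout (hypothetical helper; the path scheme matches the error, the function is not kubelet's API):

    package sketch

    import (
        "path/filepath"
        "strconv"
    )

    // subPathSource rebuilds the bind-mount source kubelet prepares for a subPath mount,
    // matching the failing path .../volume-subpaths/dns-svc/dnsmasq-dns/1. The trailing
    // index is the mount's position in the container's VolumeMounts list (dns-svc is 1,
    // after config at 0, in the &Container{...} dump above).
    func subPathSource(podUID, volume, container string, mountIndex int) string {
        return filepath.Join("/var/lib/kubelet/pods", podUID,
            "volume-subpaths", volume, container, strconv.Itoa(mountIndex))
    }
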
Need to start a new one" pod="openstack/swift-storage-0" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.037842 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-j5hhv" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.038055 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.038181 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.049363 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.054190 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.172544 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/cff965fb-87ef-40a5-9dff-7d10d74cc09c-lock\") pod \"swift-storage-0\" (UID: \"cff965fb-87ef-40a5-9dff-7d10d74cc09c\") " pod="openstack/swift-storage-0" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.173019 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxvcj\" (UniqueName: \"kubernetes.io/projected/cff965fb-87ef-40a5-9dff-7d10d74cc09c-kube-api-access-xxvcj\") pod \"swift-storage-0\" (UID: \"cff965fb-87ef-40a5-9dff-7d10d74cc09c\") " pod="openstack/swift-storage-0" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.173040 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/cff965fb-87ef-40a5-9dff-7d10d74cc09c-cache\") pod \"swift-storage-0\" (UID: \"cff965fb-87ef-40a5-9dff-7d10d74cc09c\") " pod="openstack/swift-storage-0" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.173096 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cff965fb-87ef-40a5-9dff-7d10d74cc09c-etc-swift\") pod \"swift-storage-0\" (UID: \"cff965fb-87ef-40a5-9dff-7d10d74cc09c\") " pod="openstack/swift-storage-0" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.173141 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"cff965fb-87ef-40a5-9dff-7d10d74cc09c\") " pod="openstack/swift-storage-0" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.203752 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-g7xsr" event={"ID":"957f20f3-9f5d-4342-a3db-9c5b726bdb5d","Type":"ContainerStarted","Data":"52fe4fa05ef4e55560b49f1b85fda37697c46f33c8e69950292481e2c87b82a2"} Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.274293 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cff965fb-87ef-40a5-9dff-7d10d74cc09c-etc-swift\") pod \"swift-storage-0\" (UID: \"cff965fb-87ef-40a5-9dff-7d10d74cc09c\") " pod="openstack/swift-storage-0" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.274399 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"cff965fb-87ef-40a5-9dff-7d10d74cc09c\") " pod="openstack/swift-storage-0" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.274432 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/cff965fb-87ef-40a5-9dff-7d10d74cc09c-lock\") pod \"swift-storage-0\" (UID: \"cff965fb-87ef-40a5-9dff-7d10d74cc09c\") " pod="openstack/swift-storage-0" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.274472 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxvcj\" (UniqueName: \"kubernetes.io/projected/cff965fb-87ef-40a5-9dff-7d10d74cc09c-kube-api-access-xxvcj\") pod \"swift-storage-0\" (UID: \"cff965fb-87ef-40a5-9dff-7d10d74cc09c\") " pod="openstack/swift-storage-0" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.274489 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/cff965fb-87ef-40a5-9dff-7d10d74cc09c-cache\") pod \"swift-storage-0\" (UID: \"cff965fb-87ef-40a5-9dff-7d10d74cc09c\") " pod="openstack/swift-storage-0" Nov 27 16:57:59 crc kubenswrapper[4954]: E1127 16:57:59.274975 4954 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 27 16:57:59 crc kubenswrapper[4954]: E1127 16:57:59.275003 4954 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 27 16:57:59 crc kubenswrapper[4954]: E1127 16:57:59.275072 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cff965fb-87ef-40a5-9dff-7d10d74cc09c-etc-swift podName:cff965fb-87ef-40a5-9dff-7d10d74cc09c nodeName:}" failed. No retries permitted until 2025-11-27 16:57:59.775049581 +0000 UTC m=+1191.792489881 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/cff965fb-87ef-40a5-9dff-7d10d74cc09c-etc-swift") pod "swift-storage-0" (UID: "cff965fb-87ef-40a5-9dff-7d10d74cc09c") : configmap "swift-ring-files" not found Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.275980 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/cff965fb-87ef-40a5-9dff-7d10d74cc09c-lock\") pod \"swift-storage-0\" (UID: \"cff965fb-87ef-40a5-9dff-7d10d74cc09c\") " pod="openstack/swift-storage-0" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.276094 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/cff965fb-87ef-40a5-9dff-7d10d74cc09c-cache\") pod \"swift-storage-0\" (UID: \"cff965fb-87ef-40a5-9dff-7d10d74cc09c\") " pod="openstack/swift-storage-0" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.276415 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"cff965fb-87ef-40a5-9dff-7d10d74cc09c\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/swift-storage-0" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.298772 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxvcj\" (UniqueName: \"kubernetes.io/projected/cff965fb-87ef-40a5-9dff-7d10d74cc09c-kube-api-access-xxvcj\") pod \"swift-storage-0\" (UID: \"cff965fb-87ef-40a5-9dff-7d10d74cc09c\") " pod="openstack/swift-storage-0" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.310303 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"cff965fb-87ef-40a5-9dff-7d10d74cc09c\") " pod="openstack/swift-storage-0" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.481487 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-s4c94" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.552472 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-cbrqw"] Nov 27 16:57:59 crc kubenswrapper[4954]: E1127 16:57:59.556764 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0f43cc8-5d12-4705-9ef8-06e5e03f7147" containerName="init" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.556796 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0f43cc8-5d12-4705-9ef8-06e5e03f7147" containerName="init" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.561572 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0f43cc8-5d12-4705-9ef8-06e5e03f7147" containerName="init" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.562478 4954 util.go:30] "No sandbox for pod can be found. 
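
[Editor's note] Before admitting the new swift-ring-rebalance pod, the CPU and memory managers prune state they still hold for containers that no longer exist, here the old dnsmasq pod's "init" container (the RemoveStaleState and "Deleted CPUSet assignment" lines). Roughly, with assumed shapes rather than the real state_mem layout:

    package sketch

    // removeStaleState drops per-container resource assignments for pods that are gone,
    // as cpu_manager/memory_manager do for podUID b0f43cc8... containerName "init".
    func removeStaleState(assignments map[string]map[string]string, livePods map[string]bool) {
        for podUID, containers := range assignments {
            if livePods[podUID] {
                continue
            }
            for name := range containers {
                delete(containers, name) // "Deleted CPUSet assignment"
            }
            delete(assignments, podUID)
        }
    }
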
Need to start a new one" pod="openstack/swift-ring-rebalance-cbrqw" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.578606 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.579009 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.579410 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.581394 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0f43cc8-5d12-4705-9ef8-06e5e03f7147-ovsdbserver-nb\") pod \"b0f43cc8-5d12-4705-9ef8-06e5e03f7147\" (UID: \"b0f43cc8-5d12-4705-9ef8-06e5e03f7147\") " Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.581528 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0f43cc8-5d12-4705-9ef8-06e5e03f7147-ovsdbserver-sb\") pod \"b0f43cc8-5d12-4705-9ef8-06e5e03f7147\" (UID: \"b0f43cc8-5d12-4705-9ef8-06e5e03f7147\") " Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.581713 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0f43cc8-5d12-4705-9ef8-06e5e03f7147-config\") pod \"b0f43cc8-5d12-4705-9ef8-06e5e03f7147\" (UID: \"b0f43cc8-5d12-4705-9ef8-06e5e03f7147\") " Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.581783 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0f43cc8-5d12-4705-9ef8-06e5e03f7147-dns-svc\") pod \"b0f43cc8-5d12-4705-9ef8-06e5e03f7147\" (UID: \"b0f43cc8-5d12-4705-9ef8-06e5e03f7147\") " Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.581841 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzlfn\" (UniqueName: \"kubernetes.io/projected/b0f43cc8-5d12-4705-9ef8-06e5e03f7147-kube-api-access-lzlfn\") pod \"b0f43cc8-5d12-4705-9ef8-06e5e03f7147\" (UID: \"b0f43cc8-5d12-4705-9ef8-06e5e03f7147\") " Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.588126 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0f43cc8-5d12-4705-9ef8-06e5e03f7147-kube-api-access-lzlfn" (OuterVolumeSpecName: "kube-api-access-lzlfn") pod "b0f43cc8-5d12-4705-9ef8-06e5e03f7147" (UID: "b0f43cc8-5d12-4705-9ef8-06e5e03f7147"). InnerVolumeSpecName "kube-api-access-lzlfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.604093 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-cbrqw"] Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.631834 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0f43cc8-5d12-4705-9ef8-06e5e03f7147-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b0f43cc8-5d12-4705-9ef8-06e5e03f7147" (UID: "b0f43cc8-5d12-4705-9ef8-06e5e03f7147"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.642513 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0f43cc8-5d12-4705-9ef8-06e5e03f7147-config" (OuterVolumeSpecName: "config") pod "b0f43cc8-5d12-4705-9ef8-06e5e03f7147" (UID: "b0f43cc8-5d12-4705-9ef8-06e5e03f7147"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.648728 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0f43cc8-5d12-4705-9ef8-06e5e03f7147-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b0f43cc8-5d12-4705-9ef8-06e5e03f7147" (UID: "b0f43cc8-5d12-4705-9ef8-06e5e03f7147"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.650668 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0f43cc8-5d12-4705-9ef8-06e5e03f7147-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b0f43cc8-5d12-4705-9ef8-06e5e03f7147" (UID: "b0f43cc8-5d12-4705-9ef8-06e5e03f7147"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.684544 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzb5l\" (UniqueName: \"kubernetes.io/projected/794c6bdd-2ec7-458f-99ed-23383a740479-kube-api-access-dzb5l\") pod \"swift-ring-rebalance-cbrqw\" (UID: \"794c6bdd-2ec7-458f-99ed-23383a740479\") " pod="openstack/swift-ring-rebalance-cbrqw" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.684627 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/794c6bdd-2ec7-458f-99ed-23383a740479-swiftconf\") pod \"swift-ring-rebalance-cbrqw\" (UID: \"794c6bdd-2ec7-458f-99ed-23383a740479\") " pod="openstack/swift-ring-rebalance-cbrqw" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.684687 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/794c6bdd-2ec7-458f-99ed-23383a740479-etc-swift\") pod \"swift-ring-rebalance-cbrqw\" (UID: \"794c6bdd-2ec7-458f-99ed-23383a740479\") " pod="openstack/swift-ring-rebalance-cbrqw" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.684705 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/794c6bdd-2ec7-458f-99ed-23383a740479-scripts\") pod \"swift-ring-rebalance-cbrqw\" (UID: \"794c6bdd-2ec7-458f-99ed-23383a740479\") " pod="openstack/swift-ring-rebalance-cbrqw" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.684768 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/794c6bdd-2ec7-458f-99ed-23383a740479-ring-data-devices\") pod \"swift-ring-rebalance-cbrqw\" (UID: \"794c6bdd-2ec7-458f-99ed-23383a740479\") " pod="openstack/swift-ring-rebalance-cbrqw" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.684802 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/794c6bdd-2ec7-458f-99ed-23383a740479-combined-ca-bundle\") pod \"swift-ring-rebalance-cbrqw\" (UID: \"794c6bdd-2ec7-458f-99ed-23383a740479\") " pod="openstack/swift-ring-rebalance-cbrqw" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.684818 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/794c6bdd-2ec7-458f-99ed-23383a740479-dispersionconf\") pod \"swift-ring-rebalance-cbrqw\" (UID: \"794c6bdd-2ec7-458f-99ed-23383a740479\") " pod="openstack/swift-ring-rebalance-cbrqw" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.684872 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0f43cc8-5d12-4705-9ef8-06e5e03f7147-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.684883 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0f43cc8-5d12-4705-9ef8-06e5e03f7147-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.684892 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0f43cc8-5d12-4705-9ef8-06e5e03f7147-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.684900 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0f43cc8-5d12-4705-9ef8-06e5e03f7147-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.684909 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzlfn\" (UniqueName: \"kubernetes.io/projected/b0f43cc8-5d12-4705-9ef8-06e5e03f7147-kube-api-access-lzlfn\") on node \"crc\" DevicePath \"\"" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.786564 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cff965fb-87ef-40a5-9dff-7d10d74cc09c-etc-swift\") pod \"swift-storage-0\" (UID: \"cff965fb-87ef-40a5-9dff-7d10d74cc09c\") " pod="openstack/swift-storage-0" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.786642 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/794c6bdd-2ec7-458f-99ed-23383a740479-ring-data-devices\") pod \"swift-ring-rebalance-cbrqw\" (UID: \"794c6bdd-2ec7-458f-99ed-23383a740479\") " pod="openstack/swift-ring-rebalance-cbrqw" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.786687 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794c6bdd-2ec7-458f-99ed-23383a740479-combined-ca-bundle\") pod \"swift-ring-rebalance-cbrqw\" (UID: \"794c6bdd-2ec7-458f-99ed-23383a740479\") " pod="openstack/swift-ring-rebalance-cbrqw" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.786708 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/794c6bdd-2ec7-458f-99ed-23383a740479-dispersionconf\") pod \"swift-ring-rebalance-cbrqw\" (UID: \"794c6bdd-2ec7-458f-99ed-23383a740479\") " pod="openstack/swift-ring-rebalance-cbrqw" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.786738 4954 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzb5l\" (UniqueName: \"kubernetes.io/projected/794c6bdd-2ec7-458f-99ed-23383a740479-kube-api-access-dzb5l\") pod \"swift-ring-rebalance-cbrqw\" (UID: \"794c6bdd-2ec7-458f-99ed-23383a740479\") " pod="openstack/swift-ring-rebalance-cbrqw" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.786764 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/794c6bdd-2ec7-458f-99ed-23383a740479-swiftconf\") pod \"swift-ring-rebalance-cbrqw\" (UID: \"794c6bdd-2ec7-458f-99ed-23383a740479\") " pod="openstack/swift-ring-rebalance-cbrqw" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.786845 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/794c6bdd-2ec7-458f-99ed-23383a740479-etc-swift\") pod \"swift-ring-rebalance-cbrqw\" (UID: \"794c6bdd-2ec7-458f-99ed-23383a740479\") " pod="openstack/swift-ring-rebalance-cbrqw" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.786860 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/794c6bdd-2ec7-458f-99ed-23383a740479-scripts\") pod \"swift-ring-rebalance-cbrqw\" (UID: \"794c6bdd-2ec7-458f-99ed-23383a740479\") " pod="openstack/swift-ring-rebalance-cbrqw" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.787498 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/794c6bdd-2ec7-458f-99ed-23383a740479-scripts\") pod \"swift-ring-rebalance-cbrqw\" (UID: \"794c6bdd-2ec7-458f-99ed-23383a740479\") " pod="openstack/swift-ring-rebalance-cbrqw" Nov 27 16:57:59 crc kubenswrapper[4954]: E1127 16:57:59.788406 4954 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 27 16:57:59 crc kubenswrapper[4954]: E1127 16:57:59.788421 4954 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 27 16:57:59 crc kubenswrapper[4954]: E1127 16:57:59.788453 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cff965fb-87ef-40a5-9dff-7d10d74cc09c-etc-swift podName:cff965fb-87ef-40a5-9dff-7d10d74cc09c nodeName:}" failed. No retries permitted until 2025-11-27 16:58:00.788441532 +0000 UTC m=+1192.805881832 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/cff965fb-87ef-40a5-9dff-7d10d74cc09c-etc-swift") pod "swift-storage-0" (UID: "cff965fb-87ef-40a5-9dff-7d10d74cc09c") : configmap "swift-ring-files" not found Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.789152 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/794c6bdd-2ec7-458f-99ed-23383a740479-ring-data-devices\") pod \"swift-ring-rebalance-cbrqw\" (UID: \"794c6bdd-2ec7-458f-99ed-23383a740479\") " pod="openstack/swift-ring-rebalance-cbrqw" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.790622 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/794c6bdd-2ec7-458f-99ed-23383a740479-etc-swift\") pod \"swift-ring-rebalance-cbrqw\" (UID: \"794c6bdd-2ec7-458f-99ed-23383a740479\") " pod="openstack/swift-ring-rebalance-cbrqw" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.793089 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/794c6bdd-2ec7-458f-99ed-23383a740479-dispersionconf\") pod \"swift-ring-rebalance-cbrqw\" (UID: \"794c6bdd-2ec7-458f-99ed-23383a740479\") " pod="openstack/swift-ring-rebalance-cbrqw" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.793640 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794c6bdd-2ec7-458f-99ed-23383a740479-combined-ca-bundle\") pod \"swift-ring-rebalance-cbrqw\" (UID: \"794c6bdd-2ec7-458f-99ed-23383a740479\") " pod="openstack/swift-ring-rebalance-cbrqw" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.794075 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/794c6bdd-2ec7-458f-99ed-23383a740479-swiftconf\") pod \"swift-ring-rebalance-cbrqw\" (UID: \"794c6bdd-2ec7-458f-99ed-23383a740479\") " pod="openstack/swift-ring-rebalance-cbrqw" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.807127 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzb5l\" (UniqueName: \"kubernetes.io/projected/794c6bdd-2ec7-458f-99ed-23383a740479-kube-api-access-dzb5l\") pod \"swift-ring-rebalance-cbrqw\" (UID: \"794c6bdd-2ec7-458f-99ed-23383a740479\") " pod="openstack/swift-ring-rebalance-cbrqw" Nov 27 16:57:59 crc kubenswrapper[4954]: I1127 16:57:59.887288 4954 util.go:30] "No sandbox for pod can be found. 
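
[Editor's note] These two workloads are deliberately coupled: swift-ring-rebalance-cbrqw builds the Swift rings and publishes them as the swift-ring-files ConfigMap, while swift-storage-0's projected etc-swift volume consumes that ConfigMap, which is why its mount keeps failing until the job has run. Kubelet's retry loop resolves this on its own; an operator-side wait on the same precondition could look like this (hypothetical helper using standard client-go calls):

    package sketch

    import (
        "context"
        "time"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
    )

    // waitForConfigMap polls until the named ConfigMap exists, the same condition the
    // etc-swift mount is retrying on. Illustrative; kubelet itself just retries the mount.
    func waitForConfigMap(ctx context.Context, cs kubernetes.Interface, ns, name string) error {
        for {
            if _, err := cs.CoreV1().ConfigMaps(ns).Get(ctx, name, metav1.GetOptions{}); err == nil {
                return nil
            }
            select {
            case <-ctx.Done():
                return ctx.Err()
            case <-time.After(2 * time.Second):
            }
        }
    }
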
Need to start a new one" pod="openstack/swift-ring-rebalance-cbrqw" Nov 27 16:58:00 crc kubenswrapper[4954]: I1127 16:58:00.134547 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-cbrqw"] Nov 27 16:58:00 crc kubenswrapper[4954]: I1127 16:58:00.212319 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d11a38a9-30c1-44d2-81ca-965f0dfbde96","Type":"ContainerStarted","Data":"a4fb07f6984a7bb80661c4bd85a925cf9b10c8addbbca8951c73c7bd7502c030"} Nov 27 16:58:00 crc kubenswrapper[4954]: I1127 16:58:00.212371 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d11a38a9-30c1-44d2-81ca-965f0dfbde96","Type":"ContainerStarted","Data":"03e2cbe167e8df97a82d512f8cf059e8cfd555dd0ca976dc0eba25c65f29cb08"} Nov 27 16:58:00 crc kubenswrapper[4954]: I1127 16:58:00.212735 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Nov 27 16:58:00 crc kubenswrapper[4954]: I1127 16:58:00.214425 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-cbrqw" event={"ID":"794c6bdd-2ec7-458f-99ed-23383a740479","Type":"ContainerStarted","Data":"e9e9a321a2bd1508013538edcaf8adfc547d66a520883400ebd2c60bb5de5a62"} Nov 27 16:58:00 crc kubenswrapper[4954]: I1127 16:58:00.217162 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-s4c94" event={"ID":"b0f43cc8-5d12-4705-9ef8-06e5e03f7147","Type":"ContainerDied","Data":"db9395ed8b5fc1fa78a31d3cc4fca09eb81aca4384389e2a60acb473a5b22e14"} Nov 27 16:58:00 crc kubenswrapper[4954]: I1127 16:58:00.217200 4954 scope.go:117] "RemoveContainer" containerID="b593431ae1b0897eb8dae35a0c727ad7d7f28b11b73da1a89b0bc6dd8a18fb1c" Nov 27 16:58:00 crc kubenswrapper[4954]: I1127 16:58:00.217217 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-s4c94" Nov 27 16:58:00 crc kubenswrapper[4954]: I1127 16:58:00.218941 4954 generic.go:334] "Generic (PLEG): container finished" podID="957f20f3-9f5d-4342-a3db-9c5b726bdb5d" containerID="4ced774e6da3ae575b60ee12144c696a4177873894cdbd011bf6f8dc37592ae9" exitCode=0 Nov 27 16:58:00 crc kubenswrapper[4954]: I1127 16:58:00.218993 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-g7xsr" event={"ID":"957f20f3-9f5d-4342-a3db-9c5b726bdb5d","Type":"ContainerDied","Data":"4ced774e6da3ae575b60ee12144c696a4177873894cdbd011bf6f8dc37592ae9"} Nov 27 16:58:00 crc kubenswrapper[4954]: I1127 16:58:00.239692 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.7279598910000002 podStartE2EDuration="4.239669542s" podCreationTimestamp="2025-11-27 16:57:56 +0000 UTC" firstStartedPulling="2025-11-27 16:57:57.582138395 +0000 UTC m=+1189.599578695" lastFinishedPulling="2025-11-27 16:57:59.093848046 +0000 UTC m=+1191.111288346" observedRunningTime="2025-11-27 16:58:00.237777466 +0000 UTC m=+1192.255217846" watchObservedRunningTime="2025-11-27 16:58:00.239669542 +0000 UTC m=+1192.257109852" Nov 27 16:58:00 crc kubenswrapper[4954]: I1127 16:58:00.328449 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-s4c94"] Nov 27 16:58:00 crc kubenswrapper[4954]: I1127 16:58:00.335249 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-s4c94"] Nov 27 16:58:00 crc kubenswrapper[4954]: E1127 16:58:00.454467 4954 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.204:56980->38.102.83.204:42783: write tcp 38.102.83.204:56980->38.102.83.204:42783: write: broken pipe Nov 27 16:58:00 crc kubenswrapper[4954]: I1127 16:58:00.674024 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0f43cc8-5d12-4705-9ef8-06e5e03f7147" path="/var/lib/kubelet/pods/b0f43cc8-5d12-4705-9ef8-06e5e03f7147/volumes" Nov 27 16:58:00 crc kubenswrapper[4954]: I1127 16:58:00.807166 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cff965fb-87ef-40a5-9dff-7d10d74cc09c-etc-swift\") pod \"swift-storage-0\" (UID: \"cff965fb-87ef-40a5-9dff-7d10d74cc09c\") " pod="openstack/swift-storage-0" Nov 27 16:58:00 crc kubenswrapper[4954]: E1127 16:58:00.807540 4954 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 27 16:58:00 crc kubenswrapper[4954]: E1127 16:58:00.807622 4954 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 27 16:58:00 crc kubenswrapper[4954]: E1127 16:58:00.807730 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cff965fb-87ef-40a5-9dff-7d10d74cc09c-etc-swift podName:cff965fb-87ef-40a5-9dff-7d10d74cc09c nodeName:}" failed. No retries permitted until 2025-11-27 16:58:02.807689331 +0000 UTC m=+1194.825129671 (durationBeforeRetry 2s). 
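
[Editor's note] The pod_startup_latency_tracker line for ovn-northd-0 above shows how its two durations relate: podStartE2EDuration is wall-clock from pod creation to observed running (4.239669542s), while podStartSLOduration excludes the image-pull window (16:57:57.582138395 to 16:57:59.093848046, about 1.512s), leaving roughly 2.728s, which matches the reported SLO value up to float formatting. Pods with zeroed pull timestamps (ovn-controller-metrics, dnsmasq) report identical values for both. Checking the arithmetic in Go, with the values copied from that line:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        first, _ := time.Parse(time.RFC3339Nano, "2025-11-27T16:57:57.582138395Z")
        last, _ := time.Parse(time.RFC3339Nano, "2025-11-27T16:57:59.093848046Z")
        e2e := 4239669542 * time.Nanosecond // podStartE2EDuration

        // podStartSLOduration = end-to-end duration minus time spent pulling images
        slo := e2e - last.Sub(first)
        fmt.Println(slo) // ~2.727959891s
    }
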
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/cff965fb-87ef-40a5-9dff-7d10d74cc09c-etc-swift") pod "swift-storage-0" (UID: "cff965fb-87ef-40a5-9dff-7d10d74cc09c") : configmap "swift-ring-files" not found Nov 27 16:58:01 crc kubenswrapper[4954]: I1127 16:58:01.231042 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-g7xsr" event={"ID":"957f20f3-9f5d-4342-a3db-9c5b726bdb5d","Type":"ContainerStarted","Data":"d069ff1d843550024346892f6f2b2b9b0fe694b5aa2f0c06ae719b8277540afa"} Nov 27 16:58:01 crc kubenswrapper[4954]: I1127 16:58:01.262225 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-g7xsr" podStartSLOduration=4.2621815 podStartE2EDuration="4.2621815s" podCreationTimestamp="2025-11-27 16:57:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:58:01.260293935 +0000 UTC m=+1193.277734265" watchObservedRunningTime="2025-11-27 16:58:01.2621815 +0000 UTC m=+1193.279621800" Nov 27 16:58:02 crc kubenswrapper[4954]: I1127 16:58:02.240651 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-g7xsr" Nov 27 16:58:02 crc kubenswrapper[4954]: I1127 16:58:02.865300 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cff965fb-87ef-40a5-9dff-7d10d74cc09c-etc-swift\") pod \"swift-storage-0\" (UID: \"cff965fb-87ef-40a5-9dff-7d10d74cc09c\") " pod="openstack/swift-storage-0" Nov 27 16:58:02 crc kubenswrapper[4954]: E1127 16:58:02.865655 4954 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 27 16:58:02 crc kubenswrapper[4954]: E1127 16:58:02.865713 4954 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 27 16:58:02 crc kubenswrapper[4954]: E1127 16:58:02.865818 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cff965fb-87ef-40a5-9dff-7d10d74cc09c-etc-swift podName:cff965fb-87ef-40a5-9dff-7d10d74cc09c nodeName:}" failed. No retries permitted until 2025-11-27 16:58:06.865783745 +0000 UTC m=+1198.883224085 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/cff965fb-87ef-40a5-9dff-7d10d74cc09c-etc-swift") pod "swift-storage-0" (UID: "cff965fb-87ef-40a5-9dff-7d10d74cc09c") : configmap "swift-ring-files" not found Nov 27 16:58:04 crc kubenswrapper[4954]: I1127 16:58:04.115730 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Nov 27 16:58:04 crc kubenswrapper[4954]: I1127 16:58:04.116446 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Nov 27 16:58:04 crc kubenswrapper[4954]: I1127 16:58:04.185856 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Nov 27 16:58:04 crc kubenswrapper[4954]: I1127 16:58:04.269232 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-cbrqw" event={"ID":"794c6bdd-2ec7-458f-99ed-23383a740479","Type":"ContainerStarted","Data":"969fd1bc59c3807c8debb16763312223deb9ce4795afd5377909adf143cb5682"} Nov 27 16:58:04 crc kubenswrapper[4954]: I1127 16:58:04.285204 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-cbrqw" podStartSLOduration=2.057530526 podStartE2EDuration="5.285185423s" podCreationTimestamp="2025-11-27 16:57:59 +0000 UTC" firstStartedPulling="2025-11-27 16:58:00.1408565 +0000 UTC m=+1192.158296810" lastFinishedPulling="2025-11-27 16:58:03.368511407 +0000 UTC m=+1195.385951707" observedRunningTime="2025-11-27 16:58:04.284529907 +0000 UTC m=+1196.301970207" watchObservedRunningTime="2025-11-27 16:58:04.285185423 +0000 UTC m=+1196.302625713" Nov 27 16:58:04 crc kubenswrapper[4954]: I1127 16:58:04.340788 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Nov 27 16:58:05 crc kubenswrapper[4954]: I1127 16:58:05.606967 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-f6f0-account-create-update-jlfdn"] Nov 27 16:58:05 crc kubenswrapper[4954]: I1127 16:58:05.608066 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-f6f0-account-create-update-jlfdn" Nov 27 16:58:05 crc kubenswrapper[4954]: I1127 16:58:05.611245 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Nov 27 16:58:05 crc kubenswrapper[4954]: I1127 16:58:05.625548 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-f6f0-account-create-update-jlfdn"] Nov 27 16:58:05 crc kubenswrapper[4954]: I1127 16:58:05.639192 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Nov 27 16:58:05 crc kubenswrapper[4954]: I1127 16:58:05.639225 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Nov 27 16:58:05 crc kubenswrapper[4954]: I1127 16:58:05.651116 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-mmfbq"] Nov 27 16:58:05 crc kubenswrapper[4954]: I1127 16:58:05.652249 4954 util.go:30] "No sandbox for pod can be found. 
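
The etc-swift mount attempts above illustrate the kubelet's per-volume exponential backoff: after each failed MountVolume.SetUp the operation is blocked for a doubling interval (durationBeforeRetry 2s, then 4s, and 8s further down) until the missing ConfigMap shows up. A minimal Go sketch of that doubling schedule; the 2s starting point is taken from the log, while the 2m2s cap is an assumption for illustration and is not visible in this excerpt:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Doubling retry schedule, mirroring the durationBeforeRetry fields
	// above (2s -> 4s -> 8s -> ...). The cap is assumed, not shown in
	// this log excerpt.
	delay := 2 * time.Second
	maxDelay := 2*time.Minute + 2*time.Second
	for attempt := 1; attempt <= 8; attempt++ {
		fmt.Printf("attempt %d: no retries permitted for %v\n", attempt, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}
```
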
Need to start a new one" pod="openstack/keystone-db-create-mmfbq" Nov 27 16:58:05 crc kubenswrapper[4954]: I1127 16:58:05.690589 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-mmfbq"] Nov 27 16:58:05 crc kubenswrapper[4954]: I1127 16:58:05.728810 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djwx4\" (UniqueName: \"kubernetes.io/projected/bf7e3231-2480-4075-80dc-0f44cc159964-kube-api-access-djwx4\") pod \"keystone-f6f0-account-create-update-jlfdn\" (UID: \"bf7e3231-2480-4075-80dc-0f44cc159964\") " pod="openstack/keystone-f6f0-account-create-update-jlfdn" Nov 27 16:58:05 crc kubenswrapper[4954]: I1127 16:58:05.728895 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhjkl\" (UniqueName: \"kubernetes.io/projected/4b5522ab-dc06-4c46-8a1b-fa7d94b058e1-kube-api-access-zhjkl\") pod \"keystone-db-create-mmfbq\" (UID: \"4b5522ab-dc06-4c46-8a1b-fa7d94b058e1\") " pod="openstack/keystone-db-create-mmfbq" Nov 27 16:58:05 crc kubenswrapper[4954]: I1127 16:58:05.728931 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf7e3231-2480-4075-80dc-0f44cc159964-operator-scripts\") pod \"keystone-f6f0-account-create-update-jlfdn\" (UID: \"bf7e3231-2480-4075-80dc-0f44cc159964\") " pod="openstack/keystone-f6f0-account-create-update-jlfdn" Nov 27 16:58:05 crc kubenswrapper[4954]: I1127 16:58:05.728954 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b5522ab-dc06-4c46-8a1b-fa7d94b058e1-operator-scripts\") pod \"keystone-db-create-mmfbq\" (UID: \"4b5522ab-dc06-4c46-8a1b-fa7d94b058e1\") " pod="openstack/keystone-db-create-mmfbq" Nov 27 16:58:05 crc kubenswrapper[4954]: I1127 16:58:05.839959 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b5522ab-dc06-4c46-8a1b-fa7d94b058e1-operator-scripts\") pod \"keystone-db-create-mmfbq\" (UID: \"4b5522ab-dc06-4c46-8a1b-fa7d94b058e1\") " pod="openstack/keystone-db-create-mmfbq" Nov 27 16:58:05 crc kubenswrapper[4954]: I1127 16:58:05.840161 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djwx4\" (UniqueName: \"kubernetes.io/projected/bf7e3231-2480-4075-80dc-0f44cc159964-kube-api-access-djwx4\") pod \"keystone-f6f0-account-create-update-jlfdn\" (UID: \"bf7e3231-2480-4075-80dc-0f44cc159964\") " pod="openstack/keystone-f6f0-account-create-update-jlfdn" Nov 27 16:58:05 crc kubenswrapper[4954]: I1127 16:58:05.840202 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhjkl\" (UniqueName: \"kubernetes.io/projected/4b5522ab-dc06-4c46-8a1b-fa7d94b058e1-kube-api-access-zhjkl\") pod \"keystone-db-create-mmfbq\" (UID: \"4b5522ab-dc06-4c46-8a1b-fa7d94b058e1\") " pod="openstack/keystone-db-create-mmfbq" Nov 27 16:58:05 crc kubenswrapper[4954]: I1127 16:58:05.840230 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf7e3231-2480-4075-80dc-0f44cc159964-operator-scripts\") pod \"keystone-f6f0-account-create-update-jlfdn\" (UID: \"bf7e3231-2480-4075-80dc-0f44cc159964\") " 
pod="openstack/keystone-f6f0-account-create-update-jlfdn" Nov 27 16:58:05 crc kubenswrapper[4954]: I1127 16:58:05.841164 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf7e3231-2480-4075-80dc-0f44cc159964-operator-scripts\") pod \"keystone-f6f0-account-create-update-jlfdn\" (UID: \"bf7e3231-2480-4075-80dc-0f44cc159964\") " pod="openstack/keystone-f6f0-account-create-update-jlfdn" Nov 27 16:58:05 crc kubenswrapper[4954]: I1127 16:58:05.841465 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b5522ab-dc06-4c46-8a1b-fa7d94b058e1-operator-scripts\") pod \"keystone-db-create-mmfbq\" (UID: \"4b5522ab-dc06-4c46-8a1b-fa7d94b058e1\") " pod="openstack/keystone-db-create-mmfbq" Nov 27 16:58:05 crc kubenswrapper[4954]: I1127 16:58:05.869259 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhjkl\" (UniqueName: \"kubernetes.io/projected/4b5522ab-dc06-4c46-8a1b-fa7d94b058e1-kube-api-access-zhjkl\") pod \"keystone-db-create-mmfbq\" (UID: \"4b5522ab-dc06-4c46-8a1b-fa7d94b058e1\") " pod="openstack/keystone-db-create-mmfbq" Nov 27 16:58:05 crc kubenswrapper[4954]: I1127 16:58:05.884153 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djwx4\" (UniqueName: \"kubernetes.io/projected/bf7e3231-2480-4075-80dc-0f44cc159964-kube-api-access-djwx4\") pod \"keystone-f6f0-account-create-update-jlfdn\" (UID: \"bf7e3231-2480-4075-80dc-0f44cc159964\") " pod="openstack/keystone-f6f0-account-create-update-jlfdn" Nov 27 16:58:05 crc kubenswrapper[4954]: I1127 16:58:05.897419 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Nov 27 16:58:05 crc kubenswrapper[4954]: I1127 16:58:05.916521 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-pk6xr"] Nov 27 16:58:05 crc kubenswrapper[4954]: I1127 16:58:05.919440 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-pk6xr" Nov 27 16:58:05 crc kubenswrapper[4954]: I1127 16:58:05.929763 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-pk6xr"] Nov 27 16:58:05 crc kubenswrapper[4954]: I1127 16:58:05.944447 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-f6f0-account-create-update-jlfdn" Nov 27 16:58:05 crc kubenswrapper[4954]: I1127 16:58:05.984966 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-1e64-account-create-update-g7spt"] Nov 27 16:58:05 crc kubenswrapper[4954]: I1127 16:58:05.986324 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-1e64-account-create-update-g7spt" Nov 27 16:58:05 crc kubenswrapper[4954]: I1127 16:58:05.992566 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Nov 27 16:58:06 crc kubenswrapper[4954]: I1127 16:58:06.005500 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-1e64-account-create-update-g7spt"] Nov 27 16:58:06 crc kubenswrapper[4954]: I1127 16:58:06.036982 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-mmfbq" Nov 27 16:58:06 crc kubenswrapper[4954]: I1127 16:58:06.045994 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frk5c\" (UniqueName: \"kubernetes.io/projected/fdd20293-bf3f-44be-b18d-d6053638d393-kube-api-access-frk5c\") pod \"placement-db-create-pk6xr\" (UID: \"fdd20293-bf3f-44be-b18d-d6053638d393\") " pod="openstack/placement-db-create-pk6xr" Nov 27 16:58:06 crc kubenswrapper[4954]: I1127 16:58:06.046035 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdd20293-bf3f-44be-b18d-d6053638d393-operator-scripts\") pod \"placement-db-create-pk6xr\" (UID: \"fdd20293-bf3f-44be-b18d-d6053638d393\") " pod="openstack/placement-db-create-pk6xr" Nov 27 16:58:06 crc kubenswrapper[4954]: I1127 16:58:06.046074 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf895a7e-aada-4a88-814e-1a6b38ff6616-operator-scripts\") pod \"placement-1e64-account-create-update-g7spt\" (UID: \"cf895a7e-aada-4a88-814e-1a6b38ff6616\") " pod="openstack/placement-1e64-account-create-update-g7spt" Nov 27 16:58:06 crc kubenswrapper[4954]: I1127 16:58:06.046129 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g2th\" (UniqueName: \"kubernetes.io/projected/cf895a7e-aada-4a88-814e-1a6b38ff6616-kube-api-access-2g2th\") pod \"placement-1e64-account-create-update-g7spt\" (UID: \"cf895a7e-aada-4a88-814e-1a6b38ff6616\") " pod="openstack/placement-1e64-account-create-update-g7spt" Nov 27 16:58:06 crc kubenswrapper[4954]: I1127 16:58:06.150081 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g2th\" (UniqueName: \"kubernetes.io/projected/cf895a7e-aada-4a88-814e-1a6b38ff6616-kube-api-access-2g2th\") pod \"placement-1e64-account-create-update-g7spt\" (UID: \"cf895a7e-aada-4a88-814e-1a6b38ff6616\") " pod="openstack/placement-1e64-account-create-update-g7spt" Nov 27 16:58:06 crc kubenswrapper[4954]: I1127 16:58:06.150515 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frk5c\" (UniqueName: \"kubernetes.io/projected/fdd20293-bf3f-44be-b18d-d6053638d393-kube-api-access-frk5c\") pod \"placement-db-create-pk6xr\" (UID: \"fdd20293-bf3f-44be-b18d-d6053638d393\") " pod="openstack/placement-db-create-pk6xr" Nov 27 16:58:06 crc kubenswrapper[4954]: I1127 16:58:06.150539 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdd20293-bf3f-44be-b18d-d6053638d393-operator-scripts\") pod \"placement-db-create-pk6xr\" (UID: \"fdd20293-bf3f-44be-b18d-d6053638d393\") " pod="openstack/placement-db-create-pk6xr" Nov 27 16:58:06 crc kubenswrapper[4954]: I1127 16:58:06.150571 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf895a7e-aada-4a88-814e-1a6b38ff6616-operator-scripts\") pod \"placement-1e64-account-create-update-g7spt\" (UID: \"cf895a7e-aada-4a88-814e-1a6b38ff6616\") " pod="openstack/placement-1e64-account-create-update-g7spt" Nov 27 16:58:06 crc kubenswrapper[4954]: I1127 16:58:06.154206 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf895a7e-aada-4a88-814e-1a6b38ff6616-operator-scripts\") pod \"placement-1e64-account-create-update-g7spt\" (UID: \"cf895a7e-aada-4a88-814e-1a6b38ff6616\") " pod="openstack/placement-1e64-account-create-update-g7spt" Nov 27 16:58:06 crc kubenswrapper[4954]: I1127 16:58:06.154998 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdd20293-bf3f-44be-b18d-d6053638d393-operator-scripts\") pod \"placement-db-create-pk6xr\" (UID: \"fdd20293-bf3f-44be-b18d-d6053638d393\") " pod="openstack/placement-db-create-pk6xr" Nov 27 16:58:06 crc kubenswrapper[4954]: I1127 16:58:06.174894 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-dqrlj"] Nov 27 16:58:06 crc kubenswrapper[4954]: I1127 16:58:06.177435 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-dqrlj" Nov 27 16:58:06 crc kubenswrapper[4954]: I1127 16:58:06.187833 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g2th\" (UniqueName: \"kubernetes.io/projected/cf895a7e-aada-4a88-814e-1a6b38ff6616-kube-api-access-2g2th\") pod \"placement-1e64-account-create-update-g7spt\" (UID: \"cf895a7e-aada-4a88-814e-1a6b38ff6616\") " pod="openstack/placement-1e64-account-create-update-g7spt" Nov 27 16:58:06 crc kubenswrapper[4954]: I1127 16:58:06.189997 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frk5c\" (UniqueName: \"kubernetes.io/projected/fdd20293-bf3f-44be-b18d-d6053638d393-kube-api-access-frk5c\") pod \"placement-db-create-pk6xr\" (UID: \"fdd20293-bf3f-44be-b18d-d6053638d393\") " pod="openstack/placement-db-create-pk6xr" Nov 27 16:58:06 crc kubenswrapper[4954]: I1127 16:58:06.202794 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-dqrlj"] Nov 27 16:58:06 crc kubenswrapper[4954]: I1127 16:58:06.240459 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-ae31-account-create-update-xs8vj"] Nov 27 16:58:06 crc kubenswrapper[4954]: I1127 16:58:06.241680 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-ae31-account-create-update-xs8vj" Nov 27 16:58:06 crc kubenswrapper[4954]: I1127 16:58:06.246649 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-ae31-account-create-update-xs8vj"] Nov 27 16:58:06 crc kubenswrapper[4954]: I1127 16:58:06.248039 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Nov 27 16:58:06 crc kubenswrapper[4954]: I1127 16:58:06.252049 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp6nj\" (UniqueName: \"kubernetes.io/projected/060ee5fd-88d7-4172-8196-ffeeb09be3b6-kube-api-access-cp6nj\") pod \"glance-db-create-dqrlj\" (UID: \"060ee5fd-88d7-4172-8196-ffeeb09be3b6\") " pod="openstack/glance-db-create-dqrlj" Nov 27 16:58:06 crc kubenswrapper[4954]: I1127 16:58:06.252169 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/060ee5fd-88d7-4172-8196-ffeeb09be3b6-operator-scripts\") pod \"glance-db-create-dqrlj\" (UID: \"060ee5fd-88d7-4172-8196-ffeeb09be3b6\") " pod="openstack/glance-db-create-dqrlj" Nov 27 16:58:06 crc kubenswrapper[4954]: I1127 16:58:06.269095 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-pk6xr" Nov 27 16:58:06 crc kubenswrapper[4954]: I1127 16:58:06.366726 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt55b\" (UniqueName: \"kubernetes.io/projected/a4f1d5b5-69ba-453c-90cc-85210e24e5d3-kube-api-access-zt55b\") pod \"glance-ae31-account-create-update-xs8vj\" (UID: \"a4f1d5b5-69ba-453c-90cc-85210e24e5d3\") " pod="openstack/glance-ae31-account-create-update-xs8vj" Nov 27 16:58:06 crc kubenswrapper[4954]: I1127 16:58:06.366860 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp6nj\" (UniqueName: \"kubernetes.io/projected/060ee5fd-88d7-4172-8196-ffeeb09be3b6-kube-api-access-cp6nj\") pod \"glance-db-create-dqrlj\" (UID: \"060ee5fd-88d7-4172-8196-ffeeb09be3b6\") " pod="openstack/glance-db-create-dqrlj" Nov 27 16:58:06 crc kubenswrapper[4954]: I1127 16:58:06.366891 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4f1d5b5-69ba-453c-90cc-85210e24e5d3-operator-scripts\") pod \"glance-ae31-account-create-update-xs8vj\" (UID: \"a4f1d5b5-69ba-453c-90cc-85210e24e5d3\") " pod="openstack/glance-ae31-account-create-update-xs8vj" Nov 27 16:58:06 crc kubenswrapper[4954]: I1127 16:58:06.366936 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/060ee5fd-88d7-4172-8196-ffeeb09be3b6-operator-scripts\") pod \"glance-db-create-dqrlj\" (UID: \"060ee5fd-88d7-4172-8196-ffeeb09be3b6\") " pod="openstack/glance-db-create-dqrlj" Nov 27 16:58:06 crc kubenswrapper[4954]: I1127 16:58:06.368723 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/060ee5fd-88d7-4172-8196-ffeeb09be3b6-operator-scripts\") pod \"glance-db-create-dqrlj\" (UID: \"060ee5fd-88d7-4172-8196-ffeeb09be3b6\") " pod="openstack/glance-db-create-dqrlj" Nov 27 16:58:06 crc kubenswrapper[4954]: I1127 16:58:06.388226 4954 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-cp6nj\" (UniqueName: \"kubernetes.io/projected/060ee5fd-88d7-4172-8196-ffeeb09be3b6-kube-api-access-cp6nj\") pod \"glance-db-create-dqrlj\" (UID: \"060ee5fd-88d7-4172-8196-ffeeb09be3b6\") " pod="openstack/glance-db-create-dqrlj" Nov 27 16:58:06 crc kubenswrapper[4954]: I1127 16:58:06.399954 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-1e64-account-create-update-g7spt" Nov 27 16:58:06 crc kubenswrapper[4954]: I1127 16:58:06.419042 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-f6f0-account-create-update-jlfdn"] Nov 27 16:58:06 crc kubenswrapper[4954]: I1127 16:58:06.420859 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Nov 27 16:58:06 crc kubenswrapper[4954]: I1127 16:58:06.480717 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4f1d5b5-69ba-453c-90cc-85210e24e5d3-operator-scripts\") pod \"glance-ae31-account-create-update-xs8vj\" (UID: \"a4f1d5b5-69ba-453c-90cc-85210e24e5d3\") " pod="openstack/glance-ae31-account-create-update-xs8vj" Nov 27 16:58:06 crc kubenswrapper[4954]: I1127 16:58:06.481054 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt55b\" (UniqueName: \"kubernetes.io/projected/a4f1d5b5-69ba-453c-90cc-85210e24e5d3-kube-api-access-zt55b\") pod \"glance-ae31-account-create-update-xs8vj\" (UID: \"a4f1d5b5-69ba-453c-90cc-85210e24e5d3\") " pod="openstack/glance-ae31-account-create-update-xs8vj" Nov 27 16:58:06 crc kubenswrapper[4954]: I1127 16:58:06.482053 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4f1d5b5-69ba-453c-90cc-85210e24e5d3-operator-scripts\") pod \"glance-ae31-account-create-update-xs8vj\" (UID: \"a4f1d5b5-69ba-453c-90cc-85210e24e5d3\") " pod="openstack/glance-ae31-account-create-update-xs8vj" Nov 27 16:58:06 crc kubenswrapper[4954]: I1127 16:58:06.498017 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-dqrlj" Nov 27 16:58:06 crc kubenswrapper[4954]: I1127 16:58:06.506477 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt55b\" (UniqueName: \"kubernetes.io/projected/a4f1d5b5-69ba-453c-90cc-85210e24e5d3-kube-api-access-zt55b\") pod \"glance-ae31-account-create-update-xs8vj\" (UID: \"a4f1d5b5-69ba-453c-90cc-85210e24e5d3\") " pod="openstack/glance-ae31-account-create-update-xs8vj" Nov 27 16:58:06 crc kubenswrapper[4954]: I1127 16:58:06.561431 4954 util.go:30] "No sandbox for pod can be found. 
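
The galera lines above show the standard probe ordering: the startup probe reports "unhealthy" until the database comes up, flips to "started", and only then do readiness probes run and eventually report "ready". Probes of this kind are typically simple connectivity or query checks; a minimal sketch of a TCP readiness check in the spirit of the dnsmasq probe that fails later in this log (address and timeout are illustrative):

```go
package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	// Illustrative TCP probe; compare the later failure
	// "dial tcp 10.217.0.97:5353: connect: connection refused".
	conn, err := net.DialTimeout("tcp", "10.217.0.97:5353", time.Second)
	if err != nil {
		fmt.Println("probe failed:", err)
		return
	}
	conn.Close()
	fmt.Println("probe ok")
}
```
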
Need to start a new one" pod="openstack/glance-ae31-account-create-update-xs8vj" Nov 27 16:58:06 crc kubenswrapper[4954]: I1127 16:58:06.579004 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-pk6xr"] Nov 27 16:58:06 crc kubenswrapper[4954]: I1127 16:58:06.605124 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-mmfbq"] Nov 27 16:58:06 crc kubenswrapper[4954]: I1127 16:58:06.907653 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cff965fb-87ef-40a5-9dff-7d10d74cc09c-etc-swift\") pod \"swift-storage-0\" (UID: \"cff965fb-87ef-40a5-9dff-7d10d74cc09c\") " pod="openstack/swift-storage-0" Nov 27 16:58:06 crc kubenswrapper[4954]: E1127 16:58:06.908299 4954 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 27 16:58:06 crc kubenswrapper[4954]: E1127 16:58:06.908315 4954 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 27 16:58:06 crc kubenswrapper[4954]: E1127 16:58:06.908371 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cff965fb-87ef-40a5-9dff-7d10d74cc09c-etc-swift podName:cff965fb-87ef-40a5-9dff-7d10d74cc09c nodeName:}" failed. No retries permitted until 2025-11-27 16:58:14.908347174 +0000 UTC m=+1206.925787474 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/cff965fb-87ef-40a5-9dff-7d10d74cc09c-etc-swift") pod "swift-storage-0" (UID: "cff965fb-87ef-40a5-9dff-7d10d74cc09c") : configmap "swift-ring-files" not found Nov 27 16:58:06 crc kubenswrapper[4954]: I1127 16:58:06.974286 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-ae31-account-create-update-xs8vj"] Nov 27 16:58:06 crc kubenswrapper[4954]: I1127 16:58:06.983650 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-1e64-account-create-update-g7spt"] Nov 27 16:58:07 crc kubenswrapper[4954]: I1127 16:58:07.050366 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-dqrlj"] Nov 27 16:58:07 crc kubenswrapper[4954]: I1127 16:58:07.301435 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f6f0-account-create-update-jlfdn" event={"ID":"bf7e3231-2480-4075-80dc-0f44cc159964","Type":"ContainerStarted","Data":"ef7cb6324e37e0f74cb1580c1addb8ed96937c4f8b62f85ec8198a04c2fc51d5"} Nov 27 16:58:07 crc kubenswrapper[4954]: I1127 16:58:07.301933 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f6f0-account-create-update-jlfdn" event={"ID":"bf7e3231-2480-4075-80dc-0f44cc159964","Type":"ContainerStarted","Data":"beabf647120dd74a660396be2356c3334b7f641c648430e706461cb840db53a7"} Nov 27 16:58:07 crc kubenswrapper[4954]: I1127 16:58:07.305607 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-dqrlj" event={"ID":"060ee5fd-88d7-4172-8196-ffeeb09be3b6","Type":"ContainerStarted","Data":"be363bef120adf9f5adb7037fda23cb69a45f34b961a108f05b1df9e40c5f404"} Nov 27 16:58:07 crc kubenswrapper[4954]: I1127 16:58:07.307790 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1e64-account-create-update-g7spt" 
event={"ID":"cf895a7e-aada-4a88-814e-1a6b38ff6616","Type":"ContainerStarted","Data":"64d027316d8174dc51a9c48a1a0e86ca4641a47329148957bc641ce61b045ca2"} Nov 27 16:58:07 crc kubenswrapper[4954]: I1127 16:58:07.311166 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-pk6xr" event={"ID":"fdd20293-bf3f-44be-b18d-d6053638d393","Type":"ContainerStarted","Data":"270984d6e913433dd553a4e4f306d7220f48cb90b9e51da0c0e03535c2dfc081"} Nov 27 16:58:07 crc kubenswrapper[4954]: I1127 16:58:07.311224 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-pk6xr" event={"ID":"fdd20293-bf3f-44be-b18d-d6053638d393","Type":"ContainerStarted","Data":"bc399e5876ad1f0d0200ccfad87bb88ce80b7beb01b51c4d1ee4c1984c910114"} Nov 27 16:58:07 crc kubenswrapper[4954]: I1127 16:58:07.344951 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mmfbq" event={"ID":"4b5522ab-dc06-4c46-8a1b-fa7d94b058e1","Type":"ContainerStarted","Data":"1c47d1e6187a7cc94e3378e5dca8ae078b8e60905b259c2e5e67ca6cb3f05fa3"} Nov 27 16:58:07 crc kubenswrapper[4954]: I1127 16:58:07.345024 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mmfbq" event={"ID":"4b5522ab-dc06-4c46-8a1b-fa7d94b058e1","Type":"ContainerStarted","Data":"106544185d9be0bcca15e307e016380e23601e6a8d812dc6de61a358177714dd"} Nov 27 16:58:07 crc kubenswrapper[4954]: I1127 16:58:07.345412 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-pk6xr" podStartSLOduration=2.345383379 podStartE2EDuration="2.345383379s" podCreationTimestamp="2025-11-27 16:58:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:58:07.345194254 +0000 UTC m=+1199.362634554" watchObservedRunningTime="2025-11-27 16:58:07.345383379 +0000 UTC m=+1199.362823689" Nov 27 16:58:07 crc kubenswrapper[4954]: I1127 16:58:07.349418 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-ae31-account-create-update-xs8vj" event={"ID":"a4f1d5b5-69ba-453c-90cc-85210e24e5d3","Type":"ContainerStarted","Data":"0857734bcee123ca8afd137e6e9effa1a01d9a609595e7a7d11fe204b2e1c92f"} Nov 27 16:58:07 crc kubenswrapper[4954]: I1127 16:58:07.389859 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-f6f0-account-create-update-jlfdn" podStartSLOduration=2.389813749 podStartE2EDuration="2.389813749s" podCreationTimestamp="2025-11-27 16:58:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:58:07.320137476 +0000 UTC m=+1199.337577776" watchObservedRunningTime="2025-11-27 16:58:07.389813749 +0000 UTC m=+1199.407254049" Nov 27 16:58:07 crc kubenswrapper[4954]: I1127 16:58:07.408153 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-mmfbq" podStartSLOduration=2.408132364 podStartE2EDuration="2.408132364s" podCreationTimestamp="2025-11-27 16:58:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:58:07.364874743 +0000 UTC m=+1199.382315063" watchObservedRunningTime="2025-11-27 16:58:07.408132364 +0000 UTC m=+1199.425572664" Nov 27 16:58:08 crc kubenswrapper[4954]: I1127 16:58:08.318471 4954 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-g7xsr" Nov 27 16:58:08 crc kubenswrapper[4954]: I1127 16:58:08.367923 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1e64-account-create-update-g7spt" event={"ID":"cf895a7e-aada-4a88-814e-1a6b38ff6616","Type":"ContainerStarted","Data":"9fab585b26cdd5fc6d56ebc649e3e35945e283fe38823ea96461b370ae5e58e2"} Nov 27 16:58:08 crc kubenswrapper[4954]: I1127 16:58:08.374464 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-ae31-account-create-update-xs8vj" event={"ID":"a4f1d5b5-69ba-453c-90cc-85210e24e5d3","Type":"ContainerStarted","Data":"393d06451177be61fcf0672ff85912ca2b9d32b28a5cac8fa99cfba9333a0d29"} Nov 27 16:58:08 crc kubenswrapper[4954]: I1127 16:58:08.379680 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-dqrlj" event={"ID":"060ee5fd-88d7-4172-8196-ffeeb09be3b6","Type":"ContainerStarted","Data":"1b6a8b850d7dfa55bc34e5fd934ca1b713624d490e0696e38242d855913c0504"} Nov 27 16:58:08 crc kubenswrapper[4954]: I1127 16:58:08.411678 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-nnk5z"] Nov 27 16:58:08 crc kubenswrapper[4954]: I1127 16:58:08.411979 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-nnk5z" podUID="644e4b3e-4237-4179-8775-63cde7f94338" containerName="dnsmasq-dns" containerID="cri-o://3ad4afc8238191c70f11816ed77f9229615fb78f43db6c58aa1b400f777eb469" gracePeriod=10 Nov 27 16:58:08 crc kubenswrapper[4954]: I1127 16:58:08.416821 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-1e64-account-create-update-g7spt" podStartSLOduration=3.416794896 podStartE2EDuration="3.416794896s" podCreationTimestamp="2025-11-27 16:58:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:58:08.387180756 +0000 UTC m=+1200.404621076" watchObservedRunningTime="2025-11-27 16:58:08.416794896 +0000 UTC m=+1200.434235196" Nov 27 16:58:08 crc kubenswrapper[4954]: I1127 16:58:08.428017 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-dqrlj" podStartSLOduration=2.427991918 podStartE2EDuration="2.427991918s" podCreationTimestamp="2025-11-27 16:58:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:58:08.4132555 +0000 UTC m=+1200.430695810" watchObservedRunningTime="2025-11-27 16:58:08.427991918 +0000 UTC m=+1200.445432218" Nov 27 16:58:08 crc kubenswrapper[4954]: I1127 16:58:08.451865 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-ae31-account-create-update-xs8vj" podStartSLOduration=2.451839158 podStartE2EDuration="2.451839158s" podCreationTimestamp="2025-11-27 16:58:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:58:08.447907372 +0000 UTC m=+1200.465347672" watchObservedRunningTime="2025-11-27 16:58:08.451839158 +0000 UTC m=+1200.469279458" Nov 27 16:58:11 crc kubenswrapper[4954]: I1127 16:58:11.418365 4954 generic.go:334] "Generic (PLEG): container finished" podID="644e4b3e-4237-4179-8775-63cde7f94338" containerID="3ad4afc8238191c70f11816ed77f9229615fb78f43db6c58aa1b400f777eb469" exitCode=0 Nov 27 
16:58:11 crc kubenswrapper[4954]: I1127 16:58:11.418460 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-nnk5z" event={"ID":"644e4b3e-4237-4179-8775-63cde7f94338","Type":"ContainerDied","Data":"3ad4afc8238191c70f11816ed77f9229615fb78f43db6c58aa1b400f777eb469"} Nov 27 16:58:11 crc kubenswrapper[4954]: I1127 16:58:11.774003 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57d769cc4f-nnk5z" podUID="644e4b3e-4237-4179-8775-63cde7f94338" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.97:5353: connect: connection refused" Nov 27 16:58:12 crc kubenswrapper[4954]: I1127 16:58:12.130016 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Nov 27 16:58:12 crc kubenswrapper[4954]: I1127 16:58:12.430240 4954 generic.go:334] "Generic (PLEG): container finished" podID="4b5522ab-dc06-4c46-8a1b-fa7d94b058e1" containerID="1c47d1e6187a7cc94e3378e5dca8ae078b8e60905b259c2e5e67ca6cb3f05fa3" exitCode=0 Nov 27 16:58:12 crc kubenswrapper[4954]: I1127 16:58:12.430289 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mmfbq" event={"ID":"4b5522ab-dc06-4c46-8a1b-fa7d94b058e1","Type":"ContainerDied","Data":"1c47d1e6187a7cc94e3378e5dca8ae078b8e60905b259c2e5e67ca6cb3f05fa3"} Nov 27 16:58:13 crc kubenswrapper[4954]: I1127 16:58:13.440927 4954 generic.go:334] "Generic (PLEG): container finished" podID="060ee5fd-88d7-4172-8196-ffeeb09be3b6" containerID="1b6a8b850d7dfa55bc34e5fd934ca1b713624d490e0696e38242d855913c0504" exitCode=0 Nov 27 16:58:13 crc kubenswrapper[4954]: I1127 16:58:13.441041 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-dqrlj" event={"ID":"060ee5fd-88d7-4172-8196-ffeeb09be3b6","Type":"ContainerDied","Data":"1b6a8b850d7dfa55bc34e5fd934ca1b713624d490e0696e38242d855913c0504"} Nov 27 16:58:13 crc kubenswrapper[4954]: I1127 16:58:13.444023 4954 generic.go:334] "Generic (PLEG): container finished" podID="fdd20293-bf3f-44be-b18d-d6053638d393" containerID="270984d6e913433dd553a4e4f306d7220f48cb90b9e51da0c0e03535c2dfc081" exitCode=0 Nov 27 16:58:13 crc kubenswrapper[4954]: I1127 16:58:13.444099 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-pk6xr" event={"ID":"fdd20293-bf3f-44be-b18d-d6053638d393","Type":"ContainerDied","Data":"270984d6e913433dd553a4e4f306d7220f48cb90b9e51da0c0e03535c2dfc081"} Nov 27 16:58:14 crc kubenswrapper[4954]: I1127 16:58:14.989775 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cff965fb-87ef-40a5-9dff-7d10d74cc09c-etc-swift\") pod \"swift-storage-0\" (UID: \"cff965fb-87ef-40a5-9dff-7d10d74cc09c\") " pod="openstack/swift-storage-0" Nov 27 16:58:15 crc kubenswrapper[4954]: I1127 16:58:15.000095 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cff965fb-87ef-40a5-9dff-7d10d74cc09c-etc-swift\") pod \"swift-storage-0\" (UID: \"cff965fb-87ef-40a5-9dff-7d10d74cc09c\") " pod="openstack/swift-storage-0" Nov 27 16:58:15 crc kubenswrapper[4954]: I1127 16:58:15.162475 4954 util.go:30] "No sandbox for pod can be found. 
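
Note the timing here: the etc-swift MountVolume.SetUp finally succeeds at 16:58:15, once the swift-ring-rebalance-cbrqw job has produced the swift-ring-files ConfigMap the projected volume was waiting for, and swift-storage-0 can at last get a sandbox. Purely for illustration, a hedged client-go sketch of creating such a ConfigMap; the key and payload are hypothetical, since in this deployment the rebalance job, not a hand-written manifest, publishes the ring files:

```go
package main

import (
	"context"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	cm := &corev1.ConfigMap{
		ObjectMeta: metav1.ObjectMeta{Name: "swift-ring-files", Namespace: "openstack"},
		// Hypothetical key: the real ring data is generated by
		// swift-ring-builder inside the rebalance job.
		BinaryData: map[string][]byte{"object.ring.gz": {}},
	}
	if _, err := cs.CoreV1().ConfigMaps("openstack").Create(context.TODO(), cm, metav1.CreateOptions{}); err != nil {
		panic(err)
	}
}
```
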
Need to start a new one" pod="openstack/swift-storage-0" Nov 27 16:58:16 crc kubenswrapper[4954]: I1127 16:58:16.481704 4954 generic.go:334] "Generic (PLEG): container finished" podID="794c6bdd-2ec7-458f-99ed-23383a740479" containerID="969fd1bc59c3807c8debb16763312223deb9ce4795afd5377909adf143cb5682" exitCode=0 Nov 27 16:58:16 crc kubenswrapper[4954]: I1127 16:58:16.481772 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-cbrqw" event={"ID":"794c6bdd-2ec7-458f-99ed-23383a740479","Type":"ContainerDied","Data":"969fd1bc59c3807c8debb16763312223deb9ce4795afd5377909adf143cb5682"} Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.496815 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-pk6xr" event={"ID":"fdd20293-bf3f-44be-b18d-d6053638d393","Type":"ContainerDied","Data":"bc399e5876ad1f0d0200ccfad87bb88ce80b7beb01b51c4d1ee4c1984c910114"} Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.496931 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc399e5876ad1f0d0200ccfad87bb88ce80b7beb01b51c4d1ee4c1984c910114" Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.500649 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mmfbq" event={"ID":"4b5522ab-dc06-4c46-8a1b-fa7d94b058e1","Type":"ContainerDied","Data":"106544185d9be0bcca15e307e016380e23601e6a8d812dc6de61a358177714dd"} Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.500707 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="106544185d9be0bcca15e307e016380e23601e6a8d812dc6de61a358177714dd" Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.503875 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-nnk5z" event={"ID":"644e4b3e-4237-4179-8775-63cde7f94338","Type":"ContainerDied","Data":"cdb9dee9a04e9a22ad068a436703b681fbfd09db74cd7b15910a868b28145507"} Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.503930 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cdb9dee9a04e9a22ad068a436703b681fbfd09db74cd7b15910a868b28145507" Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.505791 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-dqrlj" event={"ID":"060ee5fd-88d7-4172-8196-ffeeb09be3b6","Type":"ContainerDied","Data":"be363bef120adf9f5adb7037fda23cb69a45f34b961a108f05b1df9e40c5f404"} Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.506175 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be363bef120adf9f5adb7037fda23cb69a45f34b961a108f05b1df9e40c5f404" Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.608652 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-mmfbq" Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.608769 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-pk6xr" Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.625446 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-nnk5z" Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.626168 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-dqrlj" Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.740217 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkblm\" (UniqueName: \"kubernetes.io/projected/644e4b3e-4237-4179-8775-63cde7f94338-kube-api-access-rkblm\") pod \"644e4b3e-4237-4179-8775-63cde7f94338\" (UID: \"644e4b3e-4237-4179-8775-63cde7f94338\") " Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.740276 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frk5c\" (UniqueName: \"kubernetes.io/projected/fdd20293-bf3f-44be-b18d-d6053638d393-kube-api-access-frk5c\") pod \"fdd20293-bf3f-44be-b18d-d6053638d393\" (UID: \"fdd20293-bf3f-44be-b18d-d6053638d393\") " Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.740307 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdd20293-bf3f-44be-b18d-d6053638d393-operator-scripts\") pod \"fdd20293-bf3f-44be-b18d-d6053638d393\" (UID: \"fdd20293-bf3f-44be-b18d-d6053638d393\") " Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.740374 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b5522ab-dc06-4c46-8a1b-fa7d94b058e1-operator-scripts\") pod \"4b5522ab-dc06-4c46-8a1b-fa7d94b058e1\" (UID: \"4b5522ab-dc06-4c46-8a1b-fa7d94b058e1\") " Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.740456 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/644e4b3e-4237-4179-8775-63cde7f94338-config\") pod \"644e4b3e-4237-4179-8775-63cde7f94338\" (UID: \"644e4b3e-4237-4179-8775-63cde7f94338\") " Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.740505 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/644e4b3e-4237-4179-8775-63cde7f94338-dns-svc\") pod \"644e4b3e-4237-4179-8775-63cde7f94338\" (UID: \"644e4b3e-4237-4179-8775-63cde7f94338\") " Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.740549 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cp6nj\" (UniqueName: \"kubernetes.io/projected/060ee5fd-88d7-4172-8196-ffeeb09be3b6-kube-api-access-cp6nj\") pod \"060ee5fd-88d7-4172-8196-ffeeb09be3b6\" (UID: \"060ee5fd-88d7-4172-8196-ffeeb09be3b6\") " Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.740567 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/060ee5fd-88d7-4172-8196-ffeeb09be3b6-operator-scripts\") pod \"060ee5fd-88d7-4172-8196-ffeeb09be3b6\" (UID: \"060ee5fd-88d7-4172-8196-ffeeb09be3b6\") " Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.740658 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhjkl\" (UniqueName: \"kubernetes.io/projected/4b5522ab-dc06-4c46-8a1b-fa7d94b058e1-kube-api-access-zhjkl\") pod \"4b5522ab-dc06-4c46-8a1b-fa7d94b058e1\" (UID: \"4b5522ab-dc06-4c46-8a1b-fa7d94b058e1\") " Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.741326 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdd20293-bf3f-44be-b18d-d6053638d393-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"fdd20293-bf3f-44be-b18d-d6053638d393" (UID: "fdd20293-bf3f-44be-b18d-d6053638d393"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.741364 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b5522ab-dc06-4c46-8a1b-fa7d94b058e1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4b5522ab-dc06-4c46-8a1b-fa7d94b058e1" (UID: "4b5522ab-dc06-4c46-8a1b-fa7d94b058e1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.742317 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/060ee5fd-88d7-4172-8196-ffeeb09be3b6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "060ee5fd-88d7-4172-8196-ffeeb09be3b6" (UID: "060ee5fd-88d7-4172-8196-ffeeb09be3b6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.748892 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/644e4b3e-4237-4179-8775-63cde7f94338-kube-api-access-rkblm" (OuterVolumeSpecName: "kube-api-access-rkblm") pod "644e4b3e-4237-4179-8775-63cde7f94338" (UID: "644e4b3e-4237-4179-8775-63cde7f94338"). InnerVolumeSpecName "kube-api-access-rkblm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.748938 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b5522ab-dc06-4c46-8a1b-fa7d94b058e1-kube-api-access-zhjkl" (OuterVolumeSpecName: "kube-api-access-zhjkl") pod "4b5522ab-dc06-4c46-8a1b-fa7d94b058e1" (UID: "4b5522ab-dc06-4c46-8a1b-fa7d94b058e1"). InnerVolumeSpecName "kube-api-access-zhjkl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.766879 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdd20293-bf3f-44be-b18d-d6053638d393-kube-api-access-frk5c" (OuterVolumeSpecName: "kube-api-access-frk5c") pod "fdd20293-bf3f-44be-b18d-d6053638d393" (UID: "fdd20293-bf3f-44be-b18d-d6053638d393"). InnerVolumeSpecName "kube-api-access-frk5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.766945 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/060ee5fd-88d7-4172-8196-ffeeb09be3b6-kube-api-access-cp6nj" (OuterVolumeSpecName: "kube-api-access-cp6nj") pod "060ee5fd-88d7-4172-8196-ffeeb09be3b6" (UID: "060ee5fd-88d7-4172-8196-ffeeb09be3b6"). InnerVolumeSpecName "kube-api-access-cp6nj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.785094 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/644e4b3e-4237-4179-8775-63cde7f94338-config" (OuterVolumeSpecName: "config") pod "644e4b3e-4237-4179-8775-63cde7f94338" (UID: "644e4b3e-4237-4179-8775-63cde7f94338"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.810650 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/644e4b3e-4237-4179-8775-63cde7f94338-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "644e4b3e-4237-4179-8775-63cde7f94338" (UID: "644e4b3e-4237-4179-8775-63cde7f94338"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.842939 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhjkl\" (UniqueName: \"kubernetes.io/projected/4b5522ab-dc06-4c46-8a1b-fa7d94b058e1-kube-api-access-zhjkl\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.842981 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkblm\" (UniqueName: \"kubernetes.io/projected/644e4b3e-4237-4179-8775-63cde7f94338-kube-api-access-rkblm\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.842994 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frk5c\" (UniqueName: \"kubernetes.io/projected/fdd20293-bf3f-44be-b18d-d6053638d393-kube-api-access-frk5c\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.843008 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdd20293-bf3f-44be-b18d-d6053638d393-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.843020 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b5522ab-dc06-4c46-8a1b-fa7d94b058e1-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.843034 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/644e4b3e-4237-4179-8775-63cde7f94338-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.843047 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/644e4b3e-4237-4179-8775-63cde7f94338-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.843059 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cp6nj\" (UniqueName: \"kubernetes.io/projected/060ee5fd-88d7-4172-8196-ffeeb09be3b6-kube-api-access-cp6nj\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.843071 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/060ee5fd-88d7-4172-8196-ffeeb09be3b6-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.856975 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-cbrqw" Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.944996 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/794c6bdd-2ec7-458f-99ed-23383a740479-swiftconf\") pod \"794c6bdd-2ec7-458f-99ed-23383a740479\" (UID: \"794c6bdd-2ec7-458f-99ed-23383a740479\") " Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.945413 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794c6bdd-2ec7-458f-99ed-23383a740479-combined-ca-bundle\") pod \"794c6bdd-2ec7-458f-99ed-23383a740479\" (UID: \"794c6bdd-2ec7-458f-99ed-23383a740479\") " Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.945520 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/794c6bdd-2ec7-458f-99ed-23383a740479-ring-data-devices\") pod \"794c6bdd-2ec7-458f-99ed-23383a740479\" (UID: \"794c6bdd-2ec7-458f-99ed-23383a740479\") " Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.945547 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzb5l\" (UniqueName: \"kubernetes.io/projected/794c6bdd-2ec7-458f-99ed-23383a740479-kube-api-access-dzb5l\") pod \"794c6bdd-2ec7-458f-99ed-23383a740479\" (UID: \"794c6bdd-2ec7-458f-99ed-23383a740479\") " Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.945597 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/794c6bdd-2ec7-458f-99ed-23383a740479-dispersionconf\") pod \"794c6bdd-2ec7-458f-99ed-23383a740479\" (UID: \"794c6bdd-2ec7-458f-99ed-23383a740479\") " Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.945632 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/794c6bdd-2ec7-458f-99ed-23383a740479-etc-swift\") pod \"794c6bdd-2ec7-458f-99ed-23383a740479\" (UID: \"794c6bdd-2ec7-458f-99ed-23383a740479\") " Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.945690 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/794c6bdd-2ec7-458f-99ed-23383a740479-scripts\") pod \"794c6bdd-2ec7-458f-99ed-23383a740479\" (UID: \"794c6bdd-2ec7-458f-99ed-23383a740479\") " Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.946014 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/794c6bdd-2ec7-458f-99ed-23383a740479-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "794c6bdd-2ec7-458f-99ed-23383a740479" (UID: "794c6bdd-2ec7-458f-99ed-23383a740479"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.946522 4954 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/794c6bdd-2ec7-458f-99ed-23383a740479-ring-data-devices\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.947184 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/794c6bdd-2ec7-458f-99ed-23383a740479-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "794c6bdd-2ec7-458f-99ed-23383a740479" (UID: "794c6bdd-2ec7-458f-99ed-23383a740479"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.953933 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/794c6bdd-2ec7-458f-99ed-23383a740479-kube-api-access-dzb5l" (OuterVolumeSpecName: "kube-api-access-dzb5l") pod "794c6bdd-2ec7-458f-99ed-23383a740479" (UID: "794c6bdd-2ec7-458f-99ed-23383a740479"). InnerVolumeSpecName "kube-api-access-dzb5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.954771 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/794c6bdd-2ec7-458f-99ed-23383a740479-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "794c6bdd-2ec7-458f-99ed-23383a740479" (UID: "794c6bdd-2ec7-458f-99ed-23383a740479"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.968756 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/794c6bdd-2ec7-458f-99ed-23383a740479-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "794c6bdd-2ec7-458f-99ed-23383a740479" (UID: "794c6bdd-2ec7-458f-99ed-23383a740479"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.971122 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/794c6bdd-2ec7-458f-99ed-23383a740479-scripts" (OuterVolumeSpecName: "scripts") pod "794c6bdd-2ec7-458f-99ed-23383a740479" (UID: "794c6bdd-2ec7-458f-99ed-23383a740479"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:58:17 crc kubenswrapper[4954]: I1127 16:58:17.978262 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/794c6bdd-2ec7-458f-99ed-23383a740479-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "794c6bdd-2ec7-458f-99ed-23383a740479" (UID: "794c6bdd-2ec7-458f-99ed-23383a740479"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:58:18 crc kubenswrapper[4954]: I1127 16:58:18.048518 4954 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/794c6bdd-2ec7-458f-99ed-23383a740479-swiftconf\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:18 crc kubenswrapper[4954]: I1127 16:58:18.048563 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794c6bdd-2ec7-458f-99ed-23383a740479-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:18 crc kubenswrapper[4954]: I1127 16:58:18.048603 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzb5l\" (UniqueName: \"kubernetes.io/projected/794c6bdd-2ec7-458f-99ed-23383a740479-kube-api-access-dzb5l\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:18 crc kubenswrapper[4954]: I1127 16:58:18.048617 4954 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/794c6bdd-2ec7-458f-99ed-23383a740479-dispersionconf\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:18 crc kubenswrapper[4954]: I1127 16:58:18.048629 4954 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/794c6bdd-2ec7-458f-99ed-23383a740479-etc-swift\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:18 crc kubenswrapper[4954]: I1127 16:58:18.048639 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/794c6bdd-2ec7-458f-99ed-23383a740479-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:18 crc kubenswrapper[4954]: I1127 16:58:18.081151 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Nov 27 16:58:18 crc kubenswrapper[4954]: W1127 16:58:18.091008 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcff965fb_87ef_40a5_9dff_7d10d74cc09c.slice/crio-5410d6e6744188aa34cbd3e4e0f67b419ff150b0595d504d07903136fd2ab976 WatchSource:0}: Error finding container 5410d6e6744188aa34cbd3e4e0f67b419ff150b0595d504d07903136fd2ab976: Status 404 returned error can't find the container with id 5410d6e6744188aa34cbd3e4e0f67b419ff150b0595d504d07903136fd2ab976 Nov 27 16:58:18 crc kubenswrapper[4954]: I1127 16:58:18.519023 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-cbrqw" event={"ID":"794c6bdd-2ec7-458f-99ed-23383a740479","Type":"ContainerDied","Data":"e9e9a321a2bd1508013538edcaf8adfc547d66a520883400ebd2c60bb5de5a62"} Nov 27 16:58:18 crc kubenswrapper[4954]: I1127 16:58:18.519098 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9e9a321a2bd1508013538edcaf8adfc547d66a520883400ebd2c60bb5de5a62" Nov 27 16:58:18 crc kubenswrapper[4954]: I1127 16:58:18.519163 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-cbrqw" Nov 27 16:58:18 crc kubenswrapper[4954]: I1127 16:58:18.521103 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-mmfbq" Nov 27 16:58:18 crc kubenswrapper[4954]: I1127 16:58:18.521656 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cff965fb-87ef-40a5-9dff-7d10d74cc09c","Type":"ContainerStarted","Data":"5410d6e6744188aa34cbd3e4e0f67b419ff150b0595d504d07903136fd2ab976"} Nov 27 16:58:18 crc kubenswrapper[4954]: I1127 16:58:18.521794 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-nnk5z" Nov 27 16:58:18 crc kubenswrapper[4954]: I1127 16:58:18.521867 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-pk6xr" Nov 27 16:58:18 crc kubenswrapper[4954]: I1127 16:58:18.521798 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-dqrlj" Nov 27 16:58:18 crc kubenswrapper[4954]: I1127 16:58:18.622032 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-nnk5z"] Nov 27 16:58:18 crc kubenswrapper[4954]: I1127 16:58:18.629276 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-nnk5z"] Nov 27 16:58:18 crc kubenswrapper[4954]: I1127 16:58:18.678486 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="644e4b3e-4237-4179-8775-63cde7f94338" path="/var/lib/kubelet/pods/644e4b3e-4237-4179-8775-63cde7f94338/volumes" Nov 27 16:58:19 crc kubenswrapper[4954]: I1127 16:58:19.529367 4954 generic.go:334] "Generic (PLEG): container finished" podID="a4f1d5b5-69ba-453c-90cc-85210e24e5d3" containerID="393d06451177be61fcf0672ff85912ca2b9d32b28a5cac8fa99cfba9333a0d29" exitCode=0 Nov 27 16:58:19 crc kubenswrapper[4954]: I1127 16:58:19.530614 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-ae31-account-create-update-xs8vj" event={"ID":"a4f1d5b5-69ba-453c-90cc-85210e24e5d3","Type":"ContainerDied","Data":"393d06451177be61fcf0672ff85912ca2b9d32b28a5cac8fa99cfba9333a0d29"} Nov 27 16:58:19 crc kubenswrapper[4954]: I1127 16:58:19.532720 4954 generic.go:334] "Generic (PLEG): container finished" podID="bf7e3231-2480-4075-80dc-0f44cc159964" containerID="ef7cb6324e37e0f74cb1580c1addb8ed96937c4f8b62f85ec8198a04c2fc51d5" exitCode=0 Nov 27 16:58:19 crc kubenswrapper[4954]: I1127 16:58:19.533114 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f6f0-account-create-update-jlfdn" event={"ID":"bf7e3231-2480-4075-80dc-0f44cc159964","Type":"ContainerDied","Data":"ef7cb6324e37e0f74cb1580c1addb8ed96937c4f8b62f85ec8198a04c2fc51d5"} Nov 27 16:58:19 crc kubenswrapper[4954]: I1127 16:58:19.534922 4954 generic.go:334] "Generic (PLEG): container finished" podID="cf895a7e-aada-4a88-814e-1a6b38ff6616" containerID="9fab585b26cdd5fc6d56ebc649e3e35945e283fe38823ea96461b370ae5e58e2" exitCode=0 Nov 27 16:58:19 crc kubenswrapper[4954]: I1127 16:58:19.535039 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1e64-account-create-update-g7spt" event={"ID":"cf895a7e-aada-4a88-814e-1a6b38ff6616","Type":"ContainerDied","Data":"9fab585b26cdd5fc6d56ebc649e3e35945e283fe38823ea96461b370ae5e58e2"} Nov 27 16:58:20 crc kubenswrapper[4954]: I1127 16:58:20.548943 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cff965fb-87ef-40a5-9dff-7d10d74cc09c","Type":"ContainerStarted","Data":"e90d4fae1f235b14e9c4ddf9454374b22e3b2e2daf7ec5000d33935855441e84"} Nov 
27 16:58:20 crc kubenswrapper[4954]: I1127 16:58:20.549514 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cff965fb-87ef-40a5-9dff-7d10d74cc09c","Type":"ContainerStarted","Data":"8e3d4017e61b496278b271c668be6a97cb433145be53a376e7c33e6386d01ebc"} Nov 27 16:58:20 crc kubenswrapper[4954]: I1127 16:58:20.549540 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cff965fb-87ef-40a5-9dff-7d10d74cc09c","Type":"ContainerStarted","Data":"bb42b06541ed8b8204397cc9f8566f89b0a744e176ed980f2381c60c1947d604"} Nov 27 16:58:20 crc kubenswrapper[4954]: I1127 16:58:20.549561 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cff965fb-87ef-40a5-9dff-7d10d74cc09c","Type":"ContainerStarted","Data":"d8458767abf4af440913b657edde41b769fe7daa3079a7c6c87b5bd9cb2adaf8"} Nov 27 16:58:21 crc kubenswrapper[4954]: I1127 16:58:21.053009 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-f6f0-account-create-update-jlfdn" Nov 27 16:58:21 crc kubenswrapper[4954]: I1127 16:58:21.058075 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-ae31-account-create-update-xs8vj" Nov 27 16:58:21 crc kubenswrapper[4954]: I1127 16:58:21.067787 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-1e64-account-create-update-g7spt" Nov 27 16:58:21 crc kubenswrapper[4954]: I1127 16:58:21.110652 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf895a7e-aada-4a88-814e-1a6b38ff6616-operator-scripts\") pod \"cf895a7e-aada-4a88-814e-1a6b38ff6616\" (UID: \"cf895a7e-aada-4a88-814e-1a6b38ff6616\") " Nov 27 16:58:21 crc kubenswrapper[4954]: I1127 16:58:21.110998 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2g2th\" (UniqueName: \"kubernetes.io/projected/cf895a7e-aada-4a88-814e-1a6b38ff6616-kube-api-access-2g2th\") pod \"cf895a7e-aada-4a88-814e-1a6b38ff6616\" (UID: \"cf895a7e-aada-4a88-814e-1a6b38ff6616\") " Nov 27 16:58:21 crc kubenswrapper[4954]: I1127 16:58:21.111126 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4f1d5b5-69ba-453c-90cc-85210e24e5d3-operator-scripts\") pod \"a4f1d5b5-69ba-453c-90cc-85210e24e5d3\" (UID: \"a4f1d5b5-69ba-453c-90cc-85210e24e5d3\") " Nov 27 16:58:21 crc kubenswrapper[4954]: I1127 16:58:21.111238 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf7e3231-2480-4075-80dc-0f44cc159964-operator-scripts\") pod \"bf7e3231-2480-4075-80dc-0f44cc159964\" (UID: \"bf7e3231-2480-4075-80dc-0f44cc159964\") " Nov 27 16:58:21 crc kubenswrapper[4954]: I1127 16:58:21.111360 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zt55b\" (UniqueName: \"kubernetes.io/projected/a4f1d5b5-69ba-453c-90cc-85210e24e5d3-kube-api-access-zt55b\") pod \"a4f1d5b5-69ba-453c-90cc-85210e24e5d3\" (UID: \"a4f1d5b5-69ba-453c-90cc-85210e24e5d3\") " Nov 27 16:58:21 crc kubenswrapper[4954]: I1127 16:58:21.111461 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djwx4\" (UniqueName: 
\"kubernetes.io/projected/bf7e3231-2480-4075-80dc-0f44cc159964-kube-api-access-djwx4\") pod \"bf7e3231-2480-4075-80dc-0f44cc159964\" (UID: \"bf7e3231-2480-4075-80dc-0f44cc159964\") " Nov 27 16:58:21 crc kubenswrapper[4954]: I1127 16:58:21.114314 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf895a7e-aada-4a88-814e-1a6b38ff6616-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cf895a7e-aada-4a88-814e-1a6b38ff6616" (UID: "cf895a7e-aada-4a88-814e-1a6b38ff6616"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:58:21 crc kubenswrapper[4954]: I1127 16:58:21.114846 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf7e3231-2480-4075-80dc-0f44cc159964-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bf7e3231-2480-4075-80dc-0f44cc159964" (UID: "bf7e3231-2480-4075-80dc-0f44cc159964"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:58:21 crc kubenswrapper[4954]: I1127 16:58:21.115368 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4f1d5b5-69ba-453c-90cc-85210e24e5d3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a4f1d5b5-69ba-453c-90cc-85210e24e5d3" (UID: "a4f1d5b5-69ba-453c-90cc-85210e24e5d3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:58:21 crc kubenswrapper[4954]: I1127 16:58:21.118325 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf895a7e-aada-4a88-814e-1a6b38ff6616-kube-api-access-2g2th" (OuterVolumeSpecName: "kube-api-access-2g2th") pod "cf895a7e-aada-4a88-814e-1a6b38ff6616" (UID: "cf895a7e-aada-4a88-814e-1a6b38ff6616"). InnerVolumeSpecName "kube-api-access-2g2th". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:58:21 crc kubenswrapper[4954]: I1127 16:58:21.118998 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4f1d5b5-69ba-453c-90cc-85210e24e5d3-kube-api-access-zt55b" (OuterVolumeSpecName: "kube-api-access-zt55b") pod "a4f1d5b5-69ba-453c-90cc-85210e24e5d3" (UID: "a4f1d5b5-69ba-453c-90cc-85210e24e5d3"). InnerVolumeSpecName "kube-api-access-zt55b". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:58:21 crc kubenswrapper[4954]: I1127 16:58:21.124013 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf7e3231-2480-4075-80dc-0f44cc159964-kube-api-access-djwx4" (OuterVolumeSpecName: "kube-api-access-djwx4") pod "bf7e3231-2480-4075-80dc-0f44cc159964" (UID: "bf7e3231-2480-4075-80dc-0f44cc159964"). InnerVolumeSpecName "kube-api-access-djwx4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:58:21 crc kubenswrapper[4954]: I1127 16:58:21.213401 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zt55b\" (UniqueName: \"kubernetes.io/projected/a4f1d5b5-69ba-453c-90cc-85210e24e5d3-kube-api-access-zt55b\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:21 crc kubenswrapper[4954]: I1127 16:58:21.213431 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djwx4\" (UniqueName: \"kubernetes.io/projected/bf7e3231-2480-4075-80dc-0f44cc159964-kube-api-access-djwx4\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:21 crc kubenswrapper[4954]: I1127 16:58:21.213441 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf895a7e-aada-4a88-814e-1a6b38ff6616-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:21 crc kubenswrapper[4954]: I1127 16:58:21.213499 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2g2th\" (UniqueName: \"kubernetes.io/projected/cf895a7e-aada-4a88-814e-1a6b38ff6616-kube-api-access-2g2th\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:21 crc kubenswrapper[4954]: I1127 16:58:21.213510 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4f1d5b5-69ba-453c-90cc-85210e24e5d3-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:21 crc kubenswrapper[4954]: I1127 16:58:21.213520 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf7e3231-2480-4075-80dc-0f44cc159964-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:21 crc kubenswrapper[4954]: I1127 16:58:21.559486 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-ae31-account-create-update-xs8vj" Nov 27 16:58:21 crc kubenswrapper[4954]: I1127 16:58:21.559486 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-ae31-account-create-update-xs8vj" event={"ID":"a4f1d5b5-69ba-453c-90cc-85210e24e5d3","Type":"ContainerDied","Data":"0857734bcee123ca8afd137e6e9effa1a01d9a609595e7a7d11fe204b2e1c92f"} Nov 27 16:58:21 crc kubenswrapper[4954]: I1127 16:58:21.559659 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0857734bcee123ca8afd137e6e9effa1a01d9a609595e7a7d11fe204b2e1c92f" Nov 27 16:58:21 crc kubenswrapper[4954]: I1127 16:58:21.564271 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-f6f0-account-create-update-jlfdn" Nov 27 16:58:21 crc kubenswrapper[4954]: I1127 16:58:21.564262 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f6f0-account-create-update-jlfdn" event={"ID":"bf7e3231-2480-4075-80dc-0f44cc159964","Type":"ContainerDied","Data":"beabf647120dd74a660396be2356c3334b7f641c648430e706461cb840db53a7"} Nov 27 16:58:21 crc kubenswrapper[4954]: I1127 16:58:21.564353 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="beabf647120dd74a660396be2356c3334b7f641c648430e706461cb840db53a7" Nov 27 16:58:21 crc kubenswrapper[4954]: I1127 16:58:21.569748 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1e64-account-create-update-g7spt" event={"ID":"cf895a7e-aada-4a88-814e-1a6b38ff6616","Type":"ContainerDied","Data":"64d027316d8174dc51a9c48a1a0e86ca4641a47329148957bc641ce61b045ca2"} Nov 27 16:58:21 crc kubenswrapper[4954]: I1127 16:58:21.569785 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64d027316d8174dc51a9c48a1a0e86ca4641a47329148957bc641ce61b045ca2" Nov 27 16:58:21 crc kubenswrapper[4954]: I1127 16:58:21.569816 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-1e64-account-create-update-g7spt" Nov 27 16:58:21 crc kubenswrapper[4954]: I1127 16:58:21.737280 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-s7sc8" podUID="2a98905f-a2dd-4eb2-9a4f-437eb3626871" containerName="ovn-controller" probeResult="failure" output=< Nov 27 16:58:21 crc kubenswrapper[4954]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Nov 27 16:58:21 crc kubenswrapper[4954]: > Nov 27 16:58:21 crc kubenswrapper[4954]: I1127 16:58:21.772815 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57d769cc4f-nnk5z" podUID="644e4b3e-4237-4179-8775-63cde7f94338" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.97:5353: i/o timeout" Nov 27 16:58:22 crc kubenswrapper[4954]: I1127 16:58:22.592818 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cff965fb-87ef-40a5-9dff-7d10d74cc09c","Type":"ContainerStarted","Data":"2711e0e1da4179a80a953958b1a87a3fe4698433231a69ffde8c6fc70570cfca"} Nov 27 16:58:22 crc kubenswrapper[4954]: I1127 16:58:22.592874 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cff965fb-87ef-40a5-9dff-7d10d74cc09c","Type":"ContainerStarted","Data":"8d338b60f73a788c9d6b28d68205fc39090627aa9a14b55423e1b6489df9f96c"} Nov 27 16:58:22 crc kubenswrapper[4954]: I1127 16:58:22.592889 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cff965fb-87ef-40a5-9dff-7d10d74cc09c","Type":"ContainerStarted","Data":"403cc5675da0f4fa661b448f1a2d31559895b13eae461741fa50dc2f68e37ed0"} Nov 27 16:58:22 crc kubenswrapper[4954]: I1127 16:58:22.592901 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cff965fb-87ef-40a5-9dff-7d10d74cc09c","Type":"ContainerStarted","Data":"b96e65f4d361abd6dae70dc6a16557bee32a1e5c935cfe2173c1205e0650febf"} Nov 27 16:58:23 crc kubenswrapper[4954]: I1127 16:58:23.687791 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:58:23 crc kubenswrapper[4954]: I1127 16:58:23.688388 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 16:58:24 crc kubenswrapper[4954]: I1127 16:58:24.623069 4954 generic.go:334] "Generic (PLEG): container finished" podID="37b16922-ac4b-4c0f-bf9c-444474fe1e08" containerID="445f8d4ba8edbf32d835aee9867360a8ad19116e7317ee02107d314c316b88c3" exitCode=0 Nov 27 16:58:24 crc kubenswrapper[4954]: I1127 16:58:24.623218 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"37b16922-ac4b-4c0f-bf9c-444474fe1e08","Type":"ContainerDied","Data":"445f8d4ba8edbf32d835aee9867360a8ad19116e7317ee02107d314c316b88c3"} Nov 27 16:58:24 crc kubenswrapper[4954]: I1127 16:58:24.626539 4954 generic.go:334] "Generic (PLEG): container finished" podID="70949f64-380f-4947-a55a-8780126c7ba4" containerID="36df7c7eb591cb47cfc65798bc7acff77ecfcbcf7991a576639fa7c256680166" exitCode=0 Nov 27 16:58:24 crc kubenswrapper[4954]: I1127 16:58:24.626624 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"70949f64-380f-4947-a55a-8780126c7ba4","Type":"ContainerDied","Data":"36df7c7eb591cb47cfc65798bc7acff77ecfcbcf7991a576639fa7c256680166"} Nov 27 16:58:24 crc kubenswrapper[4954]: I1127 16:58:24.655828 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cff965fb-87ef-40a5-9dff-7d10d74cc09c","Type":"ContainerStarted","Data":"f3b122e78ce35e2b37b1a2a25941462b52f7e9a493354cb5267e4e20e13a1017"} Nov 27 16:58:24 crc kubenswrapper[4954]: I1127 16:58:24.655876 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cff965fb-87ef-40a5-9dff-7d10d74cc09c","Type":"ContainerStarted","Data":"4d98db5b051ebbe60fc72d80ed331dc0329b5e8415916f071267de7fe7e220ac"} Nov 27 16:58:24 crc kubenswrapper[4954]: I1127 16:58:24.655892 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cff965fb-87ef-40a5-9dff-7d10d74cc09c","Type":"ContainerStarted","Data":"825efd177c2b261c6cd5b8bb991c79f0d4881fdb44978812dc024e33459ebf67"} Nov 27 16:58:24 crc kubenswrapper[4954]: I1127 16:58:24.655902 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cff965fb-87ef-40a5-9dff-7d10d74cc09c","Type":"ContainerStarted","Data":"f199b1f7da6691ea7a537ac82373fc4cef9a36a86105a327fd4e999c34832605"} Nov 27 16:58:25 crc kubenswrapper[4954]: I1127 16:58:25.677686 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cff965fb-87ef-40a5-9dff-7d10d74cc09c","Type":"ContainerStarted","Data":"aa0f79bec7f6429be48051bec8a56a7aa8e06a5cfb78413824f60953eb72fd2f"} Nov 27 16:58:25 crc kubenswrapper[4954]: I1127 16:58:25.678232 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cff965fb-87ef-40a5-9dff-7d10d74cc09c","Type":"ContainerStarted","Data":"df004ebe29ec5e28c66358a2249d0b9d9c9a347ac41c16b0030512f06856ff4e"} Nov 27 16:58:25 crc kubenswrapper[4954]: I1127 16:58:25.678247 4954 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cff965fb-87ef-40a5-9dff-7d10d74cc09c","Type":"ContainerStarted","Data":"d3b6c4efb650e59a711ada0f8b83a45565f0c391d4a69b8f4ae397402ca3bd3a"} Nov 27 16:58:25 crc kubenswrapper[4954]: I1127 16:58:25.680736 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"37b16922-ac4b-4c0f-bf9c-444474fe1e08","Type":"ContainerStarted","Data":"8d9632c01a56fed6314fe20593b7890b1d092f7c59e288a6f9964fb7ca5853d4"} Nov 27 16:58:25 crc kubenswrapper[4954]: I1127 16:58:25.681018 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Nov 27 16:58:25 crc kubenswrapper[4954]: I1127 16:58:25.683698 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"70949f64-380f-4947-a55a-8780126c7ba4","Type":"ContainerStarted","Data":"88858638c29d0cad0c9c5ad394dc99b01a968705eada72845d7517cd148076b7"} Nov 27 16:58:25 crc kubenswrapper[4954]: I1127 16:58:25.683883 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:58:25 crc kubenswrapper[4954]: I1127 16:58:25.719518 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=23.164055083 podStartE2EDuration="28.719496952s" podCreationTimestamp="2025-11-27 16:57:57 +0000 UTC" firstStartedPulling="2025-11-27 16:58:18.094056359 +0000 UTC m=+1210.111496659" lastFinishedPulling="2025-11-27 16:58:23.649498228 +0000 UTC m=+1215.666938528" observedRunningTime="2025-11-27 16:58:25.715246909 +0000 UTC m=+1217.732687209" watchObservedRunningTime="2025-11-27 16:58:25.719496952 +0000 UTC m=+1217.736937252" Nov 27 16:58:25 crc kubenswrapper[4954]: I1127 16:58:25.750967 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=56.567053286 podStartE2EDuration="1m4.750939576s" podCreationTimestamp="2025-11-27 16:57:21 +0000 UTC" firstStartedPulling="2025-11-27 16:57:40.591283059 +0000 UTC m=+1172.608723349" lastFinishedPulling="2025-11-27 16:57:48.775169339 +0000 UTC m=+1180.792609639" observedRunningTime="2025-11-27 16:58:25.743712831 +0000 UTC m=+1217.761153171" watchObservedRunningTime="2025-11-27 16:58:25.750939576 +0000 UTC m=+1217.768379916" Nov 27 16:58:25 crc kubenswrapper[4954]: I1127 16:58:25.777689 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=57.137553397 podStartE2EDuration="1m4.777660096s" podCreationTimestamp="2025-11-27 16:57:21 +0000 UTC" firstStartedPulling="2025-11-27 16:57:41.133097342 +0000 UTC m=+1173.150537642" lastFinishedPulling="2025-11-27 16:57:48.773204041 +0000 UTC m=+1180.790644341" observedRunningTime="2025-11-27 16:58:25.775879383 +0000 UTC m=+1217.793319683" watchObservedRunningTime="2025-11-27 16:58:25.777660096 +0000 UTC m=+1217.795100396" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.085854 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-2dqhs"] Nov 27 16:58:26 crc kubenswrapper[4954]: E1127 16:58:26.086643 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf7e3231-2480-4075-80dc-0f44cc159964" containerName="mariadb-account-create-update" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.086663 4954 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="bf7e3231-2480-4075-80dc-0f44cc159964" containerName="mariadb-account-create-update" Nov 27 16:58:26 crc kubenswrapper[4954]: E1127 16:58:26.086683 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4f1d5b5-69ba-453c-90cc-85210e24e5d3" containerName="mariadb-account-create-update" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.086692 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4f1d5b5-69ba-453c-90cc-85210e24e5d3" containerName="mariadb-account-create-update" Nov 27 16:58:26 crc kubenswrapper[4954]: E1127 16:58:26.086705 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="794c6bdd-2ec7-458f-99ed-23383a740479" containerName="swift-ring-rebalance" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.086715 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="794c6bdd-2ec7-458f-99ed-23383a740479" containerName="swift-ring-rebalance" Nov 27 16:58:26 crc kubenswrapper[4954]: E1127 16:58:26.086731 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="644e4b3e-4237-4179-8775-63cde7f94338" containerName="dnsmasq-dns" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.086741 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="644e4b3e-4237-4179-8775-63cde7f94338" containerName="dnsmasq-dns" Nov 27 16:58:26 crc kubenswrapper[4954]: E1127 16:58:26.086752 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b5522ab-dc06-4c46-8a1b-fa7d94b058e1" containerName="mariadb-database-create" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.086763 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b5522ab-dc06-4c46-8a1b-fa7d94b058e1" containerName="mariadb-database-create" Nov 27 16:58:26 crc kubenswrapper[4954]: E1127 16:58:26.086783 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="644e4b3e-4237-4179-8775-63cde7f94338" containerName="init" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.086791 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="644e4b3e-4237-4179-8775-63cde7f94338" containerName="init" Nov 27 16:58:26 crc kubenswrapper[4954]: E1127 16:58:26.086804 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="060ee5fd-88d7-4172-8196-ffeeb09be3b6" containerName="mariadb-database-create" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.086813 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="060ee5fd-88d7-4172-8196-ffeeb09be3b6" containerName="mariadb-database-create" Nov 27 16:58:26 crc kubenswrapper[4954]: E1127 16:58:26.086822 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf895a7e-aada-4a88-814e-1a6b38ff6616" containerName="mariadb-account-create-update" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.086831 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf895a7e-aada-4a88-814e-1a6b38ff6616" containerName="mariadb-account-create-update" Nov 27 16:58:26 crc kubenswrapper[4954]: E1127 16:58:26.086856 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdd20293-bf3f-44be-b18d-d6053638d393" containerName="mariadb-database-create" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.086864 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdd20293-bf3f-44be-b18d-d6053638d393" containerName="mariadb-database-create" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.087067 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b5522ab-dc06-4c46-8a1b-fa7d94b058e1" 
containerName="mariadb-database-create" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.087081 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf7e3231-2480-4075-80dc-0f44cc159964" containerName="mariadb-account-create-update" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.087098 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdd20293-bf3f-44be-b18d-d6053638d393" containerName="mariadb-database-create" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.087114 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="060ee5fd-88d7-4172-8196-ffeeb09be3b6" containerName="mariadb-database-create" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.087137 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf895a7e-aada-4a88-814e-1a6b38ff6616" containerName="mariadb-account-create-update" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.087157 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="644e4b3e-4237-4179-8775-63cde7f94338" containerName="dnsmasq-dns" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.087166 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="794c6bdd-2ec7-458f-99ed-23383a740479" containerName="swift-ring-rebalance" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.087183 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4f1d5b5-69ba-453c-90cc-85210e24e5d3" containerName="mariadb-account-create-update" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.088556 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-2dqhs" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.095025 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.105639 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-2dqhs"] Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.210314 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m55wv\" (UniqueName: \"kubernetes.io/projected/54d94327-e1e4-4a52-89c2-d698ded5706f-kube-api-access-m55wv\") pod \"dnsmasq-dns-6d5b6d6b67-2dqhs\" (UID: \"54d94327-e1e4-4a52-89c2-d698ded5706f\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-2dqhs" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.210368 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/54d94327-e1e4-4a52-89c2-d698ded5706f-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-2dqhs\" (UID: \"54d94327-e1e4-4a52-89c2-d698ded5706f\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-2dqhs" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.210397 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/54d94327-e1e4-4a52-89c2-d698ded5706f-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-2dqhs\" (UID: \"54d94327-e1e4-4a52-89c2-d698ded5706f\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-2dqhs" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.210516 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/54d94327-e1e4-4a52-89c2-d698ded5706f-ovsdbserver-sb\") pod 
\"dnsmasq-dns-6d5b6d6b67-2dqhs\" (UID: \"54d94327-e1e4-4a52-89c2-d698ded5706f\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-2dqhs" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.210557 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/54d94327-e1e4-4a52-89c2-d698ded5706f-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-2dqhs\" (UID: \"54d94327-e1e4-4a52-89c2-d698ded5706f\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-2dqhs" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.210596 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54d94327-e1e4-4a52-89c2-d698ded5706f-config\") pod \"dnsmasq-dns-6d5b6d6b67-2dqhs\" (UID: \"54d94327-e1e4-4a52-89c2-d698ded5706f\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-2dqhs" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.312265 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/54d94327-e1e4-4a52-89c2-d698ded5706f-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-2dqhs\" (UID: \"54d94327-e1e4-4a52-89c2-d698ded5706f\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-2dqhs" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.312343 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/54d94327-e1e4-4a52-89c2-d698ded5706f-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-2dqhs\" (UID: \"54d94327-e1e4-4a52-89c2-d698ded5706f\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-2dqhs" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.312368 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54d94327-e1e4-4a52-89c2-d698ded5706f-config\") pod \"dnsmasq-dns-6d5b6d6b67-2dqhs\" (UID: \"54d94327-e1e4-4a52-89c2-d698ded5706f\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-2dqhs" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.312406 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m55wv\" (UniqueName: \"kubernetes.io/projected/54d94327-e1e4-4a52-89c2-d698ded5706f-kube-api-access-m55wv\") pod \"dnsmasq-dns-6d5b6d6b67-2dqhs\" (UID: \"54d94327-e1e4-4a52-89c2-d698ded5706f\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-2dqhs" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.312441 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/54d94327-e1e4-4a52-89c2-d698ded5706f-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-2dqhs\" (UID: \"54d94327-e1e4-4a52-89c2-d698ded5706f\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-2dqhs" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.312458 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/54d94327-e1e4-4a52-89c2-d698ded5706f-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-2dqhs\" (UID: \"54d94327-e1e4-4a52-89c2-d698ded5706f\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-2dqhs" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.313320 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/54d94327-e1e4-4a52-89c2-d698ded5706f-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-2dqhs\" (UID: 
\"54d94327-e1e4-4a52-89c2-d698ded5706f\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-2dqhs" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.313866 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/54d94327-e1e4-4a52-89c2-d698ded5706f-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-2dqhs\" (UID: \"54d94327-e1e4-4a52-89c2-d698ded5706f\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-2dqhs" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.314437 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54d94327-e1e4-4a52-89c2-d698ded5706f-config\") pod \"dnsmasq-dns-6d5b6d6b67-2dqhs\" (UID: \"54d94327-e1e4-4a52-89c2-d698ded5706f\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-2dqhs" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.314835 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/54d94327-e1e4-4a52-89c2-d698ded5706f-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-2dqhs\" (UID: \"54d94327-e1e4-4a52-89c2-d698ded5706f\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-2dqhs" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.315420 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/54d94327-e1e4-4a52-89c2-d698ded5706f-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-2dqhs\" (UID: \"54d94327-e1e4-4a52-89c2-d698ded5706f\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-2dqhs" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.343529 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m55wv\" (UniqueName: \"kubernetes.io/projected/54d94327-e1e4-4a52-89c2-d698ded5706f-kube-api-access-m55wv\") pod \"dnsmasq-dns-6d5b6d6b67-2dqhs\" (UID: \"54d94327-e1e4-4a52-89c2-d698ded5706f\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-2dqhs" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.407250 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-2dqhs" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.549960 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-wlq9c"] Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.551886 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-wlq9c" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.558170 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-fxdj5" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.558391 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.598234 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-wlq9c"] Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.720281 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hbpd\" (UniqueName: \"kubernetes.io/projected/604569c5-bcdb-49ba-8fad-546903367900-kube-api-access-4hbpd\") pod \"glance-db-sync-wlq9c\" (UID: \"604569c5-bcdb-49ba-8fad-546903367900\") " pod="openstack/glance-db-sync-wlq9c" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.720396 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/604569c5-bcdb-49ba-8fad-546903367900-config-data\") pod \"glance-db-sync-wlq9c\" (UID: \"604569c5-bcdb-49ba-8fad-546903367900\") " pod="openstack/glance-db-sync-wlq9c" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.720421 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/604569c5-bcdb-49ba-8fad-546903367900-db-sync-config-data\") pod \"glance-db-sync-wlq9c\" (UID: \"604569c5-bcdb-49ba-8fad-546903367900\") " pod="openstack/glance-db-sync-wlq9c" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.720459 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/604569c5-bcdb-49ba-8fad-546903367900-combined-ca-bundle\") pod \"glance-db-sync-wlq9c\" (UID: \"604569c5-bcdb-49ba-8fad-546903367900\") " pod="openstack/glance-db-sync-wlq9c" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.741967 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-s7sc8" podUID="2a98905f-a2dd-4eb2-9a4f-437eb3626871" containerName="ovn-controller" probeResult="failure" output=< Nov 27 16:58:26 crc kubenswrapper[4954]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Nov 27 16:58:26 crc kubenswrapper[4954]: > Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.822334 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/604569c5-bcdb-49ba-8fad-546903367900-combined-ca-bundle\") pod \"glance-db-sync-wlq9c\" (UID: \"604569c5-bcdb-49ba-8fad-546903367900\") " pod="openstack/glance-db-sync-wlq9c" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.822657 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hbpd\" (UniqueName: \"kubernetes.io/projected/604569c5-bcdb-49ba-8fad-546903367900-kube-api-access-4hbpd\") pod \"glance-db-sync-wlq9c\" (UID: \"604569c5-bcdb-49ba-8fad-546903367900\") " pod="openstack/glance-db-sync-wlq9c" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.822872 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/604569c5-bcdb-49ba-8fad-546903367900-config-data\") pod \"glance-db-sync-wlq9c\" (UID: \"604569c5-bcdb-49ba-8fad-546903367900\") " pod="openstack/glance-db-sync-wlq9c" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.822914 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/604569c5-bcdb-49ba-8fad-546903367900-db-sync-config-data\") pod \"glance-db-sync-wlq9c\" (UID: \"604569c5-bcdb-49ba-8fad-546903367900\") " pod="openstack/glance-db-sync-wlq9c" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.831328 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/604569c5-bcdb-49ba-8fad-546903367900-combined-ca-bundle\") pod \"glance-db-sync-wlq9c\" (UID: \"604569c5-bcdb-49ba-8fad-546903367900\") " pod="openstack/glance-db-sync-wlq9c" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.832972 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/604569c5-bcdb-49ba-8fad-546903367900-config-data\") pod \"glance-db-sync-wlq9c\" (UID: \"604569c5-bcdb-49ba-8fad-546903367900\") " pod="openstack/glance-db-sync-wlq9c" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.846783 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hbpd\" (UniqueName: \"kubernetes.io/projected/604569c5-bcdb-49ba-8fad-546903367900-kube-api-access-4hbpd\") pod \"glance-db-sync-wlq9c\" (UID: \"604569c5-bcdb-49ba-8fad-546903367900\") " pod="openstack/glance-db-sync-wlq9c" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.848252 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-btgpk" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.849353 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-btgpk" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.884095 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/604569c5-bcdb-49ba-8fad-546903367900-db-sync-config-data\") pod \"glance-db-sync-wlq9c\" (UID: \"604569c5-bcdb-49ba-8fad-546903367900\") " pod="openstack/glance-db-sync-wlq9c" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.888077 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-wlq9c" Nov 27 16:58:26 crc kubenswrapper[4954]: I1127 16:58:26.946303 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-2dqhs"] Nov 27 16:58:27 crc kubenswrapper[4954]: I1127 16:58:27.154859 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-s7sc8-config-nw5kn"] Nov 27 16:58:27 crc kubenswrapper[4954]: I1127 16:58:27.156772 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-s7sc8-config-nw5kn" Nov 27 16:58:27 crc kubenswrapper[4954]: I1127 16:58:27.164998 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Nov 27 16:58:27 crc kubenswrapper[4954]: I1127 16:58:27.195376 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-s7sc8-config-nw5kn"] Nov 27 16:58:27 crc kubenswrapper[4954]: I1127 16:58:27.232391 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c44773cc-2bd5-401e-8af0-3a22d8bc267f-var-run\") pod \"ovn-controller-s7sc8-config-nw5kn\" (UID: \"c44773cc-2bd5-401e-8af0-3a22d8bc267f\") " pod="openstack/ovn-controller-s7sc8-config-nw5kn" Nov 27 16:58:27 crc kubenswrapper[4954]: I1127 16:58:27.232606 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c44773cc-2bd5-401e-8af0-3a22d8bc267f-additional-scripts\") pod \"ovn-controller-s7sc8-config-nw5kn\" (UID: \"c44773cc-2bd5-401e-8af0-3a22d8bc267f\") " pod="openstack/ovn-controller-s7sc8-config-nw5kn" Nov 27 16:58:27 crc kubenswrapper[4954]: I1127 16:58:27.232745 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76j7n\" (UniqueName: \"kubernetes.io/projected/c44773cc-2bd5-401e-8af0-3a22d8bc267f-kube-api-access-76j7n\") pod \"ovn-controller-s7sc8-config-nw5kn\" (UID: \"c44773cc-2bd5-401e-8af0-3a22d8bc267f\") " pod="openstack/ovn-controller-s7sc8-config-nw5kn" Nov 27 16:58:27 crc kubenswrapper[4954]: I1127 16:58:27.232839 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c44773cc-2bd5-401e-8af0-3a22d8bc267f-var-log-ovn\") pod \"ovn-controller-s7sc8-config-nw5kn\" (UID: \"c44773cc-2bd5-401e-8af0-3a22d8bc267f\") " pod="openstack/ovn-controller-s7sc8-config-nw5kn" Nov 27 16:58:27 crc kubenswrapper[4954]: I1127 16:58:27.232974 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c44773cc-2bd5-401e-8af0-3a22d8bc267f-var-run-ovn\") pod \"ovn-controller-s7sc8-config-nw5kn\" (UID: \"c44773cc-2bd5-401e-8af0-3a22d8bc267f\") " pod="openstack/ovn-controller-s7sc8-config-nw5kn" Nov 27 16:58:27 crc kubenswrapper[4954]: I1127 16:58:27.233024 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c44773cc-2bd5-401e-8af0-3a22d8bc267f-scripts\") pod \"ovn-controller-s7sc8-config-nw5kn\" (UID: \"c44773cc-2bd5-401e-8af0-3a22d8bc267f\") " pod="openstack/ovn-controller-s7sc8-config-nw5kn" Nov 27 16:58:27 crc kubenswrapper[4954]: I1127 16:58:27.334696 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c44773cc-2bd5-401e-8af0-3a22d8bc267f-scripts\") pod \"ovn-controller-s7sc8-config-nw5kn\" (UID: \"c44773cc-2bd5-401e-8af0-3a22d8bc267f\") " pod="openstack/ovn-controller-s7sc8-config-nw5kn" Nov 27 16:58:27 crc kubenswrapper[4954]: I1127 16:58:27.334769 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c44773cc-2bd5-401e-8af0-3a22d8bc267f-var-run\") pod 
\"ovn-controller-s7sc8-config-nw5kn\" (UID: \"c44773cc-2bd5-401e-8af0-3a22d8bc267f\") " pod="openstack/ovn-controller-s7sc8-config-nw5kn" Nov 27 16:58:27 crc kubenswrapper[4954]: I1127 16:58:27.334847 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c44773cc-2bd5-401e-8af0-3a22d8bc267f-additional-scripts\") pod \"ovn-controller-s7sc8-config-nw5kn\" (UID: \"c44773cc-2bd5-401e-8af0-3a22d8bc267f\") " pod="openstack/ovn-controller-s7sc8-config-nw5kn" Nov 27 16:58:27 crc kubenswrapper[4954]: I1127 16:58:27.334890 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76j7n\" (UniqueName: \"kubernetes.io/projected/c44773cc-2bd5-401e-8af0-3a22d8bc267f-kube-api-access-76j7n\") pod \"ovn-controller-s7sc8-config-nw5kn\" (UID: \"c44773cc-2bd5-401e-8af0-3a22d8bc267f\") " pod="openstack/ovn-controller-s7sc8-config-nw5kn" Nov 27 16:58:27 crc kubenswrapper[4954]: I1127 16:58:27.334926 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c44773cc-2bd5-401e-8af0-3a22d8bc267f-var-log-ovn\") pod \"ovn-controller-s7sc8-config-nw5kn\" (UID: \"c44773cc-2bd5-401e-8af0-3a22d8bc267f\") " pod="openstack/ovn-controller-s7sc8-config-nw5kn" Nov 27 16:58:27 crc kubenswrapper[4954]: I1127 16:58:27.334964 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c44773cc-2bd5-401e-8af0-3a22d8bc267f-var-run-ovn\") pod \"ovn-controller-s7sc8-config-nw5kn\" (UID: \"c44773cc-2bd5-401e-8af0-3a22d8bc267f\") " pod="openstack/ovn-controller-s7sc8-config-nw5kn" Nov 27 16:58:27 crc kubenswrapper[4954]: I1127 16:58:27.335480 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c44773cc-2bd5-401e-8af0-3a22d8bc267f-var-run-ovn\") pod \"ovn-controller-s7sc8-config-nw5kn\" (UID: \"c44773cc-2bd5-401e-8af0-3a22d8bc267f\") " pod="openstack/ovn-controller-s7sc8-config-nw5kn" Nov 27 16:58:27 crc kubenswrapper[4954]: I1127 16:58:27.335550 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c44773cc-2bd5-401e-8af0-3a22d8bc267f-var-run\") pod \"ovn-controller-s7sc8-config-nw5kn\" (UID: \"c44773cc-2bd5-401e-8af0-3a22d8bc267f\") " pod="openstack/ovn-controller-s7sc8-config-nw5kn" Nov 27 16:58:27 crc kubenswrapper[4954]: I1127 16:58:27.336373 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c44773cc-2bd5-401e-8af0-3a22d8bc267f-additional-scripts\") pod \"ovn-controller-s7sc8-config-nw5kn\" (UID: \"c44773cc-2bd5-401e-8af0-3a22d8bc267f\") " pod="openstack/ovn-controller-s7sc8-config-nw5kn" Nov 27 16:58:27 crc kubenswrapper[4954]: I1127 16:58:27.336796 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c44773cc-2bd5-401e-8af0-3a22d8bc267f-var-log-ovn\") pod \"ovn-controller-s7sc8-config-nw5kn\" (UID: \"c44773cc-2bd5-401e-8af0-3a22d8bc267f\") " pod="openstack/ovn-controller-s7sc8-config-nw5kn" Nov 27 16:58:27 crc kubenswrapper[4954]: I1127 16:58:27.337419 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c44773cc-2bd5-401e-8af0-3a22d8bc267f-scripts\") pod 
\"ovn-controller-s7sc8-config-nw5kn\" (UID: \"c44773cc-2bd5-401e-8af0-3a22d8bc267f\") " pod="openstack/ovn-controller-s7sc8-config-nw5kn" Nov 27 16:58:27 crc kubenswrapper[4954]: I1127 16:58:27.394734 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76j7n\" (UniqueName: \"kubernetes.io/projected/c44773cc-2bd5-401e-8af0-3a22d8bc267f-kube-api-access-76j7n\") pod \"ovn-controller-s7sc8-config-nw5kn\" (UID: \"c44773cc-2bd5-401e-8af0-3a22d8bc267f\") " pod="openstack/ovn-controller-s7sc8-config-nw5kn" Nov 27 16:58:27 crc kubenswrapper[4954]: I1127 16:58:27.404147 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-wlq9c"] Nov 27 16:58:27 crc kubenswrapper[4954]: W1127 16:58:27.416811 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod604569c5_bcdb_49ba_8fad_546903367900.slice/crio-b3d5fe4dca0e1e32f0d28f70667af280b5536dad4db01a657168bb1df53c2462 WatchSource:0}: Error finding container b3d5fe4dca0e1e32f0d28f70667af280b5536dad4db01a657168bb1df53c2462: Status 404 returned error can't find the container with id b3d5fe4dca0e1e32f0d28f70667af280b5536dad4db01a657168bb1df53c2462 Nov 27 16:58:27 crc kubenswrapper[4954]: I1127 16:58:27.480275 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-s7sc8-config-nw5kn" Nov 27 16:58:27 crc kubenswrapper[4954]: I1127 16:58:27.709744 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wlq9c" event={"ID":"604569c5-bcdb-49ba-8fad-546903367900","Type":"ContainerStarted","Data":"b3d5fe4dca0e1e32f0d28f70667af280b5536dad4db01a657168bb1df53c2462"} Nov 27 16:58:27 crc kubenswrapper[4954]: I1127 16:58:27.712850 4954 generic.go:334] "Generic (PLEG): container finished" podID="54d94327-e1e4-4a52-89c2-d698ded5706f" containerID="284ffcb58f2793927b7cfbeadc47ceddc51b61f20b2d224e333c92d3f400ae96" exitCode=0 Nov 27 16:58:27 crc kubenswrapper[4954]: I1127 16:58:27.712907 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-2dqhs" event={"ID":"54d94327-e1e4-4a52-89c2-d698ded5706f","Type":"ContainerDied","Data":"284ffcb58f2793927b7cfbeadc47ceddc51b61f20b2d224e333c92d3f400ae96"} Nov 27 16:58:27 crc kubenswrapper[4954]: I1127 16:58:27.712979 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-2dqhs" event={"ID":"54d94327-e1e4-4a52-89c2-d698ded5706f","Type":"ContainerStarted","Data":"ba9ac8fe3f92d04f4385bd972bda359d4ed7bcd94a9fa254f216de7ec31ae5b6"} Nov 27 16:58:27 crc kubenswrapper[4954]: I1127 16:58:27.968001 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-s7sc8-config-nw5kn"] Nov 27 16:58:27 crc kubenswrapper[4954]: W1127 16:58:27.972909 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc44773cc_2bd5_401e_8af0_3a22d8bc267f.slice/crio-a8a62aaed379f0801ce596537dbfa44e54ebae79a958bb4457768433fbfffa09 WatchSource:0}: Error finding container a8a62aaed379f0801ce596537dbfa44e54ebae79a958bb4457768433fbfffa09: Status 404 returned error can't find the container with id a8a62aaed379f0801ce596537dbfa44e54ebae79a958bb4457768433fbfffa09 Nov 27 16:58:28 crc kubenswrapper[4954]: I1127 16:58:28.726175 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s7sc8-config-nw5kn" 
event={"ID":"c44773cc-2bd5-401e-8af0-3a22d8bc267f","Type":"ContainerStarted","Data":"0a51db165465237cd70da4ca6ba3d8a74d92122e18e8c571d1003572e6232564"} Nov 27 16:58:28 crc kubenswrapper[4954]: I1127 16:58:28.726590 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s7sc8-config-nw5kn" event={"ID":"c44773cc-2bd5-401e-8af0-3a22d8bc267f","Type":"ContainerStarted","Data":"a8a62aaed379f0801ce596537dbfa44e54ebae79a958bb4457768433fbfffa09"} Nov 27 16:58:28 crc kubenswrapper[4954]: I1127 16:58:28.731695 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-2dqhs" event={"ID":"54d94327-e1e4-4a52-89c2-d698ded5706f","Type":"ContainerStarted","Data":"308a103851221e724cf470d87e60057e2aca64bf35d734eef3d3f8c91b3c939b"} Nov 27 16:58:28 crc kubenswrapper[4954]: I1127 16:58:28.732452 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d5b6d6b67-2dqhs" Nov 27 16:58:28 crc kubenswrapper[4954]: I1127 16:58:28.789470 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d5b6d6b67-2dqhs" podStartSLOduration=2.789448885 podStartE2EDuration="2.789448885s" podCreationTimestamp="2025-11-27 16:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:58:28.784152626 +0000 UTC m=+1220.801592926" watchObservedRunningTime="2025-11-27 16:58:28.789448885 +0000 UTC m=+1220.806889185" Nov 27 16:58:28 crc kubenswrapper[4954]: I1127 16:58:28.795198 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-s7sc8-config-nw5kn" podStartSLOduration=1.795183194 podStartE2EDuration="1.795183194s" podCreationTimestamp="2025-11-27 16:58:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:58:28.749999496 +0000 UTC m=+1220.767439796" watchObservedRunningTime="2025-11-27 16:58:28.795183194 +0000 UTC m=+1220.812623494" Nov 27 16:58:29 crc kubenswrapper[4954]: I1127 16:58:29.749790 4954 generic.go:334] "Generic (PLEG): container finished" podID="c44773cc-2bd5-401e-8af0-3a22d8bc267f" containerID="0a51db165465237cd70da4ca6ba3d8a74d92122e18e8c571d1003572e6232564" exitCode=0 Nov 27 16:58:29 crc kubenswrapper[4954]: I1127 16:58:29.753435 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s7sc8-config-nw5kn" event={"ID":"c44773cc-2bd5-401e-8af0-3a22d8bc267f","Type":"ContainerDied","Data":"0a51db165465237cd70da4ca6ba3d8a74d92122e18e8c571d1003572e6232564"} Nov 27 16:58:31 crc kubenswrapper[4954]: I1127 16:58:31.120433 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-s7sc8-config-nw5kn" Nov 27 16:58:31 crc kubenswrapper[4954]: I1127 16:58:31.201188 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c44773cc-2bd5-401e-8af0-3a22d8bc267f-additional-scripts\") pod \"c44773cc-2bd5-401e-8af0-3a22d8bc267f\" (UID: \"c44773cc-2bd5-401e-8af0-3a22d8bc267f\") " Nov 27 16:58:31 crc kubenswrapper[4954]: I1127 16:58:31.201383 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c44773cc-2bd5-401e-8af0-3a22d8bc267f-scripts\") pod \"c44773cc-2bd5-401e-8af0-3a22d8bc267f\" (UID: \"c44773cc-2bd5-401e-8af0-3a22d8bc267f\") " Nov 27 16:58:31 crc kubenswrapper[4954]: I1127 16:58:31.201454 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c44773cc-2bd5-401e-8af0-3a22d8bc267f-var-log-ovn\") pod \"c44773cc-2bd5-401e-8af0-3a22d8bc267f\" (UID: \"c44773cc-2bd5-401e-8af0-3a22d8bc267f\") " Nov 27 16:58:31 crc kubenswrapper[4954]: I1127 16:58:31.201492 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c44773cc-2bd5-401e-8af0-3a22d8bc267f-var-run-ovn\") pod \"c44773cc-2bd5-401e-8af0-3a22d8bc267f\" (UID: \"c44773cc-2bd5-401e-8af0-3a22d8bc267f\") " Nov 27 16:58:31 crc kubenswrapper[4954]: I1127 16:58:31.201527 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76j7n\" (UniqueName: \"kubernetes.io/projected/c44773cc-2bd5-401e-8af0-3a22d8bc267f-kube-api-access-76j7n\") pod \"c44773cc-2bd5-401e-8af0-3a22d8bc267f\" (UID: \"c44773cc-2bd5-401e-8af0-3a22d8bc267f\") " Nov 27 16:58:31 crc kubenswrapper[4954]: I1127 16:58:31.201556 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c44773cc-2bd5-401e-8af0-3a22d8bc267f-var-run\") pod \"c44773cc-2bd5-401e-8af0-3a22d8bc267f\" (UID: \"c44773cc-2bd5-401e-8af0-3a22d8bc267f\") " Nov 27 16:58:31 crc kubenswrapper[4954]: I1127 16:58:31.201768 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c44773cc-2bd5-401e-8af0-3a22d8bc267f-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "c44773cc-2bd5-401e-8af0-3a22d8bc267f" (UID: "c44773cc-2bd5-401e-8af0-3a22d8bc267f"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 16:58:31 crc kubenswrapper[4954]: I1127 16:58:31.201812 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c44773cc-2bd5-401e-8af0-3a22d8bc267f-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "c44773cc-2bd5-401e-8af0-3a22d8bc267f" (UID: "c44773cc-2bd5-401e-8af0-3a22d8bc267f"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 16:58:31 crc kubenswrapper[4954]: I1127 16:58:31.201892 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c44773cc-2bd5-401e-8af0-3a22d8bc267f-var-run" (OuterVolumeSpecName: "var-run") pod "c44773cc-2bd5-401e-8af0-3a22d8bc267f" (UID: "c44773cc-2bd5-401e-8af0-3a22d8bc267f"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 16:58:31 crc kubenswrapper[4954]: I1127 16:58:31.202266 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c44773cc-2bd5-401e-8af0-3a22d8bc267f-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "c44773cc-2bd5-401e-8af0-3a22d8bc267f" (UID: "c44773cc-2bd5-401e-8af0-3a22d8bc267f"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:58:31 crc kubenswrapper[4954]: I1127 16:58:31.202466 4954 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c44773cc-2bd5-401e-8af0-3a22d8bc267f-var-log-ovn\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:31 crc kubenswrapper[4954]: I1127 16:58:31.202505 4954 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c44773cc-2bd5-401e-8af0-3a22d8bc267f-var-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:31 crc kubenswrapper[4954]: I1127 16:58:31.202545 4954 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c44773cc-2bd5-401e-8af0-3a22d8bc267f-var-run\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:31 crc kubenswrapper[4954]: I1127 16:58:31.203825 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c44773cc-2bd5-401e-8af0-3a22d8bc267f-scripts" (OuterVolumeSpecName: "scripts") pod "c44773cc-2bd5-401e-8af0-3a22d8bc267f" (UID: "c44773cc-2bd5-401e-8af0-3a22d8bc267f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:58:31 crc kubenswrapper[4954]: I1127 16:58:31.208125 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c44773cc-2bd5-401e-8af0-3a22d8bc267f-kube-api-access-76j7n" (OuterVolumeSpecName: "kube-api-access-76j7n") pod "c44773cc-2bd5-401e-8af0-3a22d8bc267f" (UID: "c44773cc-2bd5-401e-8af0-3a22d8bc267f"). InnerVolumeSpecName "kube-api-access-76j7n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:58:31 crc kubenswrapper[4954]: I1127 16:58:31.305121 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c44773cc-2bd5-401e-8af0-3a22d8bc267f-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:31 crc kubenswrapper[4954]: I1127 16:58:31.305176 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76j7n\" (UniqueName: \"kubernetes.io/projected/c44773cc-2bd5-401e-8af0-3a22d8bc267f-kube-api-access-76j7n\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:31 crc kubenswrapper[4954]: I1127 16:58:31.305190 4954 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c44773cc-2bd5-401e-8af0-3a22d8bc267f-additional-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:31 crc kubenswrapper[4954]: I1127 16:58:31.760657 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-s7sc8" Nov 27 16:58:31 crc kubenswrapper[4954]: I1127 16:58:31.778486 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s7sc8-config-nw5kn" event={"ID":"c44773cc-2bd5-401e-8af0-3a22d8bc267f","Type":"ContainerDied","Data":"a8a62aaed379f0801ce596537dbfa44e54ebae79a958bb4457768433fbfffa09"} Nov 27 16:58:31 crc kubenswrapper[4954]: I1127 16:58:31.778542 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8a62aaed379f0801ce596537dbfa44e54ebae79a958bb4457768433fbfffa09" Nov 27 16:58:31 crc kubenswrapper[4954]: I1127 16:58:31.778629 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-s7sc8-config-nw5kn" Nov 27 16:58:31 crc kubenswrapper[4954]: I1127 16:58:31.857650 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-s7sc8-config-nw5kn"] Nov 27 16:58:31 crc kubenswrapper[4954]: I1127 16:58:31.866144 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-s7sc8-config-nw5kn"] Nov 27 16:58:32 crc kubenswrapper[4954]: I1127 16:58:32.672915 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c44773cc-2bd5-401e-8af0-3a22d8bc267f" path="/var/lib/kubelet/pods/c44773cc-2bd5-401e-8af0-3a22d8bc267f/volumes" Nov 27 16:58:36 crc kubenswrapper[4954]: I1127 16:58:36.408743 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d5b6d6b67-2dqhs" Nov 27 16:58:36 crc kubenswrapper[4954]: I1127 16:58:36.471495 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-g7xsr"] Nov 27 16:58:36 crc kubenswrapper[4954]: I1127 16:58:36.471746 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-g7xsr" podUID="957f20f3-9f5d-4342-a3db-9c5b726bdb5d" containerName="dnsmasq-dns" containerID="cri-o://d069ff1d843550024346892f6f2b2b9b0fe694b5aa2f0c06ae719b8277540afa" gracePeriod=10 Nov 27 16:58:36 crc kubenswrapper[4954]: I1127 16:58:36.828127 4954 generic.go:334] "Generic (PLEG): container finished" podID="957f20f3-9f5d-4342-a3db-9c5b726bdb5d" containerID="d069ff1d843550024346892f6f2b2b9b0fe694b5aa2f0c06ae719b8277540afa" exitCode=0 Nov 27 16:58:36 crc kubenswrapper[4954]: I1127 16:58:36.828168 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-g7xsr" 
event={"ID":"957f20f3-9f5d-4342-a3db-9c5b726bdb5d","Type":"ContainerDied","Data":"d069ff1d843550024346892f6f2b2b9b0fe694b5aa2f0c06ae719b8277540afa"} Nov 27 16:58:38 crc kubenswrapper[4954]: I1127 16:58:38.317670 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-g7xsr" podUID="957f20f3-9f5d-4342-a3db-9c5b726bdb5d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: connect: connection refused" Nov 27 16:58:40 crc kubenswrapper[4954]: I1127 16:58:40.208119 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-g7xsr" Nov 27 16:58:40 crc kubenswrapper[4954]: I1127 16:58:40.272526 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/957f20f3-9f5d-4342-a3db-9c5b726bdb5d-dns-svc\") pod \"957f20f3-9f5d-4342-a3db-9c5b726bdb5d\" (UID: \"957f20f3-9f5d-4342-a3db-9c5b726bdb5d\") " Nov 27 16:58:40 crc kubenswrapper[4954]: I1127 16:58:40.272634 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/957f20f3-9f5d-4342-a3db-9c5b726bdb5d-ovsdbserver-nb\") pod \"957f20f3-9f5d-4342-a3db-9c5b726bdb5d\" (UID: \"957f20f3-9f5d-4342-a3db-9c5b726bdb5d\") " Nov 27 16:58:40 crc kubenswrapper[4954]: I1127 16:58:40.272764 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/957f20f3-9f5d-4342-a3db-9c5b726bdb5d-ovsdbserver-sb\") pod \"957f20f3-9f5d-4342-a3db-9c5b726bdb5d\" (UID: \"957f20f3-9f5d-4342-a3db-9c5b726bdb5d\") " Nov 27 16:58:40 crc kubenswrapper[4954]: I1127 16:58:40.272814 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tl2nt\" (UniqueName: \"kubernetes.io/projected/957f20f3-9f5d-4342-a3db-9c5b726bdb5d-kube-api-access-tl2nt\") pod \"957f20f3-9f5d-4342-a3db-9c5b726bdb5d\" (UID: \"957f20f3-9f5d-4342-a3db-9c5b726bdb5d\") " Nov 27 16:58:40 crc kubenswrapper[4954]: I1127 16:58:40.272997 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/957f20f3-9f5d-4342-a3db-9c5b726bdb5d-config\") pod \"957f20f3-9f5d-4342-a3db-9c5b726bdb5d\" (UID: \"957f20f3-9f5d-4342-a3db-9c5b726bdb5d\") " Nov 27 16:58:40 crc kubenswrapper[4954]: I1127 16:58:40.277967 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/957f20f3-9f5d-4342-a3db-9c5b726bdb5d-kube-api-access-tl2nt" (OuterVolumeSpecName: "kube-api-access-tl2nt") pod "957f20f3-9f5d-4342-a3db-9c5b726bdb5d" (UID: "957f20f3-9f5d-4342-a3db-9c5b726bdb5d"). InnerVolumeSpecName "kube-api-access-tl2nt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:58:40 crc kubenswrapper[4954]: I1127 16:58:40.315546 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/957f20f3-9f5d-4342-a3db-9c5b726bdb5d-config" (OuterVolumeSpecName: "config") pod "957f20f3-9f5d-4342-a3db-9c5b726bdb5d" (UID: "957f20f3-9f5d-4342-a3db-9c5b726bdb5d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:58:40 crc kubenswrapper[4954]: I1127 16:58:40.325359 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/957f20f3-9f5d-4342-a3db-9c5b726bdb5d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "957f20f3-9f5d-4342-a3db-9c5b726bdb5d" (UID: "957f20f3-9f5d-4342-a3db-9c5b726bdb5d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:58:40 crc kubenswrapper[4954]: I1127 16:58:40.331775 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/957f20f3-9f5d-4342-a3db-9c5b726bdb5d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "957f20f3-9f5d-4342-a3db-9c5b726bdb5d" (UID: "957f20f3-9f5d-4342-a3db-9c5b726bdb5d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:58:40 crc kubenswrapper[4954]: I1127 16:58:40.332961 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/957f20f3-9f5d-4342-a3db-9c5b726bdb5d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "957f20f3-9f5d-4342-a3db-9c5b726bdb5d" (UID: "957f20f3-9f5d-4342-a3db-9c5b726bdb5d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:58:40 crc kubenswrapper[4954]: I1127 16:58:40.380387 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/957f20f3-9f5d-4342-a3db-9c5b726bdb5d-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:40 crc kubenswrapper[4954]: I1127 16:58:40.380442 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/957f20f3-9f5d-4342-a3db-9c5b726bdb5d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:40 crc kubenswrapper[4954]: I1127 16:58:40.380466 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/957f20f3-9f5d-4342-a3db-9c5b726bdb5d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:40 crc kubenswrapper[4954]: I1127 16:58:40.380501 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tl2nt\" (UniqueName: \"kubernetes.io/projected/957f20f3-9f5d-4342-a3db-9c5b726bdb5d-kube-api-access-tl2nt\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:40 crc kubenswrapper[4954]: I1127 16:58:40.380520 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/957f20f3-9f5d-4342-a3db-9c5b726bdb5d-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:40 crc kubenswrapper[4954]: I1127 16:58:40.880011 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wlq9c" event={"ID":"604569c5-bcdb-49ba-8fad-546903367900","Type":"ContainerStarted","Data":"993b586601c3b86b6ab6d17c37f96718cf6952b28e6d127c07e53c460d21cf9e"} Nov 27 16:58:40 crc kubenswrapper[4954]: I1127 16:58:40.882704 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-g7xsr" event={"ID":"957f20f3-9f5d-4342-a3db-9c5b726bdb5d","Type":"ContainerDied","Data":"52fe4fa05ef4e55560b49f1b85fda37697c46f33c8e69950292481e2c87b82a2"} Nov 27 16:58:40 crc kubenswrapper[4954]: I1127 16:58:40.882773 4954 scope.go:117] "RemoveContainer" containerID="d069ff1d843550024346892f6f2b2b9b0fe694b5aa2f0c06ae719b8277540afa" Nov 27 16:58:40 crc kubenswrapper[4954]: I1127 16:58:40.882867 4954 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-g7xsr" Nov 27 16:58:40 crc kubenswrapper[4954]: I1127 16:58:40.908833 4954 scope.go:117] "RemoveContainer" containerID="4ced774e6da3ae575b60ee12144c696a4177873894cdbd011bf6f8dc37592ae9" Nov 27 16:58:40 crc kubenswrapper[4954]: I1127 16:58:40.913460 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-wlq9c" podStartSLOduration=2.421444258 podStartE2EDuration="14.913430641s" podCreationTimestamp="2025-11-27 16:58:26 +0000 UTC" firstStartedPulling="2025-11-27 16:58:27.42029911 +0000 UTC m=+1219.437739410" lastFinishedPulling="2025-11-27 16:58:39.912285493 +0000 UTC m=+1231.929725793" observedRunningTime="2025-11-27 16:58:40.906112614 +0000 UTC m=+1232.923552934" watchObservedRunningTime="2025-11-27 16:58:40.913430641 +0000 UTC m=+1232.930870951" Nov 27 16:58:40 crc kubenswrapper[4954]: I1127 16:58:40.939212 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-g7xsr"] Nov 27 16:58:40 crc kubenswrapper[4954]: I1127 16:58:40.949934 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-g7xsr"] Nov 27 16:58:42 crc kubenswrapper[4954]: I1127 16:58:42.577848 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Nov 27 16:58:42 crc kubenswrapper[4954]: I1127 16:58:42.681553 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="957f20f3-9f5d-4342-a3db-9c5b726bdb5d" path="/var/lib/kubelet/pods/957f20f3-9f5d-4342-a3db-9c5b726bdb5d/volumes" Nov 27 16:58:42 crc kubenswrapper[4954]: I1127 16:58:42.906805 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:58:42 crc kubenswrapper[4954]: I1127 16:58:42.961878 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-mhd4j"] Nov 27 16:58:42 crc kubenswrapper[4954]: E1127 16:58:42.962562 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c44773cc-2bd5-401e-8af0-3a22d8bc267f" containerName="ovn-config" Nov 27 16:58:42 crc kubenswrapper[4954]: I1127 16:58:42.962787 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c44773cc-2bd5-401e-8af0-3a22d8bc267f" containerName="ovn-config" Nov 27 16:58:42 crc kubenswrapper[4954]: E1127 16:58:42.962859 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="957f20f3-9f5d-4342-a3db-9c5b726bdb5d" containerName="init" Nov 27 16:58:42 crc kubenswrapper[4954]: I1127 16:58:42.962874 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="957f20f3-9f5d-4342-a3db-9c5b726bdb5d" containerName="init" Nov 27 16:58:42 crc kubenswrapper[4954]: E1127 16:58:42.962891 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="957f20f3-9f5d-4342-a3db-9c5b726bdb5d" containerName="dnsmasq-dns" Nov 27 16:58:42 crc kubenswrapper[4954]: I1127 16:58:42.962902 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="957f20f3-9f5d-4342-a3db-9c5b726bdb5d" containerName="dnsmasq-dns" Nov 27 16:58:42 crc kubenswrapper[4954]: I1127 16:58:42.963193 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="c44773cc-2bd5-401e-8af0-3a22d8bc267f" containerName="ovn-config" Nov 27 16:58:42 crc kubenswrapper[4954]: I1127 16:58:42.963251 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="957f20f3-9f5d-4342-a3db-9c5b726bdb5d" containerName="dnsmasq-dns" Nov 27 
16:58:42 crc kubenswrapper[4954]: I1127 16:58:42.964575 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-mhd4j" Nov 27 16:58:42 crc kubenswrapper[4954]: I1127 16:58:42.986520 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-mhd4j"] Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.035351 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrmzb\" (UniqueName: \"kubernetes.io/projected/fe6f0251-00d1-460c-82fb-d86f5142c5f1-kube-api-access-xrmzb\") pod \"cinder-db-create-mhd4j\" (UID: \"fe6f0251-00d1-460c-82fb-d86f5142c5f1\") " pod="openstack/cinder-db-create-mhd4j" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.035533 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe6f0251-00d1-460c-82fb-d86f5142c5f1-operator-scripts\") pod \"cinder-db-create-mhd4j\" (UID: \"fe6f0251-00d1-460c-82fb-d86f5142c5f1\") " pod="openstack/cinder-db-create-mhd4j" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.058739 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-4twzj"] Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.060101 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-4twzj" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.078120 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-4twzj"] Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.137139 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe6f0251-00d1-460c-82fb-d86f5142c5f1-operator-scripts\") pod \"cinder-db-create-mhd4j\" (UID: \"fe6f0251-00d1-460c-82fb-d86f5142c5f1\") " pod="openstack/cinder-db-create-mhd4j" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.137224 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pmbb\" (UniqueName: \"kubernetes.io/projected/56abe05c-60fe-4797-9b81-0ba5fa342149-kube-api-access-4pmbb\") pod \"barbican-db-create-4twzj\" (UID: \"56abe05c-60fe-4797-9b81-0ba5fa342149\") " pod="openstack/barbican-db-create-4twzj" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.137278 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrmzb\" (UniqueName: \"kubernetes.io/projected/fe6f0251-00d1-460c-82fb-d86f5142c5f1-kube-api-access-xrmzb\") pod \"cinder-db-create-mhd4j\" (UID: \"fe6f0251-00d1-460c-82fb-d86f5142c5f1\") " pod="openstack/cinder-db-create-mhd4j" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.137297 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56abe05c-60fe-4797-9b81-0ba5fa342149-operator-scripts\") pod \"barbican-db-create-4twzj\" (UID: \"56abe05c-60fe-4797-9b81-0ba5fa342149\") " pod="openstack/barbican-db-create-4twzj" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.138023 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe6f0251-00d1-460c-82fb-d86f5142c5f1-operator-scripts\") pod \"cinder-db-create-mhd4j\" (UID: 
\"fe6f0251-00d1-460c-82fb-d86f5142c5f1\") " pod="openstack/cinder-db-create-mhd4j" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.178618 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-924f-account-create-update-qs4ck"] Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.182232 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-924f-account-create-update-qs4ck" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.184501 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrmzb\" (UniqueName: \"kubernetes.io/projected/fe6f0251-00d1-460c-82fb-d86f5142c5f1-kube-api-access-xrmzb\") pod \"cinder-db-create-mhd4j\" (UID: \"fe6f0251-00d1-460c-82fb-d86f5142c5f1\") " pod="openstack/cinder-db-create-mhd4j" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.186979 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.193016 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-924f-account-create-update-qs4ck"] Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.245223 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pmbb\" (UniqueName: \"kubernetes.io/projected/56abe05c-60fe-4797-9b81-0ba5fa342149-kube-api-access-4pmbb\") pod \"barbican-db-create-4twzj\" (UID: \"56abe05c-60fe-4797-9b81-0ba5fa342149\") " pod="openstack/barbican-db-create-4twzj" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.245298 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/958ac579-b5c6-47ae-9b39-13abfc4da1db-operator-scripts\") pod \"barbican-924f-account-create-update-qs4ck\" (UID: \"958ac579-b5c6-47ae-9b39-13abfc4da1db\") " pod="openstack/barbican-924f-account-create-update-qs4ck" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.245409 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56abe05c-60fe-4797-9b81-0ba5fa342149-operator-scripts\") pod \"barbican-db-create-4twzj\" (UID: \"56abe05c-60fe-4797-9b81-0ba5fa342149\") " pod="openstack/barbican-db-create-4twzj" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.249949 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx4x4\" (UniqueName: \"kubernetes.io/projected/958ac579-b5c6-47ae-9b39-13abfc4da1db-kube-api-access-lx4x4\") pod \"barbican-924f-account-create-update-qs4ck\" (UID: \"958ac579-b5c6-47ae-9b39-13abfc4da1db\") " pod="openstack/barbican-924f-account-create-update-qs4ck" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.250563 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56abe05c-60fe-4797-9b81-0ba5fa342149-operator-scripts\") pod \"barbican-db-create-4twzj\" (UID: \"56abe05c-60fe-4797-9b81-0ba5fa342149\") " pod="openstack/barbican-db-create-4twzj" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.281569 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pmbb\" (UniqueName: \"kubernetes.io/projected/56abe05c-60fe-4797-9b81-0ba5fa342149-kube-api-access-4pmbb\") pod \"barbican-db-create-4twzj\" (UID: 
\"56abe05c-60fe-4797-9b81-0ba5fa342149\") " pod="openstack/barbican-db-create-4twzj" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.284741 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-06e3-account-create-update-gx6gr"] Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.287400 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-mhd4j" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.287848 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-06e3-account-create-update-gx6gr" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.299620 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-06e3-account-create-update-gx6gr"] Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.300022 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.353107 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lx4x4\" (UniqueName: \"kubernetes.io/projected/958ac579-b5c6-47ae-9b39-13abfc4da1db-kube-api-access-lx4x4\") pod \"barbican-924f-account-create-update-qs4ck\" (UID: \"958ac579-b5c6-47ae-9b39-13abfc4da1db\") " pod="openstack/barbican-924f-account-create-update-qs4ck" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.353168 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f849c06-6adb-4c74-b851-b261c6797f6b-operator-scripts\") pod \"cinder-06e3-account-create-update-gx6gr\" (UID: \"2f849c06-6adb-4c74-b851-b261c6797f6b\") " pod="openstack/cinder-06e3-account-create-update-gx6gr" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.353272 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n247d\" (UniqueName: \"kubernetes.io/projected/2f849c06-6adb-4c74-b851-b261c6797f6b-kube-api-access-n247d\") pod \"cinder-06e3-account-create-update-gx6gr\" (UID: \"2f849c06-6adb-4c74-b851-b261c6797f6b\") " pod="openstack/cinder-06e3-account-create-update-gx6gr" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.353333 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/958ac579-b5c6-47ae-9b39-13abfc4da1db-operator-scripts\") pod \"barbican-924f-account-create-update-qs4ck\" (UID: \"958ac579-b5c6-47ae-9b39-13abfc4da1db\") " pod="openstack/barbican-924f-account-create-update-qs4ck" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.354209 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/958ac579-b5c6-47ae-9b39-13abfc4da1db-operator-scripts\") pod \"barbican-924f-account-create-update-qs4ck\" (UID: \"958ac579-b5c6-47ae-9b39-13abfc4da1db\") " pod="openstack/barbican-924f-account-create-update-qs4ck" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.373229 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lx4x4\" (UniqueName: \"kubernetes.io/projected/958ac579-b5c6-47ae-9b39-13abfc4da1db-kube-api-access-lx4x4\") pod \"barbican-924f-account-create-update-qs4ck\" (UID: \"958ac579-b5c6-47ae-9b39-13abfc4da1db\") " pod="openstack/barbican-924f-account-create-update-qs4ck" Nov 27 16:58:43 crc 
kubenswrapper[4954]: I1127 16:58:43.383684 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-4twzj" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.438911 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-c97tg"] Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.449125 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-c97tg" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.452426 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.457191 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f849c06-6adb-4c74-b851-b261c6797f6b-operator-scripts\") pod \"cinder-06e3-account-create-update-gx6gr\" (UID: \"2f849c06-6adb-4c74-b851-b261c6797f6b\") " pod="openstack/cinder-06e3-account-create-update-gx6gr" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.457403 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n247d\" (UniqueName: \"kubernetes.io/projected/2f849c06-6adb-4c74-b851-b261c6797f6b-kube-api-access-n247d\") pod \"cinder-06e3-account-create-update-gx6gr\" (UID: \"2f849c06-6adb-4c74-b851-b261c6797f6b\") " pod="openstack/cinder-06e3-account-create-update-gx6gr" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.457721 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.457868 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-cdxsk" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.458174 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.459026 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f849c06-6adb-4c74-b851-b261c6797f6b-operator-scripts\") pod \"cinder-06e3-account-create-update-gx6gr\" (UID: \"2f849c06-6adb-4c74-b851-b261c6797f6b\") " pod="openstack/cinder-06e3-account-create-update-gx6gr" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.464084 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-c97tg"] Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.503472 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n247d\" (UniqueName: \"kubernetes.io/projected/2f849c06-6adb-4c74-b851-b261c6797f6b-kube-api-access-n247d\") pod \"cinder-06e3-account-create-update-gx6gr\" (UID: \"2f849c06-6adb-4c74-b851-b261c6797f6b\") " pod="openstack/cinder-06e3-account-create-update-gx6gr" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.558905 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6b26\" (UniqueName: \"kubernetes.io/projected/243dbf8f-7ced-4de5-8c00-b205546b0db2-kube-api-access-n6b26\") pod \"keystone-db-sync-c97tg\" (UID: \"243dbf8f-7ced-4de5-8c00-b205546b0db2\") " pod="openstack/keystone-db-sync-c97tg" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.558973 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/243dbf8f-7ced-4de5-8c00-b205546b0db2-config-data\") pod \"keystone-db-sync-c97tg\" (UID: \"243dbf8f-7ced-4de5-8c00-b205546b0db2\") " pod="openstack/keystone-db-sync-c97tg" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.558997 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/243dbf8f-7ced-4de5-8c00-b205546b0db2-combined-ca-bundle\") pod \"keystone-db-sync-c97tg\" (UID: \"243dbf8f-7ced-4de5-8c00-b205546b0db2\") " pod="openstack/keystone-db-sync-c97tg" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.564207 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-924f-account-create-update-qs4ck" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.585709 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-g8ngm"] Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.594467 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-g8ngm" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.643788 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-606e-account-create-update-qqqpz"] Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.645770 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-606e-account-create-update-qqqpz" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.649174 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.661229 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a8278f5-f0bb-4b86-b187-c8b047a338e3-operator-scripts\") pod \"neutron-db-create-g8ngm\" (UID: \"0a8278f5-f0bb-4b86-b187-c8b047a338e3\") " pod="openstack/neutron-db-create-g8ngm" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.661315 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pxsl\" (UniqueName: \"kubernetes.io/projected/0a8278f5-f0bb-4b86-b187-c8b047a338e3-kube-api-access-7pxsl\") pod \"neutron-db-create-g8ngm\" (UID: \"0a8278f5-f0bb-4b86-b187-c8b047a338e3\") " pod="openstack/neutron-db-create-g8ngm" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.661367 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6b26\" (UniqueName: \"kubernetes.io/projected/243dbf8f-7ced-4de5-8c00-b205546b0db2-kube-api-access-n6b26\") pod \"keystone-db-sync-c97tg\" (UID: \"243dbf8f-7ced-4de5-8c00-b205546b0db2\") " pod="openstack/keystone-db-sync-c97tg" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.661407 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/243dbf8f-7ced-4de5-8c00-b205546b0db2-config-data\") pod \"keystone-db-sync-c97tg\" (UID: \"243dbf8f-7ced-4de5-8c00-b205546b0db2\") " pod="openstack/keystone-db-sync-c97tg" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.661432 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/243dbf8f-7ced-4de5-8c00-b205546b0db2-combined-ca-bundle\") pod \"keystone-db-sync-c97tg\" 
(UID: \"243dbf8f-7ced-4de5-8c00-b205546b0db2\") " pod="openstack/keystone-db-sync-c97tg" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.671032 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/243dbf8f-7ced-4de5-8c00-b205546b0db2-config-data\") pod \"keystone-db-sync-c97tg\" (UID: \"243dbf8f-7ced-4de5-8c00-b205546b0db2\") " pod="openstack/keystone-db-sync-c97tg" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.673913 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/243dbf8f-7ced-4de5-8c00-b205546b0db2-combined-ca-bundle\") pod \"keystone-db-sync-c97tg\" (UID: \"243dbf8f-7ced-4de5-8c00-b205546b0db2\") " pod="openstack/keystone-db-sync-c97tg" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.694220 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6b26\" (UniqueName: \"kubernetes.io/projected/243dbf8f-7ced-4de5-8c00-b205546b0db2-kube-api-access-n6b26\") pod \"keystone-db-sync-c97tg\" (UID: \"243dbf8f-7ced-4de5-8c00-b205546b0db2\") " pod="openstack/keystone-db-sync-c97tg" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.704655 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-g8ngm"] Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.724145 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-606e-account-create-update-qqqpz"] Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.756072 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-06e3-account-create-update-gx6gr" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.763705 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pxsl\" (UniqueName: \"kubernetes.io/projected/0a8278f5-f0bb-4b86-b187-c8b047a338e3-kube-api-access-7pxsl\") pod \"neutron-db-create-g8ngm\" (UID: \"0a8278f5-f0bb-4b86-b187-c8b047a338e3\") " pod="openstack/neutron-db-create-g8ngm" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.764232 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10709af5-22d7-4aaf-963a-c7b1a67d61db-operator-scripts\") pod \"neutron-606e-account-create-update-qqqpz\" (UID: \"10709af5-22d7-4aaf-963a-c7b1a67d61db\") " pod="openstack/neutron-606e-account-create-update-qqqpz" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.764274 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsf68\" (UniqueName: \"kubernetes.io/projected/10709af5-22d7-4aaf-963a-c7b1a67d61db-kube-api-access-qsf68\") pod \"neutron-606e-account-create-update-qqqpz\" (UID: \"10709af5-22d7-4aaf-963a-c7b1a67d61db\") " pod="openstack/neutron-606e-account-create-update-qqqpz" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.764326 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a8278f5-f0bb-4b86-b187-c8b047a338e3-operator-scripts\") pod \"neutron-db-create-g8ngm\" (UID: \"0a8278f5-f0bb-4b86-b187-c8b047a338e3\") " pod="openstack/neutron-db-create-g8ngm" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.765035 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/0a8278f5-f0bb-4b86-b187-c8b047a338e3-operator-scripts\") pod \"neutron-db-create-g8ngm\" (UID: \"0a8278f5-f0bb-4b86-b187-c8b047a338e3\") " pod="openstack/neutron-db-create-g8ngm" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.787179 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pxsl\" (UniqueName: \"kubernetes.io/projected/0a8278f5-f0bb-4b86-b187-c8b047a338e3-kube-api-access-7pxsl\") pod \"neutron-db-create-g8ngm\" (UID: \"0a8278f5-f0bb-4b86-b187-c8b047a338e3\") " pod="openstack/neutron-db-create-g8ngm" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.787202 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-c97tg" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.865814 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10709af5-22d7-4aaf-963a-c7b1a67d61db-operator-scripts\") pod \"neutron-606e-account-create-update-qqqpz\" (UID: \"10709af5-22d7-4aaf-963a-c7b1a67d61db\") " pod="openstack/neutron-606e-account-create-update-qqqpz" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.865869 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsf68\" (UniqueName: \"kubernetes.io/projected/10709af5-22d7-4aaf-963a-c7b1a67d61db-kube-api-access-qsf68\") pod \"neutron-606e-account-create-update-qqqpz\" (UID: \"10709af5-22d7-4aaf-963a-c7b1a67d61db\") " pod="openstack/neutron-606e-account-create-update-qqqpz" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.866803 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10709af5-22d7-4aaf-963a-c7b1a67d61db-operator-scripts\") pod \"neutron-606e-account-create-update-qqqpz\" (UID: \"10709af5-22d7-4aaf-963a-c7b1a67d61db\") " pod="openstack/neutron-606e-account-create-update-qqqpz" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.920559 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsf68\" (UniqueName: \"kubernetes.io/projected/10709af5-22d7-4aaf-963a-c7b1a67d61db-kube-api-access-qsf68\") pod \"neutron-606e-account-create-update-qqqpz\" (UID: \"10709af5-22d7-4aaf-963a-c7b1a67d61db\") " pod="openstack/neutron-606e-account-create-update-qqqpz" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.935669 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-4twzj"] Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.949509 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-g8ngm" Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.961155 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-mhd4j"] Nov 27 16:58:43 crc kubenswrapper[4954]: W1127 16:58:43.982731 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56abe05c_60fe_4797_9b81_0ba5fa342149.slice/crio-8a4d0f4fef187b190fdae9586b65879bde8dd00c02e43829bc83c4743252b6a6 WatchSource:0}: Error finding container 8a4d0f4fef187b190fdae9586b65879bde8dd00c02e43829bc83c4743252b6a6: Status 404 returned error can't find the container with id 8a4d0f4fef187b190fdae9586b65879bde8dd00c02e43829bc83c4743252b6a6 Nov 27 16:58:43 crc kubenswrapper[4954]: I1127 16:58:43.991405 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-606e-account-create-update-qqqpz" Nov 27 16:58:44 crc kubenswrapper[4954]: I1127 16:58:44.289560 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-924f-account-create-update-qs4ck"] Nov 27 16:58:44 crc kubenswrapper[4954]: W1127 16:58:44.318436 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod958ac579_b5c6_47ae_9b39_13abfc4da1db.slice/crio-c5a840c4f7f798899825aa8bc9d730c02ab735d9afd2e1b440b9af8734f1cc8f WatchSource:0}: Error finding container c5a840c4f7f798899825aa8bc9d730c02ab735d9afd2e1b440b9af8734f1cc8f: Status 404 returned error can't find the container with id c5a840c4f7f798899825aa8bc9d730c02ab735d9afd2e1b440b9af8734f1cc8f Nov 27 16:58:44 crc kubenswrapper[4954]: I1127 16:58:44.413594 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-06e3-account-create-update-gx6gr"] Nov 27 16:58:44 crc kubenswrapper[4954]: I1127 16:58:44.435525 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-c97tg"] Nov 27 16:58:44 crc kubenswrapper[4954]: I1127 16:58:44.747524 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-g8ngm"] Nov 27 16:58:44 crc kubenswrapper[4954]: W1127 16:58:44.757174 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a8278f5_f0bb_4b86_b187_c8b047a338e3.slice/crio-b64e630f93e203eb8fb7d7b2f80cb28d6b0a720d804414a9db6fdaa6b0daf302 WatchSource:0}: Error finding container b64e630f93e203eb8fb7d7b2f80cb28d6b0a720d804414a9db6fdaa6b0daf302: Status 404 returned error can't find the container with id b64e630f93e203eb8fb7d7b2f80cb28d6b0a720d804414a9db6fdaa6b0daf302 Nov 27 16:58:44 crc kubenswrapper[4954]: I1127 16:58:44.831112 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-606e-account-create-update-qqqpz"] Nov 27 16:58:44 crc kubenswrapper[4954]: I1127 16:58:44.948998 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-c97tg" event={"ID":"243dbf8f-7ced-4de5-8c00-b205546b0db2","Type":"ContainerStarted","Data":"10b6493cc78a78f3a8ee056369bc1f7991d724d82fdffae089578059af8ec902"} Nov 27 16:58:44 crc kubenswrapper[4954]: I1127 16:58:44.951455 4954 generic.go:334] "Generic (PLEG): container finished" podID="958ac579-b5c6-47ae-9b39-13abfc4da1db" containerID="986a0c88c524f676cfbcdfdedd620d2f20280129d7e5f624a37729159fd248d6" exitCode=0 Nov 27 16:58:44 crc kubenswrapper[4954]: I1127 16:58:44.952000 4954 kubelet.go:2453] "SyncLoop (PLEG): 
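The manager.go:1169 warnings above look alarming but are transient: cAdvisor receives a cgroup-creation watch event for a just-created sandbox, asks the runtime about the ID before CRI-O can answer for it, and gets a 404; the same IDs (8a4d0f4f..., c5a840c4..., b64e630f...) appear moments later as started sandboxes in the PLEG events below, so nothing is lost. The shape is an ordinary time-of-check/time-of-use race, sketched here against the cgroupfs path quoted in the log (illustrative only, not cAdvisor code):

```go
package main

import (
	"errors"
	"fmt"
	"io/fs"
	"os"
)

func main() {
	// Path copied from the watch event above; the slice can appear or
	// vanish between the event and this lookup, which is all the 404 means.
	path := "/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/" +
		"kubepods-besteffort-pod56abe05c_60fe_4797_9b81_0ba5fa342149.slice"
	if _, err := os.Stat(path); errors.Is(err, fs.ErrNotExist) {
		fmt.Println("cgroup not visible yet (or already gone); retry later")
	}
}
```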
event for pod" pod="openstack/barbican-924f-account-create-update-qs4ck" event={"ID":"958ac579-b5c6-47ae-9b39-13abfc4da1db","Type":"ContainerDied","Data":"986a0c88c524f676cfbcdfdedd620d2f20280129d7e5f624a37729159fd248d6"} Nov 27 16:58:44 crc kubenswrapper[4954]: I1127 16:58:44.952027 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-924f-account-create-update-qs4ck" event={"ID":"958ac579-b5c6-47ae-9b39-13abfc4da1db","Type":"ContainerStarted","Data":"c5a840c4f7f798899825aa8bc9d730c02ab735d9afd2e1b440b9af8734f1cc8f"} Nov 27 16:58:44 crc kubenswrapper[4954]: I1127 16:58:44.953752 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-g8ngm" event={"ID":"0a8278f5-f0bb-4b86-b187-c8b047a338e3","Type":"ContainerStarted","Data":"b64e630f93e203eb8fb7d7b2f80cb28d6b0a720d804414a9db6fdaa6b0daf302"} Nov 27 16:58:44 crc kubenswrapper[4954]: I1127 16:58:44.956159 4954 generic.go:334] "Generic (PLEG): container finished" podID="fe6f0251-00d1-460c-82fb-d86f5142c5f1" containerID="033d11af1c2e36e7d71877588fbe192027603aa475cde4ca986817a0a319fb5b" exitCode=0 Nov 27 16:58:44 crc kubenswrapper[4954]: I1127 16:58:44.956259 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-mhd4j" event={"ID":"fe6f0251-00d1-460c-82fb-d86f5142c5f1","Type":"ContainerDied","Data":"033d11af1c2e36e7d71877588fbe192027603aa475cde4ca986817a0a319fb5b"} Nov 27 16:58:44 crc kubenswrapper[4954]: I1127 16:58:44.956290 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-mhd4j" event={"ID":"fe6f0251-00d1-460c-82fb-d86f5142c5f1","Type":"ContainerStarted","Data":"1d12f876d520b3956eada7a4352964d03898a05d00f044855fd18fe1b8aaa1e2"} Nov 27 16:58:44 crc kubenswrapper[4954]: I1127 16:58:44.957366 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-606e-account-create-update-qqqpz" event={"ID":"10709af5-22d7-4aaf-963a-c7b1a67d61db","Type":"ContainerStarted","Data":"183b05990fb78acbcd6f73bca4f2860d9f1bf1044e03616328b99fafefcc3bbf"} Nov 27 16:58:44 crc kubenswrapper[4954]: I1127 16:58:44.958662 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-06e3-account-create-update-gx6gr" event={"ID":"2f849c06-6adb-4c74-b851-b261c6797f6b","Type":"ContainerStarted","Data":"0bfdf88c96edf13ce3404bb30d88ebf37431bbf997be6360e69e26ec3448ffc4"} Nov 27 16:58:44 crc kubenswrapper[4954]: I1127 16:58:44.958694 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-06e3-account-create-update-gx6gr" event={"ID":"2f849c06-6adb-4c74-b851-b261c6797f6b","Type":"ContainerStarted","Data":"206e9bb116e7899cc910143d21ac1da1b689287bd13663b4a3c7ebb5e314cc83"} Nov 27 16:58:44 crc kubenswrapper[4954]: I1127 16:58:44.961014 4954 generic.go:334] "Generic (PLEG): container finished" podID="56abe05c-60fe-4797-9b81-0ba5fa342149" containerID="c4cd115b731192b87ccc81c1b09180efcd191973e1a160928b1d0aeeba85b8d6" exitCode=0 Nov 27 16:58:44 crc kubenswrapper[4954]: I1127 16:58:44.961047 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-4twzj" event={"ID":"56abe05c-60fe-4797-9b81-0ba5fa342149","Type":"ContainerDied","Data":"c4cd115b731192b87ccc81c1b09180efcd191973e1a160928b1d0aeeba85b8d6"} Nov 27 16:58:44 crc kubenswrapper[4954]: I1127 16:58:44.961063 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-4twzj" 
event={"ID":"56abe05c-60fe-4797-9b81-0ba5fa342149","Type":"ContainerStarted","Data":"8a4d0f4fef187b190fdae9586b65879bde8dd00c02e43829bc83c4743252b6a6"} Nov 27 16:58:45 crc kubenswrapper[4954]: I1127 16:58:45.071685 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-06e3-account-create-update-gx6gr" podStartSLOduration=2.071663716 podStartE2EDuration="2.071663716s" podCreationTimestamp="2025-11-27 16:58:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:58:45.054713135 +0000 UTC m=+1237.072153465" watchObservedRunningTime="2025-11-27 16:58:45.071663716 +0000 UTC m=+1237.089104016" Nov 27 16:58:45 crc kubenswrapper[4954]: I1127 16:58:45.974617 4954 generic.go:334] "Generic (PLEG): container finished" podID="0a8278f5-f0bb-4b86-b187-c8b047a338e3" containerID="7ddce387a5ede953ec464571c749c2df54c0bac4f4a37be7eea5b829fdaa5ffd" exitCode=0 Nov 27 16:58:45 crc kubenswrapper[4954]: I1127 16:58:45.974715 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-g8ngm" event={"ID":"0a8278f5-f0bb-4b86-b187-c8b047a338e3","Type":"ContainerDied","Data":"7ddce387a5ede953ec464571c749c2df54c0bac4f4a37be7eea5b829fdaa5ffd"} Nov 27 16:58:45 crc kubenswrapper[4954]: I1127 16:58:45.979825 4954 generic.go:334] "Generic (PLEG): container finished" podID="10709af5-22d7-4aaf-963a-c7b1a67d61db" containerID="e92a08eee1b7abd3fa9259d5653545521d191dd9dfb8bba8cf3bdb2de482da9a" exitCode=0 Nov 27 16:58:45 crc kubenswrapper[4954]: I1127 16:58:45.979892 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-606e-account-create-update-qqqpz" event={"ID":"10709af5-22d7-4aaf-963a-c7b1a67d61db","Type":"ContainerDied","Data":"e92a08eee1b7abd3fa9259d5653545521d191dd9dfb8bba8cf3bdb2de482da9a"} Nov 27 16:58:45 crc kubenswrapper[4954]: I1127 16:58:45.983762 4954 generic.go:334] "Generic (PLEG): container finished" podID="2f849c06-6adb-4c74-b851-b261c6797f6b" containerID="0bfdf88c96edf13ce3404bb30d88ebf37431bbf997be6360e69e26ec3448ffc4" exitCode=0 Nov 27 16:58:45 crc kubenswrapper[4954]: I1127 16:58:45.983929 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-06e3-account-create-update-gx6gr" event={"ID":"2f849c06-6adb-4c74-b851-b261c6797f6b","Type":"ContainerDied","Data":"0bfdf88c96edf13ce3404bb30d88ebf37431bbf997be6360e69e26ec3448ffc4"} Nov 27 16:58:46 crc kubenswrapper[4954]: I1127 16:58:46.443386 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-mhd4j" Nov 27 16:58:46 crc kubenswrapper[4954]: I1127 16:58:46.524819 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe6f0251-00d1-460c-82fb-d86f5142c5f1-operator-scripts\") pod \"fe6f0251-00d1-460c-82fb-d86f5142c5f1\" (UID: \"fe6f0251-00d1-460c-82fb-d86f5142c5f1\") " Nov 27 16:58:46 crc kubenswrapper[4954]: I1127 16:58:46.525058 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrmzb\" (UniqueName: \"kubernetes.io/projected/fe6f0251-00d1-460c-82fb-d86f5142c5f1-kube-api-access-xrmzb\") pod \"fe6f0251-00d1-460c-82fb-d86f5142c5f1\" (UID: \"fe6f0251-00d1-460c-82fb-d86f5142c5f1\") " Nov 27 16:58:46 crc kubenswrapper[4954]: I1127 16:58:46.526863 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe6f0251-00d1-460c-82fb-d86f5142c5f1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fe6f0251-00d1-460c-82fb-d86f5142c5f1" (UID: "fe6f0251-00d1-460c-82fb-d86f5142c5f1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:58:46 crc kubenswrapper[4954]: I1127 16:58:46.527192 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-4twzj" Nov 27 16:58:46 crc kubenswrapper[4954]: I1127 16:58:46.531078 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-924f-account-create-update-qs4ck" Nov 27 16:58:46 crc kubenswrapper[4954]: I1127 16:58:46.536102 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe6f0251-00d1-460c-82fb-d86f5142c5f1-kube-api-access-xrmzb" (OuterVolumeSpecName: "kube-api-access-xrmzb") pod "fe6f0251-00d1-460c-82fb-d86f5142c5f1" (UID: "fe6f0251-00d1-460c-82fb-d86f5142c5f1"). InnerVolumeSpecName "kube-api-access-xrmzb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:58:46 crc kubenswrapper[4954]: I1127 16:58:46.626735 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pmbb\" (UniqueName: \"kubernetes.io/projected/56abe05c-60fe-4797-9b81-0ba5fa342149-kube-api-access-4pmbb\") pod \"56abe05c-60fe-4797-9b81-0ba5fa342149\" (UID: \"56abe05c-60fe-4797-9b81-0ba5fa342149\") " Nov 27 16:58:46 crc kubenswrapper[4954]: I1127 16:58:46.626863 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56abe05c-60fe-4797-9b81-0ba5fa342149-operator-scripts\") pod \"56abe05c-60fe-4797-9b81-0ba5fa342149\" (UID: \"56abe05c-60fe-4797-9b81-0ba5fa342149\") " Nov 27 16:58:46 crc kubenswrapper[4954]: I1127 16:58:46.626938 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lx4x4\" (UniqueName: \"kubernetes.io/projected/958ac579-b5c6-47ae-9b39-13abfc4da1db-kube-api-access-lx4x4\") pod \"958ac579-b5c6-47ae-9b39-13abfc4da1db\" (UID: \"958ac579-b5c6-47ae-9b39-13abfc4da1db\") " Nov 27 16:58:46 crc kubenswrapper[4954]: I1127 16:58:46.627072 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/958ac579-b5c6-47ae-9b39-13abfc4da1db-operator-scripts\") pod \"958ac579-b5c6-47ae-9b39-13abfc4da1db\" (UID: \"958ac579-b5c6-47ae-9b39-13abfc4da1db\") " Nov 27 16:58:46 crc kubenswrapper[4954]: I1127 16:58:46.627390 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56abe05c-60fe-4797-9b81-0ba5fa342149-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "56abe05c-60fe-4797-9b81-0ba5fa342149" (UID: "56abe05c-60fe-4797-9b81-0ba5fa342149"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:58:46 crc kubenswrapper[4954]: I1127 16:58:46.627646 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/958ac579-b5c6-47ae-9b39-13abfc4da1db-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "958ac579-b5c6-47ae-9b39-13abfc4da1db" (UID: "958ac579-b5c6-47ae-9b39-13abfc4da1db"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:58:46 crc kubenswrapper[4954]: I1127 16:58:46.627777 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrmzb\" (UniqueName: \"kubernetes.io/projected/fe6f0251-00d1-460c-82fb-d86f5142c5f1-kube-api-access-xrmzb\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:46 crc kubenswrapper[4954]: I1127 16:58:46.627797 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56abe05c-60fe-4797-9b81-0ba5fa342149-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:46 crc kubenswrapper[4954]: I1127 16:58:46.627822 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe6f0251-00d1-460c-82fb-d86f5142c5f1-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:46 crc kubenswrapper[4954]: I1127 16:58:46.630000 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/958ac579-b5c6-47ae-9b39-13abfc4da1db-kube-api-access-lx4x4" (OuterVolumeSpecName: "kube-api-access-lx4x4") pod "958ac579-b5c6-47ae-9b39-13abfc4da1db" (UID: "958ac579-b5c6-47ae-9b39-13abfc4da1db"). InnerVolumeSpecName "kube-api-access-lx4x4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:58:46 crc kubenswrapper[4954]: I1127 16:58:46.631603 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56abe05c-60fe-4797-9b81-0ba5fa342149-kube-api-access-4pmbb" (OuterVolumeSpecName: "kube-api-access-4pmbb") pod "56abe05c-60fe-4797-9b81-0ba5fa342149" (UID: "56abe05c-60fe-4797-9b81-0ba5fa342149"). InnerVolumeSpecName "kube-api-access-4pmbb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:58:46 crc kubenswrapper[4954]: I1127 16:58:46.729118 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lx4x4\" (UniqueName: \"kubernetes.io/projected/958ac579-b5c6-47ae-9b39-13abfc4da1db-kube-api-access-lx4x4\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:46 crc kubenswrapper[4954]: I1127 16:58:46.729179 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/958ac579-b5c6-47ae-9b39-13abfc4da1db-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:46 crc kubenswrapper[4954]: I1127 16:58:46.729192 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pmbb\" (UniqueName: \"kubernetes.io/projected/56abe05c-60fe-4797-9b81-0ba5fa342149-kube-api-access-4pmbb\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:47 crc kubenswrapper[4954]: I1127 16:58:47.001416 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-mhd4j" event={"ID":"fe6f0251-00d1-460c-82fb-d86f5142c5f1","Type":"ContainerDied","Data":"1d12f876d520b3956eada7a4352964d03898a05d00f044855fd18fe1b8aaa1e2"} Nov 27 16:58:47 crc kubenswrapper[4954]: I1127 16:58:47.001466 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d12f876d520b3956eada7a4352964d03898a05d00f044855fd18fe1b8aaa1e2" Nov 27 16:58:47 crc kubenswrapper[4954]: I1127 16:58:47.001501 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-mhd4j" Nov 27 16:58:47 crc kubenswrapper[4954]: I1127 16:58:47.002946 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-4twzj" event={"ID":"56abe05c-60fe-4797-9b81-0ba5fa342149","Type":"ContainerDied","Data":"8a4d0f4fef187b190fdae9586b65879bde8dd00c02e43829bc83c4743252b6a6"} Nov 27 16:58:47 crc kubenswrapper[4954]: I1127 16:58:47.003010 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a4d0f4fef187b190fdae9586b65879bde8dd00c02e43829bc83c4743252b6a6" Nov 27 16:58:47 crc kubenswrapper[4954]: I1127 16:58:47.003098 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-4twzj" Nov 27 16:58:47 crc kubenswrapper[4954]: I1127 16:58:47.004704 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-924f-account-create-update-qs4ck" Nov 27 16:58:47 crc kubenswrapper[4954]: I1127 16:58:47.005768 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-924f-account-create-update-qs4ck" event={"ID":"958ac579-b5c6-47ae-9b39-13abfc4da1db","Type":"ContainerDied","Data":"c5a840c4f7f798899825aa8bc9d730c02ab735d9afd2e1b440b9af8734f1cc8f"} Nov 27 16:58:47 crc kubenswrapper[4954]: I1127 16:58:47.005793 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5a840c4f7f798899825aa8bc9d730c02ab735d9afd2e1b440b9af8734f1cc8f" Nov 27 16:58:49 crc kubenswrapper[4954]: I1127 16:58:49.022515 4954 generic.go:334] "Generic (PLEG): container finished" podID="604569c5-bcdb-49ba-8fad-546903367900" containerID="993b586601c3b86b6ab6d17c37f96718cf6952b28e6d127c07e53c460d21cf9e" exitCode=0 Nov 27 16:58:49 crc kubenswrapper[4954]: I1127 16:58:49.022648 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wlq9c" event={"ID":"604569c5-bcdb-49ba-8fad-546903367900","Type":"ContainerDied","Data":"993b586601c3b86b6ab6d17c37f96718cf6952b28e6d127c07e53c460d21cf9e"} Nov 27 16:58:50 crc kubenswrapper[4954]: E1127 16:58:50.341093 4954 kubelet_node_status.go:756] "Failed to set some node status fields" err="failed to validate nodeIP: route ip+net: no such network interface" node="crc" Nov 27 16:58:53 crc kubenswrapper[4954]: I1127 16:58:53.687039 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:58:53 crc kubenswrapper[4954]: I1127 16:58:53.687441 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 16:58:53 crc kubenswrapper[4954]: I1127 16:58:53.687487 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-699qq" Nov 27 16:58:53 crc kubenswrapper[4954]: I1127 16:58:53.688230 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9612382de1b535d3c643f2ac5d6cc1b599dc89b245b1720c9d36c1ba8e2a8513"} 
pod="openshift-machine-config-operator/machine-config-daemon-699qq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 27 16:58:53 crc kubenswrapper[4954]: I1127 16:58:53.688337 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" containerID="cri-o://9612382de1b535d3c643f2ac5d6cc1b599dc89b245b1720c9d36c1ba8e2a8513" gracePeriod=600 Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.030713 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-g8ngm" Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.060235 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-606e-account-create-update-qqqpz" Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.078915 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pxsl\" (UniqueName: \"kubernetes.io/projected/0a8278f5-f0bb-4b86-b187-c8b047a338e3-kube-api-access-7pxsl\") pod \"0a8278f5-f0bb-4b86-b187-c8b047a338e3\" (UID: \"0a8278f5-f0bb-4b86-b187-c8b047a338e3\") " Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.079098 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a8278f5-f0bb-4b86-b187-c8b047a338e3-operator-scripts\") pod \"0a8278f5-f0bb-4b86-b187-c8b047a338e3\" (UID: \"0a8278f5-f0bb-4b86-b187-c8b047a338e3\") " Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.084076 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a8278f5-f0bb-4b86-b187-c8b047a338e3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0a8278f5-f0bb-4b86-b187-c8b047a338e3" (UID: "0a8278f5-f0bb-4b86-b187-c8b047a338e3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.091940 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a8278f5-f0bb-4b86-b187-c8b047a338e3-kube-api-access-7pxsl" (OuterVolumeSpecName: "kube-api-access-7pxsl") pod "0a8278f5-f0bb-4b86-b187-c8b047a338e3" (UID: "0a8278f5-f0bb-4b86-b187-c8b047a338e3"). InnerVolumeSpecName "kube-api-access-7pxsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.096250 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-06e3-account-create-update-gx6gr" Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.100951 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-06e3-account-create-update-gx6gr" event={"ID":"2f849c06-6adb-4c74-b851-b261c6797f6b","Type":"ContainerDied","Data":"206e9bb116e7899cc910143d21ac1da1b689287bd13663b4a3c7ebb5e314cc83"} Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.100993 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="206e9bb116e7899cc910143d21ac1da1b689287bd13663b4a3c7ebb5e314cc83" Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.106927 4954 generic.go:334] "Generic (PLEG): container finished" podID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerID="9612382de1b535d3c643f2ac5d6cc1b599dc89b245b1720c9d36c1ba8e2a8513" exitCode=0 Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.106985 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" event={"ID":"33a80574-7c60-4f19-985b-3ee313cb7bcd","Type":"ContainerDied","Data":"9612382de1b535d3c643f2ac5d6cc1b599dc89b245b1720c9d36c1ba8e2a8513"} Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.107042 4954 scope.go:117] "RemoveContainer" containerID="6a54903c8c633a0f68f9dab4e62025f22307496e9e210ed0a72c63ab1c8cd13b" Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.107059 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-wlq9c" Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.110424 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-g8ngm" Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.110550 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-g8ngm" event={"ID":"0a8278f5-f0bb-4b86-b187-c8b047a338e3","Type":"ContainerDied","Data":"b64e630f93e203eb8fb7d7b2f80cb28d6b0a720d804414a9db6fdaa6b0daf302"} Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.110659 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b64e630f93e203eb8fb7d7b2f80cb28d6b0a720d804414a9db6fdaa6b0daf302" Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.119967 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-606e-account-create-update-qqqpz" event={"ID":"10709af5-22d7-4aaf-963a-c7b1a67d61db","Type":"ContainerDied","Data":"183b05990fb78acbcd6f73bca4f2860d9f1bf1044e03616328b99fafefcc3bbf"} Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.120015 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="183b05990fb78acbcd6f73bca4f2860d9f1bf1044e03616328b99fafefcc3bbf" Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.120112 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-606e-account-create-update-qqqpz" Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.138039 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wlq9c" event={"ID":"604569c5-bcdb-49ba-8fad-546903367900","Type":"ContainerDied","Data":"b3d5fe4dca0e1e32f0d28f70667af280b5536dad4db01a657168bb1df53c2462"} Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.138082 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3d5fe4dca0e1e32f0d28f70667af280b5536dad4db01a657168bb1df53c2462" Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.138133 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-wlq9c" Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.180479 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hbpd\" (UniqueName: \"kubernetes.io/projected/604569c5-bcdb-49ba-8fad-546903367900-kube-api-access-4hbpd\") pod \"604569c5-bcdb-49ba-8fad-546903367900\" (UID: \"604569c5-bcdb-49ba-8fad-546903367900\") " Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.180526 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f849c06-6adb-4c74-b851-b261c6797f6b-operator-scripts\") pod \"2f849c06-6adb-4c74-b851-b261c6797f6b\" (UID: \"2f849c06-6adb-4c74-b851-b261c6797f6b\") " Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.180628 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/604569c5-bcdb-49ba-8fad-546903367900-combined-ca-bundle\") pod \"604569c5-bcdb-49ba-8fad-546903367900\" (UID: \"604569c5-bcdb-49ba-8fad-546903367900\") " Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.180700 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10709af5-22d7-4aaf-963a-c7b1a67d61db-operator-scripts\") pod \"10709af5-22d7-4aaf-963a-c7b1a67d61db\" (UID: \"10709af5-22d7-4aaf-963a-c7b1a67d61db\") " Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.180726 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsf68\" (UniqueName: \"kubernetes.io/projected/10709af5-22d7-4aaf-963a-c7b1a67d61db-kube-api-access-qsf68\") pod \"10709af5-22d7-4aaf-963a-c7b1a67d61db\" (UID: \"10709af5-22d7-4aaf-963a-c7b1a67d61db\") " Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.180815 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/604569c5-bcdb-49ba-8fad-546903367900-config-data\") pod \"604569c5-bcdb-49ba-8fad-546903367900\" (UID: \"604569c5-bcdb-49ba-8fad-546903367900\") " Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.180868 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n247d\" (UniqueName: \"kubernetes.io/projected/2f849c06-6adb-4c74-b851-b261c6797f6b-kube-api-access-n247d\") pod \"2f849c06-6adb-4c74-b851-b261c6797f6b\" (UID: \"2f849c06-6adb-4c74-b851-b261c6797f6b\") " Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.180970 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/604569c5-bcdb-49ba-8fad-546903367900-db-sync-config-data\") pod \"604569c5-bcdb-49ba-8fad-546903367900\" (UID: \"604569c5-bcdb-49ba-8fad-546903367900\") " Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.181300 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a8278f5-f0bb-4b86-b187-c8b047a338e3-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.181320 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pxsl\" (UniqueName: \"kubernetes.io/projected/0a8278f5-f0bb-4b86-b187-c8b047a338e3-kube-api-access-7pxsl\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.181983 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f849c06-6adb-4c74-b851-b261c6797f6b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2f849c06-6adb-4c74-b851-b261c6797f6b" (UID: "2f849c06-6adb-4c74-b851-b261c6797f6b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.182275 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10709af5-22d7-4aaf-963a-c7b1a67d61db-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "10709af5-22d7-4aaf-963a-c7b1a67d61db" (UID: "10709af5-22d7-4aaf-963a-c7b1a67d61db"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.189267 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f849c06-6adb-4c74-b851-b261c6797f6b-kube-api-access-n247d" (OuterVolumeSpecName: "kube-api-access-n247d") pod "2f849c06-6adb-4c74-b851-b261c6797f6b" (UID: "2f849c06-6adb-4c74-b851-b261c6797f6b"). InnerVolumeSpecName "kube-api-access-n247d". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.191069 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/604569c5-bcdb-49ba-8fad-546903367900-kube-api-access-4hbpd" (OuterVolumeSpecName: "kube-api-access-4hbpd") pod "604569c5-bcdb-49ba-8fad-546903367900" (UID: "604569c5-bcdb-49ba-8fad-546903367900"). InnerVolumeSpecName "kube-api-access-4hbpd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.193340 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10709af5-22d7-4aaf-963a-c7b1a67d61db-kube-api-access-qsf68" (OuterVolumeSpecName: "kube-api-access-qsf68") pod "10709af5-22d7-4aaf-963a-c7b1a67d61db" (UID: "10709af5-22d7-4aaf-963a-c7b1a67d61db"). InnerVolumeSpecName "kube-api-access-qsf68". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.193432 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/604569c5-bcdb-49ba-8fad-546903367900-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "604569c5-bcdb-49ba-8fad-546903367900" (UID: "604569c5-bcdb-49ba-8fad-546903367900"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.230506 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/604569c5-bcdb-49ba-8fad-546903367900-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "604569c5-bcdb-49ba-8fad-546903367900" (UID: "604569c5-bcdb-49ba-8fad-546903367900"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.251530 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/604569c5-bcdb-49ba-8fad-546903367900-config-data" (OuterVolumeSpecName: "config-data") pod "604569c5-bcdb-49ba-8fad-546903367900" (UID: "604569c5-bcdb-49ba-8fad-546903367900"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.282931 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n247d\" (UniqueName: \"kubernetes.io/projected/2f849c06-6adb-4c74-b851-b261c6797f6b-kube-api-access-n247d\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.282964 4954 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/604569c5-bcdb-49ba-8fad-546903367900-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.282974 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hbpd\" (UniqueName: \"kubernetes.io/projected/604569c5-bcdb-49ba-8fad-546903367900-kube-api-access-4hbpd\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.282984 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f849c06-6adb-4c74-b851-b261c6797f6b-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.282992 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/604569c5-bcdb-49ba-8fad-546903367900-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.283002 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10709af5-22d7-4aaf-963a-c7b1a67d61db-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.283010 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsf68\" (UniqueName: \"kubernetes.io/projected/10709af5-22d7-4aaf-963a-c7b1a67d61db-kube-api-access-qsf68\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:54 crc kubenswrapper[4954]: I1127 16:58:54.283018 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/604569c5-bcdb-49ba-8fad-546903367900-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:58:55 crc kubenswrapper[4954]: I1127 16:58:55.198560 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-c97tg" event={"ID":"243dbf8f-7ced-4de5-8c00-b205546b0db2","Type":"ContainerStarted","Data":"04637028b8a8e2fa1452af4d1d1eb9b585777b52cae2187a2fde615588b12543"} Nov 27 16:58:55 crc kubenswrapper[4954]: I1127 16:58:55.205074 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-06e3-account-create-update-gx6gr" Nov 27 16:58:55 crc kubenswrapper[4954]: I1127 16:58:55.205898 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" event={"ID":"33a80574-7c60-4f19-985b-3ee313cb7bcd","Type":"ContainerStarted","Data":"98580182e2338285c15b00e549725c7d4113004bcbddaa6d1d4c9e028f47ac7f"} Nov 27 16:58:55 crc kubenswrapper[4954]: I1127 16:58:55.249010 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-c97tg" podStartSLOduration=2.839021189 podStartE2EDuration="12.248977819s" podCreationTimestamp="2025-11-27 16:58:43 +0000 UTC" firstStartedPulling="2025-11-27 16:58:44.480691558 +0000 UTC m=+1236.498131858" lastFinishedPulling="2025-11-27 16:58:53.890648188 +0000 UTC m=+1245.908088488" observedRunningTime="2025-11-27 16:58:55.221188883 +0000 UTC m=+1247.238629183" watchObservedRunningTime="2025-11-27 16:58:55.248977819 +0000 UTC m=+1247.266418119" Nov 27 16:58:55 crc kubenswrapper[4954]: I1127 16:58:55.603256 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-2kfhn"] Nov 27 16:58:55 crc kubenswrapper[4954]: E1127 16:58:55.604750 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="958ac579-b5c6-47ae-9b39-13abfc4da1db" containerName="mariadb-account-create-update" Nov 27 16:58:55 crc kubenswrapper[4954]: I1127 16:58:55.604851 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="958ac579-b5c6-47ae-9b39-13abfc4da1db" containerName="mariadb-account-create-update" Nov 27 16:58:55 crc kubenswrapper[4954]: E1127 16:58:55.604921 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="604569c5-bcdb-49ba-8fad-546903367900" containerName="glance-db-sync" Nov 27 16:58:55 crc kubenswrapper[4954]: I1127 16:58:55.604978 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="604569c5-bcdb-49ba-8fad-546903367900" containerName="glance-db-sync" Nov 27 16:58:55 crc kubenswrapper[4954]: E1127 16:58:55.605057 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a8278f5-f0bb-4b86-b187-c8b047a338e3" containerName="mariadb-database-create" Nov 27 16:58:55 crc kubenswrapper[4954]: I1127 16:58:55.605121 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a8278f5-f0bb-4b86-b187-c8b047a338e3" containerName="mariadb-database-create" Nov 27 16:58:55 crc kubenswrapper[4954]: E1127 16:58:55.605185 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe6f0251-00d1-460c-82fb-d86f5142c5f1" containerName="mariadb-database-create" Nov 27 16:58:55 crc kubenswrapper[4954]: I1127 16:58:55.605249 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe6f0251-00d1-460c-82fb-d86f5142c5f1" containerName="mariadb-database-create" Nov 27 16:58:55 crc kubenswrapper[4954]: E1127 16:58:55.605456 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10709af5-22d7-4aaf-963a-c7b1a67d61db" containerName="mariadb-account-create-update" Nov 27 16:58:55 crc kubenswrapper[4954]: I1127 16:58:55.605522 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="10709af5-22d7-4aaf-963a-c7b1a67d61db" containerName="mariadb-account-create-update" Nov 27 16:58:55 crc kubenswrapper[4954]: E1127 16:58:55.605599 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f849c06-6adb-4c74-b851-b261c6797f6b" containerName="mariadb-account-create-update" Nov 27 16:58:55 crc kubenswrapper[4954]: I1127 16:58:55.605671 
4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f849c06-6adb-4c74-b851-b261c6797f6b" containerName="mariadb-account-create-update" Nov 27 16:58:55 crc kubenswrapper[4954]: E1127 16:58:55.605757 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56abe05c-60fe-4797-9b81-0ba5fa342149" containerName="mariadb-database-create" Nov 27 16:58:55 crc kubenswrapper[4954]: I1127 16:58:55.605820 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="56abe05c-60fe-4797-9b81-0ba5fa342149" containerName="mariadb-database-create" Nov 27 16:58:55 crc kubenswrapper[4954]: I1127 16:58:55.606063 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe6f0251-00d1-460c-82fb-d86f5142c5f1" containerName="mariadb-database-create" Nov 27 16:58:55 crc kubenswrapper[4954]: I1127 16:58:55.606134 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="56abe05c-60fe-4797-9b81-0ba5fa342149" containerName="mariadb-database-create" Nov 27 16:58:55 crc kubenswrapper[4954]: I1127 16:58:55.606200 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="604569c5-bcdb-49ba-8fad-546903367900" containerName="glance-db-sync" Nov 27 16:58:55 crc kubenswrapper[4954]: I1127 16:58:55.606269 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f849c06-6adb-4c74-b851-b261c6797f6b" containerName="mariadb-account-create-update" Nov 27 16:58:55 crc kubenswrapper[4954]: I1127 16:58:55.606330 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="958ac579-b5c6-47ae-9b39-13abfc4da1db" containerName="mariadb-account-create-update" Nov 27 16:58:55 crc kubenswrapper[4954]: I1127 16:58:55.606404 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="10709af5-22d7-4aaf-963a-c7b1a67d61db" containerName="mariadb-account-create-update" Nov 27 16:58:55 crc kubenswrapper[4954]: I1127 16:58:55.606469 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a8278f5-f0bb-4b86-b187-c8b047a338e3" containerName="mariadb-database-create" Nov 27 16:58:55 crc kubenswrapper[4954]: I1127 16:58:55.610444 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-2kfhn" Nov 27 16:58:55 crc kubenswrapper[4954]: I1127 16:58:55.617792 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-2kfhn"] Nov 27 16:58:55 crc kubenswrapper[4954]: I1127 16:58:55.708753 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da5fcc21-2130-46ca-ab19-fe735802b2af-dns-svc\") pod \"dnsmasq-dns-895cf5cf-2kfhn\" (UID: \"da5fcc21-2130-46ca-ab19-fe735802b2af\") " pod="openstack/dnsmasq-dns-895cf5cf-2kfhn" Nov 27 16:58:55 crc kubenswrapper[4954]: I1127 16:58:55.708820 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da5fcc21-2130-46ca-ab19-fe735802b2af-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-2kfhn\" (UID: \"da5fcc21-2130-46ca-ab19-fe735802b2af\") " pod="openstack/dnsmasq-dns-895cf5cf-2kfhn" Nov 27 16:58:55 crc kubenswrapper[4954]: I1127 16:58:55.708891 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcdhq\" (UniqueName: \"kubernetes.io/projected/da5fcc21-2130-46ca-ab19-fe735802b2af-kube-api-access-mcdhq\") pod \"dnsmasq-dns-895cf5cf-2kfhn\" (UID: \"da5fcc21-2130-46ca-ab19-fe735802b2af\") " pod="openstack/dnsmasq-dns-895cf5cf-2kfhn" Nov 27 16:58:55 crc kubenswrapper[4954]: I1127 16:58:55.708966 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da5fcc21-2130-46ca-ab19-fe735802b2af-config\") pod \"dnsmasq-dns-895cf5cf-2kfhn\" (UID: \"da5fcc21-2130-46ca-ab19-fe735802b2af\") " pod="openstack/dnsmasq-dns-895cf5cf-2kfhn" Nov 27 16:58:55 crc kubenswrapper[4954]: I1127 16:58:55.709155 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da5fcc21-2130-46ca-ab19-fe735802b2af-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-2kfhn\" (UID: \"da5fcc21-2130-46ca-ab19-fe735802b2af\") " pod="openstack/dnsmasq-dns-895cf5cf-2kfhn" Nov 27 16:58:55 crc kubenswrapper[4954]: I1127 16:58:55.709302 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da5fcc21-2130-46ca-ab19-fe735802b2af-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-2kfhn\" (UID: \"da5fcc21-2130-46ca-ab19-fe735802b2af\") " pod="openstack/dnsmasq-dns-895cf5cf-2kfhn" Nov 27 16:58:55 crc kubenswrapper[4954]: I1127 16:58:55.810369 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da5fcc21-2130-46ca-ab19-fe735802b2af-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-2kfhn\" (UID: \"da5fcc21-2130-46ca-ab19-fe735802b2af\") " pod="openstack/dnsmasq-dns-895cf5cf-2kfhn" Nov 27 16:58:55 crc kubenswrapper[4954]: I1127 16:58:55.810430 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da5fcc21-2130-46ca-ab19-fe735802b2af-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-2kfhn\" (UID: \"da5fcc21-2130-46ca-ab19-fe735802b2af\") " pod="openstack/dnsmasq-dns-895cf5cf-2kfhn" Nov 27 16:58:55 crc kubenswrapper[4954]: I1127 16:58:55.810468 4954 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da5fcc21-2130-46ca-ab19-fe735802b2af-dns-svc\") pod \"dnsmasq-dns-895cf5cf-2kfhn\" (UID: \"da5fcc21-2130-46ca-ab19-fe735802b2af\") " pod="openstack/dnsmasq-dns-895cf5cf-2kfhn" Nov 27 16:58:55 crc kubenswrapper[4954]: I1127 16:58:55.810484 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da5fcc21-2130-46ca-ab19-fe735802b2af-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-2kfhn\" (UID: \"da5fcc21-2130-46ca-ab19-fe735802b2af\") " pod="openstack/dnsmasq-dns-895cf5cf-2kfhn" Nov 27 16:58:55 crc kubenswrapper[4954]: I1127 16:58:55.810525 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcdhq\" (UniqueName: \"kubernetes.io/projected/da5fcc21-2130-46ca-ab19-fe735802b2af-kube-api-access-mcdhq\") pod \"dnsmasq-dns-895cf5cf-2kfhn\" (UID: \"da5fcc21-2130-46ca-ab19-fe735802b2af\") " pod="openstack/dnsmasq-dns-895cf5cf-2kfhn" Nov 27 16:58:55 crc kubenswrapper[4954]: I1127 16:58:55.810566 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da5fcc21-2130-46ca-ab19-fe735802b2af-config\") pod \"dnsmasq-dns-895cf5cf-2kfhn\" (UID: \"da5fcc21-2130-46ca-ab19-fe735802b2af\") " pod="openstack/dnsmasq-dns-895cf5cf-2kfhn" Nov 27 16:58:55 crc kubenswrapper[4954]: I1127 16:58:55.811772 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da5fcc21-2130-46ca-ab19-fe735802b2af-config\") pod \"dnsmasq-dns-895cf5cf-2kfhn\" (UID: \"da5fcc21-2130-46ca-ab19-fe735802b2af\") " pod="openstack/dnsmasq-dns-895cf5cf-2kfhn" Nov 27 16:58:55 crc kubenswrapper[4954]: I1127 16:58:55.811883 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da5fcc21-2130-46ca-ab19-fe735802b2af-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-2kfhn\" (UID: \"da5fcc21-2130-46ca-ab19-fe735802b2af\") " pod="openstack/dnsmasq-dns-895cf5cf-2kfhn" Nov 27 16:58:55 crc kubenswrapper[4954]: I1127 16:58:55.812015 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da5fcc21-2130-46ca-ab19-fe735802b2af-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-2kfhn\" (UID: \"da5fcc21-2130-46ca-ab19-fe735802b2af\") " pod="openstack/dnsmasq-dns-895cf5cf-2kfhn" Nov 27 16:58:55 crc kubenswrapper[4954]: I1127 16:58:55.812213 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da5fcc21-2130-46ca-ab19-fe735802b2af-dns-svc\") pod \"dnsmasq-dns-895cf5cf-2kfhn\" (UID: \"da5fcc21-2130-46ca-ab19-fe735802b2af\") " pod="openstack/dnsmasq-dns-895cf5cf-2kfhn" Nov 27 16:58:55 crc kubenswrapper[4954]: I1127 16:58:55.812772 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da5fcc21-2130-46ca-ab19-fe735802b2af-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-2kfhn\" (UID: \"da5fcc21-2130-46ca-ab19-fe735802b2af\") " pod="openstack/dnsmasq-dns-895cf5cf-2kfhn" Nov 27 16:58:55 crc kubenswrapper[4954]: I1127 16:58:55.831212 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcdhq\" (UniqueName: \"kubernetes.io/projected/da5fcc21-2130-46ca-ab19-fe735802b2af-kube-api-access-mcdhq\") pod 
\"dnsmasq-dns-895cf5cf-2kfhn\" (UID: \"da5fcc21-2130-46ca-ab19-fe735802b2af\") " pod="openstack/dnsmasq-dns-895cf5cf-2kfhn" Nov 27 16:58:55 crc kubenswrapper[4954]: I1127 16:58:55.933119 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-2kfhn" Nov 27 16:58:56 crc kubenswrapper[4954]: I1127 16:58:56.383104 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-2kfhn"] Nov 27 16:58:57 crc kubenswrapper[4954]: I1127 16:58:57.223006 4954 generic.go:334] "Generic (PLEG): container finished" podID="da5fcc21-2130-46ca-ab19-fe735802b2af" containerID="be82d262028294176e62f3e3193cb076742798b1062866305b1d7decc0511685" exitCode=0 Nov 27 16:58:57 crc kubenswrapper[4954]: I1127 16:58:57.223084 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-2kfhn" event={"ID":"da5fcc21-2130-46ca-ab19-fe735802b2af","Type":"ContainerDied","Data":"be82d262028294176e62f3e3193cb076742798b1062866305b1d7decc0511685"} Nov 27 16:58:57 crc kubenswrapper[4954]: I1127 16:58:57.223435 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-2kfhn" event={"ID":"da5fcc21-2130-46ca-ab19-fe735802b2af","Type":"ContainerStarted","Data":"43eae4c7470a338f0c4e91a9be938c9f759239b63bd4d12749aa94b8c446dec5"} Nov 27 16:58:58 crc kubenswrapper[4954]: I1127 16:58:58.243310 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-2kfhn" event={"ID":"da5fcc21-2130-46ca-ab19-fe735802b2af","Type":"ContainerStarted","Data":"d2d93c8a63c3684d94e550606598dfed2d64a0344ff48a6e5daab78661cb6bd6"} Nov 27 16:58:58 crc kubenswrapper[4954]: I1127 16:58:58.243991 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-895cf5cf-2kfhn" Nov 27 16:58:58 crc kubenswrapper[4954]: I1127 16:58:58.271330 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-895cf5cf-2kfhn" podStartSLOduration=3.271292264 podStartE2EDuration="3.271292264s" podCreationTimestamp="2025-11-27 16:58:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:58:58.265192405 +0000 UTC m=+1250.282632705" watchObservedRunningTime="2025-11-27 16:58:58.271292264 +0000 UTC m=+1250.288732604" Nov 27 16:59:00 crc kubenswrapper[4954]: I1127 16:59:00.262169 4954 generic.go:334] "Generic (PLEG): container finished" podID="243dbf8f-7ced-4de5-8c00-b205546b0db2" containerID="04637028b8a8e2fa1452af4d1d1eb9b585777b52cae2187a2fde615588b12543" exitCode=0 Nov 27 16:59:00 crc kubenswrapper[4954]: I1127 16:59:00.262303 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-c97tg" event={"ID":"243dbf8f-7ced-4de5-8c00-b205546b0db2","Type":"ContainerDied","Data":"04637028b8a8e2fa1452af4d1d1eb9b585777b52cae2187a2fde615588b12543"} Nov 27 16:59:01 crc kubenswrapper[4954]: I1127 16:59:01.661151 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-c97tg" Nov 27 16:59:01 crc kubenswrapper[4954]: I1127 16:59:01.759174 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6b26\" (UniqueName: \"kubernetes.io/projected/243dbf8f-7ced-4de5-8c00-b205546b0db2-kube-api-access-n6b26\") pod \"243dbf8f-7ced-4de5-8c00-b205546b0db2\" (UID: \"243dbf8f-7ced-4de5-8c00-b205546b0db2\") " Nov 27 16:59:01 crc kubenswrapper[4954]: I1127 16:59:01.759241 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/243dbf8f-7ced-4de5-8c00-b205546b0db2-combined-ca-bundle\") pod \"243dbf8f-7ced-4de5-8c00-b205546b0db2\" (UID: \"243dbf8f-7ced-4de5-8c00-b205546b0db2\") " Nov 27 16:59:01 crc kubenswrapper[4954]: I1127 16:59:01.759531 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/243dbf8f-7ced-4de5-8c00-b205546b0db2-config-data\") pod \"243dbf8f-7ced-4de5-8c00-b205546b0db2\" (UID: \"243dbf8f-7ced-4de5-8c00-b205546b0db2\") " Nov 27 16:59:01 crc kubenswrapper[4954]: I1127 16:59:01.766608 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/243dbf8f-7ced-4de5-8c00-b205546b0db2-kube-api-access-n6b26" (OuterVolumeSpecName: "kube-api-access-n6b26") pod "243dbf8f-7ced-4de5-8c00-b205546b0db2" (UID: "243dbf8f-7ced-4de5-8c00-b205546b0db2"). InnerVolumeSpecName "kube-api-access-n6b26". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:59:01 crc kubenswrapper[4954]: I1127 16:59:01.807628 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/243dbf8f-7ced-4de5-8c00-b205546b0db2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "243dbf8f-7ced-4de5-8c00-b205546b0db2" (UID: "243dbf8f-7ced-4de5-8c00-b205546b0db2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:59:01 crc kubenswrapper[4954]: I1127 16:59:01.824430 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/243dbf8f-7ced-4de5-8c00-b205546b0db2-config-data" (OuterVolumeSpecName: "config-data") pod "243dbf8f-7ced-4de5-8c00-b205546b0db2" (UID: "243dbf8f-7ced-4de5-8c00-b205546b0db2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:59:01 crc kubenswrapper[4954]: I1127 16:59:01.863804 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6b26\" (UniqueName: \"kubernetes.io/projected/243dbf8f-7ced-4de5-8c00-b205546b0db2-kube-api-access-n6b26\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:01 crc kubenswrapper[4954]: I1127 16:59:01.863843 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/243dbf8f-7ced-4de5-8c00-b205546b0db2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:01 crc kubenswrapper[4954]: I1127 16:59:01.863854 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/243dbf8f-7ced-4de5-8c00-b205546b0db2-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.291803 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-c97tg" event={"ID":"243dbf8f-7ced-4de5-8c00-b205546b0db2","Type":"ContainerDied","Data":"10b6493cc78a78f3a8ee056369bc1f7991d724d82fdffae089578059af8ec902"} Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.292492 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10b6493cc78a78f3a8ee056369bc1f7991d724d82fdffae089578059af8ec902" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.291946 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-c97tg" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.565216 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-2kfhn"] Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.565496 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-895cf5cf-2kfhn" podUID="da5fcc21-2130-46ca-ab19-fe735802b2af" containerName="dnsmasq-dns" containerID="cri-o://d2d93c8a63c3684d94e550606598dfed2d64a0344ff48a6e5daab78661cb6bd6" gracePeriod=10 Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.566780 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-895cf5cf-2kfhn" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.736564 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-rjzgh"] Nov 27 16:59:02 crc kubenswrapper[4954]: E1127 16:59:02.737806 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="243dbf8f-7ced-4de5-8c00-b205546b0db2" containerName="keystone-db-sync" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.737826 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="243dbf8f-7ced-4de5-8c00-b205546b0db2" containerName="keystone-db-sync" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.738283 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="243dbf8f-7ced-4de5-8c00-b205546b0db2" containerName="keystone-db-sync" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.741607 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-rjzgh" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.759738 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-ch2l7"] Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.762246 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-ch2l7" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.781113 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-cdxsk" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.781464 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.781619 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.781906 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.782114 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.855645 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ch2l7"] Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.873471 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-rjzgh"] Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.893527 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f197ffa8-e0e7-4af7-8928-be49b26cb0d9-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-rjzgh\" (UID: \"f197ffa8-e0e7-4af7-8928-be49b26cb0d9\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rjzgh" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.893594 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98efb041-1cba-4ea8-b7ae-a84bec3ed2c1-config-data\") pod \"keystone-bootstrap-ch2l7\" (UID: \"98efb041-1cba-4ea8-b7ae-a84bec3ed2c1\") " pod="openstack/keystone-bootstrap-ch2l7" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.893632 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmllb\" (UniqueName: \"kubernetes.io/projected/98efb041-1cba-4ea8-b7ae-a84bec3ed2c1-kube-api-access-fmllb\") pod \"keystone-bootstrap-ch2l7\" (UID: \"98efb041-1cba-4ea8-b7ae-a84bec3ed2c1\") " pod="openstack/keystone-bootstrap-ch2l7" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.893714 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpmns\" (UniqueName: \"kubernetes.io/projected/f197ffa8-e0e7-4af7-8928-be49b26cb0d9-kube-api-access-dpmns\") pod \"dnsmasq-dns-6c9c9f998c-rjzgh\" (UID: \"f197ffa8-e0e7-4af7-8928-be49b26cb0d9\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rjzgh" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.893731 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f197ffa8-e0e7-4af7-8928-be49b26cb0d9-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-rjzgh\" (UID: \"f197ffa8-e0e7-4af7-8928-be49b26cb0d9\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rjzgh" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.893752 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f197ffa8-e0e7-4af7-8928-be49b26cb0d9-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-rjzgh\" 
(UID: \"f197ffa8-e0e7-4af7-8928-be49b26cb0d9\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rjzgh" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.893768 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f197ffa8-e0e7-4af7-8928-be49b26cb0d9-config\") pod \"dnsmasq-dns-6c9c9f998c-rjzgh\" (UID: \"f197ffa8-e0e7-4af7-8928-be49b26cb0d9\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rjzgh" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.893785 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f197ffa8-e0e7-4af7-8928-be49b26cb0d9-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-rjzgh\" (UID: \"f197ffa8-e0e7-4af7-8928-be49b26cb0d9\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rjzgh" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.893823 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/98efb041-1cba-4ea8-b7ae-a84bec3ed2c1-credential-keys\") pod \"keystone-bootstrap-ch2l7\" (UID: \"98efb041-1cba-4ea8-b7ae-a84bec3ed2c1\") " pod="openstack/keystone-bootstrap-ch2l7" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.893865 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/98efb041-1cba-4ea8-b7ae-a84bec3ed2c1-fernet-keys\") pod \"keystone-bootstrap-ch2l7\" (UID: \"98efb041-1cba-4ea8-b7ae-a84bec3ed2c1\") " pod="openstack/keystone-bootstrap-ch2l7" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.893889 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98efb041-1cba-4ea8-b7ae-a84bec3ed2c1-combined-ca-bundle\") pod \"keystone-bootstrap-ch2l7\" (UID: \"98efb041-1cba-4ea8-b7ae-a84bec3ed2c1\") " pod="openstack/keystone-bootstrap-ch2l7" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.893912 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98efb041-1cba-4ea8-b7ae-a84bec3ed2c1-scripts\") pod \"keystone-bootstrap-ch2l7\" (UID: \"98efb041-1cba-4ea8-b7ae-a84bec3ed2c1\") " pod="openstack/keystone-bootstrap-ch2l7" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.904790 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-x4n64"] Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.906168 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-x4n64" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.913397 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.913631 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.917570 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-nt8xt" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.928943 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-x4n64"] Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.939873 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-855db5c9c7-gpqq9"] Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.953366 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-855db5c9c7-gpqq9" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.960853 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-9q6lt" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.961115 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.961279 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.967609 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.976976 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-855db5c9c7-gpqq9"] Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.995607 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58c181b9-bc11-4747-84ad-5302f1265507-config-data\") pod \"cinder-db-sync-x4n64\" (UID: \"58c181b9-bc11-4747-84ad-5302f1265507\") " pod="openstack/cinder-db-sync-x4n64" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.995683 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/98efb041-1cba-4ea8-b7ae-a84bec3ed2c1-fernet-keys\") pod \"keystone-bootstrap-ch2l7\" (UID: \"98efb041-1cba-4ea8-b7ae-a84bec3ed2c1\") " pod="openstack/keystone-bootstrap-ch2l7" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.995711 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58c181b9-bc11-4747-84ad-5302f1265507-combined-ca-bundle\") pod \"cinder-db-sync-x4n64\" (UID: \"58c181b9-bc11-4747-84ad-5302f1265507\") " pod="openstack/cinder-db-sync-x4n64" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.995738 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98efb041-1cba-4ea8-b7ae-a84bec3ed2c1-combined-ca-bundle\") pod \"keystone-bootstrap-ch2l7\" (UID: \"98efb041-1cba-4ea8-b7ae-a84bec3ed2c1\") " pod="openstack/keystone-bootstrap-ch2l7" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.995761 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/98efb041-1cba-4ea8-b7ae-a84bec3ed2c1-scripts\") pod \"keystone-bootstrap-ch2l7\" (UID: \"98efb041-1cba-4ea8-b7ae-a84bec3ed2c1\") " pod="openstack/keystone-bootstrap-ch2l7" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.995783 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/58c181b9-bc11-4747-84ad-5302f1265507-etc-machine-id\") pod \"cinder-db-sync-x4n64\" (UID: \"58c181b9-bc11-4747-84ad-5302f1265507\") " pod="openstack/cinder-db-sync-x4n64" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.995818 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f197ffa8-e0e7-4af7-8928-be49b26cb0d9-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-rjzgh\" (UID: \"f197ffa8-e0e7-4af7-8928-be49b26cb0d9\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rjzgh" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.995844 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98efb041-1cba-4ea8-b7ae-a84bec3ed2c1-config-data\") pod \"keystone-bootstrap-ch2l7\" (UID: \"98efb041-1cba-4ea8-b7ae-a84bec3ed2c1\") " pod="openstack/keystone-bootstrap-ch2l7" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.995866 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/58c181b9-bc11-4747-84ad-5302f1265507-db-sync-config-data\") pod \"cinder-db-sync-x4n64\" (UID: \"58c181b9-bc11-4747-84ad-5302f1265507\") " pod="openstack/cinder-db-sync-x4n64" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.995887 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t62km\" (UniqueName: \"kubernetes.io/projected/58c181b9-bc11-4747-84ad-5302f1265507-kube-api-access-t62km\") pod \"cinder-db-sync-x4n64\" (UID: \"58c181b9-bc11-4747-84ad-5302f1265507\") " pod="openstack/cinder-db-sync-x4n64" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.995910 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmllb\" (UniqueName: \"kubernetes.io/projected/98efb041-1cba-4ea8-b7ae-a84bec3ed2c1-kube-api-access-fmllb\") pod \"keystone-bootstrap-ch2l7\" (UID: \"98efb041-1cba-4ea8-b7ae-a84bec3ed2c1\") " pod="openstack/keystone-bootstrap-ch2l7" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.995942 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpmns\" (UniqueName: \"kubernetes.io/projected/f197ffa8-e0e7-4af7-8928-be49b26cb0d9-kube-api-access-dpmns\") pod \"dnsmasq-dns-6c9c9f998c-rjzgh\" (UID: \"f197ffa8-e0e7-4af7-8928-be49b26cb0d9\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rjzgh" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.995962 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f197ffa8-e0e7-4af7-8928-be49b26cb0d9-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-rjzgh\" (UID: \"f197ffa8-e0e7-4af7-8928-be49b26cb0d9\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rjzgh" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.995981 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/f197ffa8-e0e7-4af7-8928-be49b26cb0d9-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-rjzgh\" (UID: \"f197ffa8-e0e7-4af7-8928-be49b26cb0d9\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rjzgh" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.996001 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f197ffa8-e0e7-4af7-8928-be49b26cb0d9-config\") pod \"dnsmasq-dns-6c9c9f998c-rjzgh\" (UID: \"f197ffa8-e0e7-4af7-8928-be49b26cb0d9\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rjzgh" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.996020 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58c181b9-bc11-4747-84ad-5302f1265507-scripts\") pod \"cinder-db-sync-x4n64\" (UID: \"58c181b9-bc11-4747-84ad-5302f1265507\") " pod="openstack/cinder-db-sync-x4n64" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.996036 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f197ffa8-e0e7-4af7-8928-be49b26cb0d9-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-rjzgh\" (UID: \"f197ffa8-e0e7-4af7-8928-be49b26cb0d9\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rjzgh" Nov 27 16:59:02 crc kubenswrapper[4954]: I1127 16:59:02.996076 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/98efb041-1cba-4ea8-b7ae-a84bec3ed2c1-credential-keys\") pod \"keystone-bootstrap-ch2l7\" (UID: \"98efb041-1cba-4ea8-b7ae-a84bec3ed2c1\") " pod="openstack/keystone-bootstrap-ch2l7" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.009266 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f197ffa8-e0e7-4af7-8928-be49b26cb0d9-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-rjzgh\" (UID: \"f197ffa8-e0e7-4af7-8928-be49b26cb0d9\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rjzgh" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.009907 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f197ffa8-e0e7-4af7-8928-be49b26cb0d9-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-rjzgh\" (UID: \"f197ffa8-e0e7-4af7-8928-be49b26cb0d9\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rjzgh" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.010180 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98efb041-1cba-4ea8-b7ae-a84bec3ed2c1-scripts\") pod \"keystone-bootstrap-ch2l7\" (UID: \"98efb041-1cba-4ea8-b7ae-a84bec3ed2c1\") " pod="openstack/keystone-bootstrap-ch2l7" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.010420 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f197ffa8-e0e7-4af7-8928-be49b26cb0d9-config\") pod \"dnsmasq-dns-6c9c9f998c-rjzgh\" (UID: \"f197ffa8-e0e7-4af7-8928-be49b26cb0d9\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rjzgh" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.012635 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/98efb041-1cba-4ea8-b7ae-a84bec3ed2c1-fernet-keys\") pod \"keystone-bootstrap-ch2l7\" (UID: \"98efb041-1cba-4ea8-b7ae-a84bec3ed2c1\") " 
pod="openstack/keystone-bootstrap-ch2l7" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.013494 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f197ffa8-e0e7-4af7-8928-be49b26cb0d9-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-rjzgh\" (UID: \"f197ffa8-e0e7-4af7-8928-be49b26cb0d9\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rjzgh" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.013673 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/98efb041-1cba-4ea8-b7ae-a84bec3ed2c1-credential-keys\") pod \"keystone-bootstrap-ch2l7\" (UID: \"98efb041-1cba-4ea8-b7ae-a84bec3ed2c1\") " pod="openstack/keystone-bootstrap-ch2l7" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.020732 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98efb041-1cba-4ea8-b7ae-a84bec3ed2c1-combined-ca-bundle\") pod \"keystone-bootstrap-ch2l7\" (UID: \"98efb041-1cba-4ea8-b7ae-a84bec3ed2c1\") " pod="openstack/keystone-bootstrap-ch2l7" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.022812 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98efb041-1cba-4ea8-b7ae-a84bec3ed2c1-config-data\") pod \"keystone-bootstrap-ch2l7\" (UID: \"98efb041-1cba-4ea8-b7ae-a84bec3ed2c1\") " pod="openstack/keystone-bootstrap-ch2l7" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.023152 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f197ffa8-e0e7-4af7-8928-be49b26cb0d9-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-rjzgh\" (UID: \"f197ffa8-e0e7-4af7-8928-be49b26cb0d9\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rjzgh" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.042428 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpmns\" (UniqueName: \"kubernetes.io/projected/f197ffa8-e0e7-4af7-8928-be49b26cb0d9-kube-api-access-dpmns\") pod \"dnsmasq-dns-6c9c9f998c-rjzgh\" (UID: \"f197ffa8-e0e7-4af7-8928-be49b26cb0d9\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rjzgh" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.058337 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmllb\" (UniqueName: \"kubernetes.io/projected/98efb041-1cba-4ea8-b7ae-a84bec3ed2c1-kube-api-access-fmllb\") pod \"keystone-bootstrap-ch2l7\" (UID: \"98efb041-1cba-4ea8-b7ae-a84bec3ed2c1\") " pod="openstack/keystone-bootstrap-ch2l7" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.065798 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-hwpt7"] Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.067466 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-hwpt7" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.080603 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.083646 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.098496 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-rjzgh" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.099130 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/685e0c55-4605-4b5b-9d32-89d0e92fe52a-logs\") pod \"horizon-855db5c9c7-gpqq9\" (UID: \"685e0c55-4605-4b5b-9d32-89d0e92fe52a\") " pod="openstack/horizon-855db5c9c7-gpqq9" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.099184 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58c181b9-bc11-4747-84ad-5302f1265507-scripts\") pod \"cinder-db-sync-x4n64\" (UID: \"58c181b9-bc11-4747-84ad-5302f1265507\") " pod="openstack/cinder-db-sync-x4n64" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.099221 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/685e0c55-4605-4b5b-9d32-89d0e92fe52a-scripts\") pod \"horizon-855db5c9c7-gpqq9\" (UID: \"685e0c55-4605-4b5b-9d32-89d0e92fe52a\") " pod="openstack/horizon-855db5c9c7-gpqq9" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.099241 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dt87d\" (UniqueName: \"kubernetes.io/projected/685e0c55-4605-4b5b-9d32-89d0e92fe52a-kube-api-access-dt87d\") pod \"horizon-855db5c9c7-gpqq9\" (UID: \"685e0c55-4605-4b5b-9d32-89d0e92fe52a\") " pod="openstack/horizon-855db5c9c7-gpqq9" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.099275 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58c181b9-bc11-4747-84ad-5302f1265507-config-data\") pod \"cinder-db-sync-x4n64\" (UID: \"58c181b9-bc11-4747-84ad-5302f1265507\") " pod="openstack/cinder-db-sync-x4n64" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.099317 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58c181b9-bc11-4747-84ad-5302f1265507-combined-ca-bundle\") pod \"cinder-db-sync-x4n64\" (UID: \"58c181b9-bc11-4747-84ad-5302f1265507\") " pod="openstack/cinder-db-sync-x4n64" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.099335 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/685e0c55-4605-4b5b-9d32-89d0e92fe52a-config-data\") pod \"horizon-855db5c9c7-gpqq9\" (UID: \"685e0c55-4605-4b5b-9d32-89d0e92fe52a\") " pod="openstack/horizon-855db5c9c7-gpqq9" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.099374 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/58c181b9-bc11-4747-84ad-5302f1265507-etc-machine-id\") pod \"cinder-db-sync-x4n64\" (UID: \"58c181b9-bc11-4747-84ad-5302f1265507\") " pod="openstack/cinder-db-sync-x4n64" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.099414 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/685e0c55-4605-4b5b-9d32-89d0e92fe52a-horizon-secret-key\") pod \"horizon-855db5c9c7-gpqq9\" (UID: \"685e0c55-4605-4b5b-9d32-89d0e92fe52a\") " pod="openstack/horizon-855db5c9c7-gpqq9" Nov 27 16:59:03 crc 
kubenswrapper[4954]: I1127 16:59:03.099493 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/58c181b9-bc11-4747-84ad-5302f1265507-db-sync-config-data\") pod \"cinder-db-sync-x4n64\" (UID: \"58c181b9-bc11-4747-84ad-5302f1265507\") " pod="openstack/cinder-db-sync-x4n64" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.099536 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t62km\" (UniqueName: \"kubernetes.io/projected/58c181b9-bc11-4747-84ad-5302f1265507-kube-api-access-t62km\") pod \"cinder-db-sync-x4n64\" (UID: \"58c181b9-bc11-4747-84ad-5302f1265507\") " pod="openstack/cinder-db-sync-x4n64" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.106612 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.106801 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-5ng4f" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.107362 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.107478 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.108784 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/58c181b9-bc11-4747-84ad-5302f1265507-etc-machine-id\") pod \"cinder-db-sync-x4n64\" (UID: \"58c181b9-bc11-4747-84ad-5302f1265507\") " pod="openstack/cinder-db-sync-x4n64" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.111280 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58c181b9-bc11-4747-84ad-5302f1265507-scripts\") pod \"cinder-db-sync-x4n64\" (UID: \"58c181b9-bc11-4747-84ad-5302f1265507\") " pod="openstack/cinder-db-sync-x4n64" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.116664 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-hwpt7"] Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.117436 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-ch2l7" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.125312 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58c181b9-bc11-4747-84ad-5302f1265507-combined-ca-bundle\") pod \"cinder-db-sync-x4n64\" (UID: \"58c181b9-bc11-4747-84ad-5302f1265507\") " pod="openstack/cinder-db-sync-x4n64" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.132833 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58c181b9-bc11-4747-84ad-5302f1265507-config-data\") pod \"cinder-db-sync-x4n64\" (UID: \"58c181b9-bc11-4747-84ad-5302f1265507\") " pod="openstack/cinder-db-sync-x4n64" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.144031 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/58c181b9-bc11-4747-84ad-5302f1265507-db-sync-config-data\") pod \"cinder-db-sync-x4n64\" (UID: \"58c181b9-bc11-4747-84ad-5302f1265507\") " pod="openstack/cinder-db-sync-x4n64" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.155925 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-rjzgh"] Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.163278 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t62km\" (UniqueName: \"kubernetes.io/projected/58c181b9-bc11-4747-84ad-5302f1265507-kube-api-access-t62km\") pod \"cinder-db-sync-x4n64\" (UID: \"58c181b9-bc11-4747-84ad-5302f1265507\") " pod="openstack/cinder-db-sync-x4n64" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.201572 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwvpn\" (UniqueName: \"kubernetes.io/projected/70a1a927-b24a-4da3-93f1-9dc67f75c4ba-kube-api-access-gwvpn\") pod \"ceilometer-0\" (UID: \"70a1a927-b24a-4da3-93f1-9dc67f75c4ba\") " pod="openstack/ceilometer-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.201716 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1bce3669-a584-4f00-8043-90be729c9fa7-db-sync-config-data\") pod \"barbican-db-sync-hwpt7\" (UID: \"1bce3669-a584-4f00-8043-90be729c9fa7\") " pod="openstack/barbican-db-sync-hwpt7" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.201751 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/685e0c55-4605-4b5b-9d32-89d0e92fe52a-config-data\") pod \"horizon-855db5c9c7-gpqq9\" (UID: \"685e0c55-4605-4b5b-9d32-89d0e92fe52a\") " pod="openstack/horizon-855db5c9c7-gpqq9" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.201787 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70a1a927-b24a-4da3-93f1-9dc67f75c4ba-log-httpd\") pod \"ceilometer-0\" (UID: \"70a1a927-b24a-4da3-93f1-9dc67f75c4ba\") " pod="openstack/ceilometer-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.201859 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/685e0c55-4605-4b5b-9d32-89d0e92fe52a-horizon-secret-key\") pod \"horizon-855db5c9c7-gpqq9\" (UID: 
\"685e0c55-4605-4b5b-9d32-89d0e92fe52a\") " pod="openstack/horizon-855db5c9c7-gpqq9" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.201899 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bce3669-a584-4f00-8043-90be729c9fa7-combined-ca-bundle\") pod \"barbican-db-sync-hwpt7\" (UID: \"1bce3669-a584-4f00-8043-90be729c9fa7\") " pod="openstack/barbican-db-sync-hwpt7" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.201921 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70a1a927-b24a-4da3-93f1-9dc67f75c4ba-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"70a1a927-b24a-4da3-93f1-9dc67f75c4ba\") " pod="openstack/ceilometer-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.201945 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70a1a927-b24a-4da3-93f1-9dc67f75c4ba-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"70a1a927-b24a-4da3-93f1-9dc67f75c4ba\") " pod="openstack/ceilometer-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.201962 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnljs\" (UniqueName: \"kubernetes.io/projected/1bce3669-a584-4f00-8043-90be729c9fa7-kube-api-access-lnljs\") pod \"barbican-db-sync-hwpt7\" (UID: \"1bce3669-a584-4f00-8043-90be729c9fa7\") " pod="openstack/barbican-db-sync-hwpt7" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.201983 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/685e0c55-4605-4b5b-9d32-89d0e92fe52a-logs\") pod \"horizon-855db5c9c7-gpqq9\" (UID: \"685e0c55-4605-4b5b-9d32-89d0e92fe52a\") " pod="openstack/horizon-855db5c9c7-gpqq9" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.202005 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70a1a927-b24a-4da3-93f1-9dc67f75c4ba-scripts\") pod \"ceilometer-0\" (UID: \"70a1a927-b24a-4da3-93f1-9dc67f75c4ba\") " pod="openstack/ceilometer-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.202034 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/685e0c55-4605-4b5b-9d32-89d0e92fe52a-scripts\") pod \"horizon-855db5c9c7-gpqq9\" (UID: \"685e0c55-4605-4b5b-9d32-89d0e92fe52a\") " pod="openstack/horizon-855db5c9c7-gpqq9" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.202053 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dt87d\" (UniqueName: \"kubernetes.io/projected/685e0c55-4605-4b5b-9d32-89d0e92fe52a-kube-api-access-dt87d\") pod \"horizon-855db5c9c7-gpqq9\" (UID: \"685e0c55-4605-4b5b-9d32-89d0e92fe52a\") " pod="openstack/horizon-855db5c9c7-gpqq9" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.202076 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70a1a927-b24a-4da3-93f1-9dc67f75c4ba-config-data\") pod \"ceilometer-0\" (UID: \"70a1a927-b24a-4da3-93f1-9dc67f75c4ba\") " pod="openstack/ceilometer-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 
16:59:03.202095 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70a1a927-b24a-4da3-93f1-9dc67f75c4ba-run-httpd\") pod \"ceilometer-0\" (UID: \"70a1a927-b24a-4da3-93f1-9dc67f75c4ba\") " pod="openstack/ceilometer-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.203427 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/685e0c55-4605-4b5b-9d32-89d0e92fe52a-config-data\") pod \"horizon-855db5c9c7-gpqq9\" (UID: \"685e0c55-4605-4b5b-9d32-89d0e92fe52a\") " pod="openstack/horizon-855db5c9c7-gpqq9" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.204314 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/685e0c55-4605-4b5b-9d32-89d0e92fe52a-logs\") pod \"horizon-855db5c9c7-gpqq9\" (UID: \"685e0c55-4605-4b5b-9d32-89d0e92fe52a\") " pod="openstack/horizon-855db5c9c7-gpqq9" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.205301 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/685e0c55-4605-4b5b-9d32-89d0e92fe52a-scripts\") pod \"horizon-855db5c9c7-gpqq9\" (UID: \"685e0c55-4605-4b5b-9d32-89d0e92fe52a\") " pod="openstack/horizon-855db5c9c7-gpqq9" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.212321 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/685e0c55-4605-4b5b-9d32-89d0e92fe52a-horizon-secret-key\") pod \"horizon-855db5c9c7-gpqq9\" (UID: \"685e0c55-4605-4b5b-9d32-89d0e92fe52a\") " pod="openstack/horizon-855db5c9c7-gpqq9" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.212669 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.231128 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-5zrkl"] Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.243506 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-x4n64" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.250893 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dt87d\" (UniqueName: \"kubernetes.io/projected/685e0c55-4605-4b5b-9d32-89d0e92fe52a-kube-api-access-dt87d\") pod \"horizon-855db5c9c7-gpqq9\" (UID: \"685e0c55-4605-4b5b-9d32-89d0e92fe52a\") " pod="openstack/horizon-855db5c9c7-gpqq9" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.254290 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-5zrkl" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.275050 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-855db5c9c7-gpqq9" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.301041 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-6vl85"] Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.302569 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-6vl85" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.303492 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70a1a927-b24a-4da3-93f1-9dc67f75c4ba-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"70a1a927-b24a-4da3-93f1-9dc67f75c4ba\") " pod="openstack/ceilometer-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.303532 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnljs\" (UniqueName: \"kubernetes.io/projected/1bce3669-a584-4f00-8043-90be729c9fa7-kube-api-access-lnljs\") pod \"barbican-db-sync-hwpt7\" (UID: \"1bce3669-a584-4f00-8043-90be729c9fa7\") " pod="openstack/barbican-db-sync-hwpt7" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.303562 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70a1a927-b24a-4da3-93f1-9dc67f75c4ba-scripts\") pod \"ceilometer-0\" (UID: \"70a1a927-b24a-4da3-93f1-9dc67f75c4ba\") " pod="openstack/ceilometer-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.303613 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49b7b3ea-3919-4d95-9fc8-138aef12ee08-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-5zrkl\" (UID: \"49b7b3ea-3919-4d95-9fc8-138aef12ee08\") " pod="openstack/dnsmasq-dns-57c957c4ff-5zrkl" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.303640 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49b7b3ea-3919-4d95-9fc8-138aef12ee08-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-5zrkl\" (UID: \"49b7b3ea-3919-4d95-9fc8-138aef12ee08\") " pod="openstack/dnsmasq-dns-57c957c4ff-5zrkl" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.303662 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70a1a927-b24a-4da3-93f1-9dc67f75c4ba-config-data\") pod \"ceilometer-0\" (UID: \"70a1a927-b24a-4da3-93f1-9dc67f75c4ba\") " pod="openstack/ceilometer-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.303684 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49b7b3ea-3919-4d95-9fc8-138aef12ee08-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-5zrkl\" (UID: \"49b7b3ea-3919-4d95-9fc8-138aef12ee08\") " pod="openstack/dnsmasq-dns-57c957c4ff-5zrkl" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.303706 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70a1a927-b24a-4da3-93f1-9dc67f75c4ba-run-httpd\") pod \"ceilometer-0\" (UID: \"70a1a927-b24a-4da3-93f1-9dc67f75c4ba\") " pod="openstack/ceilometer-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.303758 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/49b7b3ea-3919-4d95-9fc8-138aef12ee08-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-5zrkl\" (UID: \"49b7b3ea-3919-4d95-9fc8-138aef12ee08\") " pod="openstack/dnsmasq-dns-57c957c4ff-5zrkl" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.303787 4954 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwvpn\" (UniqueName: \"kubernetes.io/projected/70a1a927-b24a-4da3-93f1-9dc67f75c4ba-kube-api-access-gwvpn\") pod \"ceilometer-0\" (UID: \"70a1a927-b24a-4da3-93f1-9dc67f75c4ba\") " pod="openstack/ceilometer-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.303802 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1bce3669-a584-4f00-8043-90be729c9fa7-db-sync-config-data\") pod \"barbican-db-sync-hwpt7\" (UID: \"1bce3669-a584-4f00-8043-90be729c9fa7\") " pod="openstack/barbican-db-sync-hwpt7" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.303825 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70a1a927-b24a-4da3-93f1-9dc67f75c4ba-log-httpd\") pod \"ceilometer-0\" (UID: \"70a1a927-b24a-4da3-93f1-9dc67f75c4ba\") " pod="openstack/ceilometer-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.303861 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49b7b3ea-3919-4d95-9fc8-138aef12ee08-config\") pod \"dnsmasq-dns-57c957c4ff-5zrkl\" (UID: \"49b7b3ea-3919-4d95-9fc8-138aef12ee08\") " pod="openstack/dnsmasq-dns-57c957c4ff-5zrkl" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.303881 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f9pp\" (UniqueName: \"kubernetes.io/projected/49b7b3ea-3919-4d95-9fc8-138aef12ee08-kube-api-access-7f9pp\") pod \"dnsmasq-dns-57c957c4ff-5zrkl\" (UID: \"49b7b3ea-3919-4d95-9fc8-138aef12ee08\") " pod="openstack/dnsmasq-dns-57c957c4ff-5zrkl" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.303926 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bce3669-a584-4f00-8043-90be729c9fa7-combined-ca-bundle\") pod \"barbican-db-sync-hwpt7\" (UID: \"1bce3669-a584-4f00-8043-90be729c9fa7\") " pod="openstack/barbican-db-sync-hwpt7" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.303949 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70a1a927-b24a-4da3-93f1-9dc67f75c4ba-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"70a1a927-b24a-4da3-93f1-9dc67f75c4ba\") " pod="openstack/ceilometer-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.314972 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70a1a927-b24a-4da3-93f1-9dc67f75c4ba-run-httpd\") pod \"ceilometer-0\" (UID: \"70a1a927-b24a-4da3-93f1-9dc67f75c4ba\") " pod="openstack/ceilometer-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.315220 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70a1a927-b24a-4da3-93f1-9dc67f75c4ba-log-httpd\") pod \"ceilometer-0\" (UID: \"70a1a927-b24a-4da3-93f1-9dc67f75c4ba\") " pod="openstack/ceilometer-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.320651 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70a1a927-b24a-4da3-93f1-9dc67f75c4ba-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"70a1a927-b24a-4da3-93f1-9dc67f75c4ba\") " pod="openstack/ceilometer-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.321254 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70a1a927-b24a-4da3-93f1-9dc67f75c4ba-config-data\") pod \"ceilometer-0\" (UID: \"70a1a927-b24a-4da3-93f1-9dc67f75c4ba\") " pod="openstack/ceilometer-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.323137 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1bce3669-a584-4f00-8043-90be729c9fa7-db-sync-config-data\") pod \"barbican-db-sync-hwpt7\" (UID: \"1bce3669-a584-4f00-8043-90be729c9fa7\") " pod="openstack/barbican-db-sync-hwpt7" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.333561 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bce3669-a584-4f00-8043-90be729c9fa7-combined-ca-bundle\") pod \"barbican-db-sync-hwpt7\" (UID: \"1bce3669-a584-4f00-8043-90be729c9fa7\") " pod="openstack/barbican-db-sync-hwpt7" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.334442 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-nlnhq" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.338077 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.338324 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-5zrkl"] Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.338871 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70a1a927-b24a-4da3-93f1-9dc67f75c4ba-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"70a1a927-b24a-4da3-93f1-9dc67f75c4ba\") " pod="openstack/ceilometer-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.343290 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70a1a927-b24a-4da3-93f1-9dc67f75c4ba-scripts\") pod \"ceilometer-0\" (UID: \"70a1a927-b24a-4da3-93f1-9dc67f75c4ba\") " pod="openstack/ceilometer-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.357309 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.386284 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnljs\" (UniqueName: \"kubernetes.io/projected/1bce3669-a584-4f00-8043-90be729c9fa7-kube-api-access-lnljs\") pod \"barbican-db-sync-hwpt7\" (UID: \"1bce3669-a584-4f00-8043-90be729c9fa7\") " pod="openstack/barbican-db-sync-hwpt7" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.402554 4954 generic.go:334] "Generic (PLEG): container finished" podID="da5fcc21-2130-46ca-ab19-fe735802b2af" containerID="d2d93c8a63c3684d94e550606598dfed2d64a0344ff48a6e5daab78661cb6bd6" exitCode=0 Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.402642 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-2kfhn" event={"ID":"da5fcc21-2130-46ca-ab19-fe735802b2af","Type":"ContainerDied","Data":"d2d93c8a63c3684d94e550606598dfed2d64a0344ff48a6e5daab78661cb6bd6"} Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.411199 4954 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49b7b3ea-3919-4d95-9fc8-138aef12ee08-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-5zrkl\" (UID: \"49b7b3ea-3919-4d95-9fc8-138aef12ee08\") " pod="openstack/dnsmasq-dns-57c957c4ff-5zrkl" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.411285 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49b7b3ea-3919-4d95-9fc8-138aef12ee08-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-5zrkl\" (UID: \"49b7b3ea-3919-4d95-9fc8-138aef12ee08\") " pod="openstack/dnsmasq-dns-57c957c4ff-5zrkl" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.411336 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49b7b3ea-3919-4d95-9fc8-138aef12ee08-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-5zrkl\" (UID: \"49b7b3ea-3919-4d95-9fc8-138aef12ee08\") " pod="openstack/dnsmasq-dns-57c957c4ff-5zrkl" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.411425 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/49b7b3ea-3919-4d95-9fc8-138aef12ee08-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-5zrkl\" (UID: \"49b7b3ea-3919-4d95-9fc8-138aef12ee08\") " pod="openstack/dnsmasq-dns-57c957c4ff-5zrkl" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.411515 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49b7b3ea-3919-4d95-9fc8-138aef12ee08-config\") pod \"dnsmasq-dns-57c957c4ff-5zrkl\" (UID: \"49b7b3ea-3919-4d95-9fc8-138aef12ee08\") " pod="openstack/dnsmasq-dns-57c957c4ff-5zrkl" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.411536 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f9pp\" (UniqueName: \"kubernetes.io/projected/49b7b3ea-3919-4d95-9fc8-138aef12ee08-kube-api-access-7f9pp\") pod \"dnsmasq-dns-57c957c4ff-5zrkl\" (UID: \"49b7b3ea-3919-4d95-9fc8-138aef12ee08\") " pod="openstack/dnsmasq-dns-57c957c4ff-5zrkl" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.411604 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0123682b-b80c-436f-bf07-6252dc3df9bc-scripts\") pod \"placement-db-sync-6vl85\" (UID: \"0123682b-b80c-436f-bf07-6252dc3df9bc\") " pod="openstack/placement-db-sync-6vl85" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.411625 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0123682b-b80c-436f-bf07-6252dc3df9bc-combined-ca-bundle\") pod \"placement-db-sync-6vl85\" (UID: \"0123682b-b80c-436f-bf07-6252dc3df9bc\") " pod="openstack/placement-db-sync-6vl85" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.411693 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gmpt\" (UniqueName: \"kubernetes.io/projected/0123682b-b80c-436f-bf07-6252dc3df9bc-kube-api-access-5gmpt\") pod \"placement-db-sync-6vl85\" (UID: \"0123682b-b80c-436f-bf07-6252dc3df9bc\") " pod="openstack/placement-db-sync-6vl85" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.411719 4954 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0123682b-b80c-436f-bf07-6252dc3df9bc-logs\") pod \"placement-db-sync-6vl85\" (UID: \"0123682b-b80c-436f-bf07-6252dc3df9bc\") " pod="openstack/placement-db-sync-6vl85" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.412071 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0123682b-b80c-436f-bf07-6252dc3df9bc-config-data\") pod \"placement-db-sync-6vl85\" (UID: \"0123682b-b80c-436f-bf07-6252dc3df9bc\") " pod="openstack/placement-db-sync-6vl85" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.416386 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49b7b3ea-3919-4d95-9fc8-138aef12ee08-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-5zrkl\" (UID: \"49b7b3ea-3919-4d95-9fc8-138aef12ee08\") " pod="openstack/dnsmasq-dns-57c957c4ff-5zrkl" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.418043 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49b7b3ea-3919-4d95-9fc8-138aef12ee08-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-5zrkl\" (UID: \"49b7b3ea-3919-4d95-9fc8-138aef12ee08\") " pod="openstack/dnsmasq-dns-57c957c4ff-5zrkl" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.418804 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49b7b3ea-3919-4d95-9fc8-138aef12ee08-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-5zrkl\" (UID: \"49b7b3ea-3919-4d95-9fc8-138aef12ee08\") " pod="openstack/dnsmasq-dns-57c957c4ff-5zrkl" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.420083 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-6vl85"] Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.424500 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49b7b3ea-3919-4d95-9fc8-138aef12ee08-config\") pod \"dnsmasq-dns-57c957c4ff-5zrkl\" (UID: \"49b7b3ea-3919-4d95-9fc8-138aef12ee08\") " pod="openstack/dnsmasq-dns-57c957c4ff-5zrkl" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.424621 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/49b7b3ea-3919-4d95-9fc8-138aef12ee08-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-5zrkl\" (UID: \"49b7b3ea-3919-4d95-9fc8-138aef12ee08\") " pod="openstack/dnsmasq-dns-57c957c4ff-5zrkl" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.446816 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwvpn\" (UniqueName: \"kubernetes.io/projected/70a1a927-b24a-4da3-93f1-9dc67f75c4ba-kube-api-access-gwvpn\") pod \"ceilometer-0\" (UID: \"70a1a927-b24a-4da3-93f1-9dc67f75c4ba\") " pod="openstack/ceilometer-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.482446 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f9pp\" (UniqueName: \"kubernetes.io/projected/49b7b3ea-3919-4d95-9fc8-138aef12ee08-kube-api-access-7f9pp\") pod \"dnsmasq-dns-57c957c4ff-5zrkl\" (UID: \"49b7b3ea-3919-4d95-9fc8-138aef12ee08\") " pod="openstack/dnsmasq-dns-57c957c4ff-5zrkl" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.513813 4954 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0123682b-b80c-436f-bf07-6252dc3df9bc-scripts\") pod \"placement-db-sync-6vl85\" (UID: \"0123682b-b80c-436f-bf07-6252dc3df9bc\") " pod="openstack/placement-db-sync-6vl85" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.513873 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0123682b-b80c-436f-bf07-6252dc3df9bc-combined-ca-bundle\") pod \"placement-db-sync-6vl85\" (UID: \"0123682b-b80c-436f-bf07-6252dc3df9bc\") " pod="openstack/placement-db-sync-6vl85" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.513914 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gmpt\" (UniqueName: \"kubernetes.io/projected/0123682b-b80c-436f-bf07-6252dc3df9bc-kube-api-access-5gmpt\") pod \"placement-db-sync-6vl85\" (UID: \"0123682b-b80c-436f-bf07-6252dc3df9bc\") " pod="openstack/placement-db-sync-6vl85" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.513945 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0123682b-b80c-436f-bf07-6252dc3df9bc-logs\") pod \"placement-db-sync-6vl85\" (UID: \"0123682b-b80c-436f-bf07-6252dc3df9bc\") " pod="openstack/placement-db-sync-6vl85" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.513983 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0123682b-b80c-436f-bf07-6252dc3df9bc-config-data\") pod \"placement-db-sync-6vl85\" (UID: \"0123682b-b80c-436f-bf07-6252dc3df9bc\") " pod="openstack/placement-db-sync-6vl85" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.515056 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0123682b-b80c-436f-bf07-6252dc3df9bc-logs\") pod \"placement-db-sync-6vl85\" (UID: \"0123682b-b80c-436f-bf07-6252dc3df9bc\") " pod="openstack/placement-db-sync-6vl85" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.520962 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0123682b-b80c-436f-bf07-6252dc3df9bc-scripts\") pod \"placement-db-sync-6vl85\" (UID: \"0123682b-b80c-436f-bf07-6252dc3df9bc\") " pod="openstack/placement-db-sync-6vl85" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.522292 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0123682b-b80c-436f-bf07-6252dc3df9bc-combined-ca-bundle\") pod \"placement-db-sync-6vl85\" (UID: \"0123682b-b80c-436f-bf07-6252dc3df9bc\") " pod="openstack/placement-db-sync-6vl85" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.535645 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-fcrnt"] Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.546454 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-fcrnt" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.547291 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-hwpt7" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.556389 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-7mmjt" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.556604 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.556744 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.563375 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.584145 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-5zrkl" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.589726 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0123682b-b80c-436f-bf07-6252dc3df9bc-config-data\") pod \"placement-db-sync-6vl85\" (UID: \"0123682b-b80c-436f-bf07-6252dc3df9bc\") " pod="openstack/placement-db-sync-6vl85" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.596468 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gmpt\" (UniqueName: \"kubernetes.io/projected/0123682b-b80c-436f-bf07-6252dc3df9bc-kube-api-access-5gmpt\") pod \"placement-db-sync-6vl85\" (UID: \"0123682b-b80c-436f-bf07-6252dc3df9bc\") " pod="openstack/placement-db-sync-6vl85" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.616089 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/50892b2e-4e6f-4794-bb8d-e649a9b223fc-config\") pod \"neutron-db-sync-fcrnt\" (UID: \"50892b2e-4e6f-4794-bb8d-e649a9b223fc\") " pod="openstack/neutron-db-sync-fcrnt" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.616762 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vrc5\" (UniqueName: \"kubernetes.io/projected/50892b2e-4e6f-4794-bb8d-e649a9b223fc-kube-api-access-8vrc5\") pod \"neutron-db-sync-fcrnt\" (UID: \"50892b2e-4e6f-4794-bb8d-e649a9b223fc\") " pod="openstack/neutron-db-sync-fcrnt" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.616821 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50892b2e-4e6f-4794-bb8d-e649a9b223fc-combined-ca-bundle\") pod \"neutron-db-sync-fcrnt\" (UID: \"50892b2e-4e6f-4794-bb8d-e649a9b223fc\") " pod="openstack/neutron-db-sync-fcrnt" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.633680 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-fcrnt"] Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.646182 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.648120 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.654111 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.658246 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.658438 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.658542 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.688400 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-fxdj5" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.690967 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-6vl85" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.719394 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7755474f4f-2m4z8"] Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.727175 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/50892b2e-4e6f-4794-bb8d-e649a9b223fc-config\") pod \"neutron-db-sync-fcrnt\" (UID: \"50892b2e-4e6f-4794-bb8d-e649a9b223fc\") " pod="openstack/neutron-db-sync-fcrnt" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.731111 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99048c42-ed0d-4cb5-9dce-927cb0d99722-config-data\") pod \"glance-default-external-api-0\" (UID: \"99048c42-ed0d-4cb5-9dce-927cb0d99722\") " pod="openstack/glance-default-external-api-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.731992 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/99048c42-ed0d-4cb5-9dce-927cb0d99722-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"99048c42-ed0d-4cb5-9dce-927cb0d99722\") " pod="openstack/glance-default-external-api-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.732183 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99048c42-ed0d-4cb5-9dce-927cb0d99722-scripts\") pod \"glance-default-external-api-0\" (UID: \"99048c42-ed0d-4cb5-9dce-927cb0d99722\") " pod="openstack/glance-default-external-api-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.732337 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vrc5\" (UniqueName: \"kubernetes.io/projected/50892b2e-4e6f-4794-bb8d-e649a9b223fc-kube-api-access-8vrc5\") pod \"neutron-db-sync-fcrnt\" (UID: \"50892b2e-4e6f-4794-bb8d-e649a9b223fc\") " pod="openstack/neutron-db-sync-fcrnt" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.732486 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99048c42-ed0d-4cb5-9dce-927cb0d99722-logs\") pod \"glance-default-external-api-0\" (UID: \"99048c42-ed0d-4cb5-9dce-927cb0d99722\") " 
pod="openstack/glance-default-external-api-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.732701 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn8zt\" (UniqueName: \"kubernetes.io/projected/99048c42-ed0d-4cb5-9dce-927cb0d99722-kube-api-access-tn8zt\") pod \"glance-default-external-api-0\" (UID: \"99048c42-ed0d-4cb5-9dce-927cb0d99722\") " pod="openstack/glance-default-external-api-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.733112 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"99048c42-ed0d-4cb5-9dce-927cb0d99722\") " pod="openstack/glance-default-external-api-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.733272 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50892b2e-4e6f-4794-bb8d-e649a9b223fc-combined-ca-bundle\") pod \"neutron-db-sync-fcrnt\" (UID: \"50892b2e-4e6f-4794-bb8d-e649a9b223fc\") " pod="openstack/neutron-db-sync-fcrnt" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.734053 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99048c42-ed0d-4cb5-9dce-927cb0d99722-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"99048c42-ed0d-4cb5-9dce-927cb0d99722\") " pod="openstack/glance-default-external-api-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.734231 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/99048c42-ed0d-4cb5-9dce-927cb0d99722-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"99048c42-ed0d-4cb5-9dce-927cb0d99722\") " pod="openstack/glance-default-external-api-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.751521 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7755474f4f-2m4z8"] Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.751689 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7755474f4f-2m4z8" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.753418 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50892b2e-4e6f-4794-bb8d-e649a9b223fc-combined-ca-bundle\") pod \"neutron-db-sync-fcrnt\" (UID: \"50892b2e-4e6f-4794-bb8d-e649a9b223fc\") " pod="openstack/neutron-db-sync-fcrnt" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.753438 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/50892b2e-4e6f-4794-bb8d-e649a9b223fc-config\") pod \"neutron-db-sync-fcrnt\" (UID: \"50892b2e-4e6f-4794-bb8d-e649a9b223fc\") " pod="openstack/neutron-db-sync-fcrnt" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.798144 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vrc5\" (UniqueName: \"kubernetes.io/projected/50892b2e-4e6f-4794-bb8d-e649a9b223fc-kube-api-access-8vrc5\") pod \"neutron-db-sync-fcrnt\" (UID: \"50892b2e-4e6f-4794-bb8d-e649a9b223fc\") " pod="openstack/neutron-db-sync-fcrnt" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.836759 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99048c42-ed0d-4cb5-9dce-927cb0d99722-config-data\") pod \"glance-default-external-api-0\" (UID: \"99048c42-ed0d-4cb5-9dce-927cb0d99722\") " pod="openstack/glance-default-external-api-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.836812 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/99048c42-ed0d-4cb5-9dce-927cb0d99722-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"99048c42-ed0d-4cb5-9dce-927cb0d99722\") " pod="openstack/glance-default-external-api-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.836835 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99048c42-ed0d-4cb5-9dce-927cb0d99722-scripts\") pod \"glance-default-external-api-0\" (UID: \"99048c42-ed0d-4cb5-9dce-927cb0d99722\") " pod="openstack/glance-default-external-api-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.836851 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99048c42-ed0d-4cb5-9dce-927cb0d99722-logs\") pod \"glance-default-external-api-0\" (UID: \"99048c42-ed0d-4cb5-9dce-927cb0d99722\") " pod="openstack/glance-default-external-api-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.836874 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn8zt\" (UniqueName: \"kubernetes.io/projected/99048c42-ed0d-4cb5-9dce-927cb0d99722-kube-api-access-tn8zt\") pod \"glance-default-external-api-0\" (UID: \"99048c42-ed0d-4cb5-9dce-927cb0d99722\") " pod="openstack/glance-default-external-api-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.836920 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/774fb5a2-9809-4297-9ad1-f68e130747bd-horizon-secret-key\") pod \"horizon-7755474f4f-2m4z8\" (UID: \"774fb5a2-9809-4297-9ad1-f68e130747bd\") " pod="openstack/horizon-7755474f4f-2m4z8" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 
16:59:03.836941 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"99048c42-ed0d-4cb5-9dce-927cb0d99722\") " pod="openstack/glance-default-external-api-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.836968 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/774fb5a2-9809-4297-9ad1-f68e130747bd-scripts\") pod \"horizon-7755474f4f-2m4z8\" (UID: \"774fb5a2-9809-4297-9ad1-f68e130747bd\") " pod="openstack/horizon-7755474f4f-2m4z8" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.836988 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pwdc\" (UniqueName: \"kubernetes.io/projected/774fb5a2-9809-4297-9ad1-f68e130747bd-kube-api-access-8pwdc\") pod \"horizon-7755474f4f-2m4z8\" (UID: \"774fb5a2-9809-4297-9ad1-f68e130747bd\") " pod="openstack/horizon-7755474f4f-2m4z8" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.837023 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99048c42-ed0d-4cb5-9dce-927cb0d99722-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"99048c42-ed0d-4cb5-9dce-927cb0d99722\") " pod="openstack/glance-default-external-api-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.837055 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/99048c42-ed0d-4cb5-9dce-927cb0d99722-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"99048c42-ed0d-4cb5-9dce-927cb0d99722\") " pod="openstack/glance-default-external-api-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.837097 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/774fb5a2-9809-4297-9ad1-f68e130747bd-config-data\") pod \"horizon-7755474f4f-2m4z8\" (UID: \"774fb5a2-9809-4297-9ad1-f68e130747bd\") " pod="openstack/horizon-7755474f4f-2m4z8" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.837120 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/774fb5a2-9809-4297-9ad1-f68e130747bd-logs\") pod \"horizon-7755474f4f-2m4z8\" (UID: \"774fb5a2-9809-4297-9ad1-f68e130747bd\") " pod="openstack/horizon-7755474f4f-2m4z8" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.837673 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"99048c42-ed0d-4cb5-9dce-927cb0d99722\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.839717 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.842250 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.846904 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99048c42-ed0d-4cb5-9dce-927cb0d99722-config-data\") pod \"glance-default-external-api-0\" (UID: \"99048c42-ed0d-4cb5-9dce-927cb0d99722\") " pod="openstack/glance-default-external-api-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.854814 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/99048c42-ed0d-4cb5-9dce-927cb0d99722-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"99048c42-ed0d-4cb5-9dce-927cb0d99722\") " pod="openstack/glance-default-external-api-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.855328 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99048c42-ed0d-4cb5-9dce-927cb0d99722-logs\") pod \"glance-default-external-api-0\" (UID: \"99048c42-ed0d-4cb5-9dce-927cb0d99722\") " pod="openstack/glance-default-external-api-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.856775 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99048c42-ed0d-4cb5-9dce-927cb0d99722-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"99048c42-ed0d-4cb5-9dce-927cb0d99722\") " pod="openstack/glance-default-external-api-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.858470 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.858832 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.858925 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/99048c42-ed0d-4cb5-9dce-927cb0d99722-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"99048c42-ed0d-4cb5-9dce-927cb0d99722\") " pod="openstack/glance-default-external-api-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.862545 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99048c42-ed0d-4cb5-9dce-927cb0d99722-scripts\") pod \"glance-default-external-api-0\" (UID: \"99048c42-ed0d-4cb5-9dce-927cb0d99722\") " pod="openstack/glance-default-external-api-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.902735 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-fcrnt" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.904993 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"99048c42-ed0d-4cb5-9dce-927cb0d99722\") " pod="openstack/glance-default-external-api-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.917140 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn8zt\" (UniqueName: \"kubernetes.io/projected/99048c42-ed0d-4cb5-9dce-927cb0d99722-kube-api-access-tn8zt\") pod \"glance-default-external-api-0\" (UID: \"99048c42-ed0d-4cb5-9dce-927cb0d99722\") " pod="openstack/glance-default-external-api-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.918228 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.938770 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/774fb5a2-9809-4297-9ad1-f68e130747bd-horizon-secret-key\") pod \"horizon-7755474f4f-2m4z8\" (UID: \"774fb5a2-9809-4297-9ad1-f68e130747bd\") " pod="openstack/horizon-7755474f4f-2m4z8" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.938821 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/774fb5a2-9809-4297-9ad1-f68e130747bd-scripts\") pod \"horizon-7755474f4f-2m4z8\" (UID: \"774fb5a2-9809-4297-9ad1-f68e130747bd\") " pod="openstack/horizon-7755474f4f-2m4z8" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.938843 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pwdc\" (UniqueName: \"kubernetes.io/projected/774fb5a2-9809-4297-9ad1-f68e130747bd-kube-api-access-8pwdc\") pod \"horizon-7755474f4f-2m4z8\" (UID: \"774fb5a2-9809-4297-9ad1-f68e130747bd\") " pod="openstack/horizon-7755474f4f-2m4z8" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.938884 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"7746fa7e-51b6-4a24-bd44-f455e06d7b79\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.938907 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7746fa7e-51b6-4a24-bd44-f455e06d7b79-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7746fa7e-51b6-4a24-bd44-f455e06d7b79\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.938932 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7746fa7e-51b6-4a24-bd44-f455e06d7b79-logs\") pod \"glance-default-internal-api-0\" (UID: \"7746fa7e-51b6-4a24-bd44-f455e06d7b79\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.938951 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/774fb5a2-9809-4297-9ad1-f68e130747bd-config-data\") pod \"horizon-7755474f4f-2m4z8\" (UID: \"774fb5a2-9809-4297-9ad1-f68e130747bd\") " pod="openstack/horizon-7755474f4f-2m4z8" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.938972 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7746fa7e-51b6-4a24-bd44-f455e06d7b79-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7746fa7e-51b6-4a24-bd44-f455e06d7b79\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.938994 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/774fb5a2-9809-4297-9ad1-f68e130747bd-logs\") pod \"horizon-7755474f4f-2m4z8\" (UID: \"774fb5a2-9809-4297-9ad1-f68e130747bd\") " pod="openstack/horizon-7755474f4f-2m4z8" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.939014 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7746fa7e-51b6-4a24-bd44-f455e06d7b79-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7746fa7e-51b6-4a24-bd44-f455e06d7b79\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.939063 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7746fa7e-51b6-4a24-bd44-f455e06d7b79-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7746fa7e-51b6-4a24-bd44-f455e06d7b79\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.942167 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7746fa7e-51b6-4a24-bd44-f455e06d7b79-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7746fa7e-51b6-4a24-bd44-f455e06d7b79\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.942217 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42nlj\" (UniqueName: \"kubernetes.io/projected/7746fa7e-51b6-4a24-bd44-f455e06d7b79-kube-api-access-42nlj\") pod \"glance-default-internal-api-0\" (UID: \"7746fa7e-51b6-4a24-bd44-f455e06d7b79\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.942952 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/774fb5a2-9809-4297-9ad1-f68e130747bd-scripts\") pod \"horizon-7755474f4f-2m4z8\" (UID: \"774fb5a2-9809-4297-9ad1-f68e130747bd\") " pod="openstack/horizon-7755474f4f-2m4z8" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.944417 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/774fb5a2-9809-4297-9ad1-f68e130747bd-config-data\") pod \"horizon-7755474f4f-2m4z8\" (UID: \"774fb5a2-9809-4297-9ad1-f68e130747bd\") " pod="openstack/horizon-7755474f4f-2m4z8" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.944670 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/774fb5a2-9809-4297-9ad1-f68e130747bd-logs\") pod \"horizon-7755474f4f-2m4z8\" (UID: \"774fb5a2-9809-4297-9ad1-f68e130747bd\") " pod="openstack/horizon-7755474f4f-2m4z8" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.964317 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/774fb5a2-9809-4297-9ad1-f68e130747bd-horizon-secret-key\") pod \"horizon-7755474f4f-2m4z8\" (UID: \"774fb5a2-9809-4297-9ad1-f68e130747bd\") " pod="openstack/horizon-7755474f4f-2m4z8" Nov 27 16:59:03 crc kubenswrapper[4954]: I1127 16:59:03.981275 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pwdc\" (UniqueName: \"kubernetes.io/projected/774fb5a2-9809-4297-9ad1-f68e130747bd-kube-api-access-8pwdc\") pod \"horizon-7755474f4f-2m4z8\" (UID: \"774fb5a2-9809-4297-9ad1-f68e130747bd\") " pod="openstack/horizon-7755474f4f-2m4z8" Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.028378 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ch2l7"] Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.037272 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-rjzgh"] Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.043335 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7746fa7e-51b6-4a24-bd44-f455e06d7b79-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7746fa7e-51b6-4a24-bd44-f455e06d7b79\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.043394 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7746fa7e-51b6-4a24-bd44-f455e06d7b79-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7746fa7e-51b6-4a24-bd44-f455e06d7b79\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.043432 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42nlj\" (UniqueName: \"kubernetes.io/projected/7746fa7e-51b6-4a24-bd44-f455e06d7b79-kube-api-access-42nlj\") pod \"glance-default-internal-api-0\" (UID: \"7746fa7e-51b6-4a24-bd44-f455e06d7b79\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.043485 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"7746fa7e-51b6-4a24-bd44-f455e06d7b79\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.043515 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7746fa7e-51b6-4a24-bd44-f455e06d7b79-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7746fa7e-51b6-4a24-bd44-f455e06d7b79\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.043560 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7746fa7e-51b6-4a24-bd44-f455e06d7b79-logs\") pod \"glance-default-internal-api-0\" (UID: \"7746fa7e-51b6-4a24-bd44-f455e06d7b79\") " 
pod="openstack/glance-default-internal-api-0" Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.043741 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7746fa7e-51b6-4a24-bd44-f455e06d7b79-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7746fa7e-51b6-4a24-bd44-f455e06d7b79\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.043815 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7746fa7e-51b6-4a24-bd44-f455e06d7b79-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7746fa7e-51b6-4a24-bd44-f455e06d7b79\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.044959 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"7746fa7e-51b6-4a24-bd44-f455e06d7b79\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.046935 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7746fa7e-51b6-4a24-bd44-f455e06d7b79-logs\") pod \"glance-default-internal-api-0\" (UID: \"7746fa7e-51b6-4a24-bd44-f455e06d7b79\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.046987 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7746fa7e-51b6-4a24-bd44-f455e06d7b79-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7746fa7e-51b6-4a24-bd44-f455e06d7b79\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.050320 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7746fa7e-51b6-4a24-bd44-f455e06d7b79-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7746fa7e-51b6-4a24-bd44-f455e06d7b79\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.055423 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7746fa7e-51b6-4a24-bd44-f455e06d7b79-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7746fa7e-51b6-4a24-bd44-f455e06d7b79\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.061660 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7746fa7e-51b6-4a24-bd44-f455e06d7b79-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7746fa7e-51b6-4a24-bd44-f455e06d7b79\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.070112 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42nlj\" (UniqueName: \"kubernetes.io/projected/7746fa7e-51b6-4a24-bd44-f455e06d7b79-kube-api-access-42nlj\") pod \"glance-default-internal-api-0\" (UID: \"7746fa7e-51b6-4a24-bd44-f455e06d7b79\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 
16:59:04.083146 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7746fa7e-51b6-4a24-bd44-f455e06d7b79-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7746fa7e-51b6-4a24-bd44-f455e06d7b79\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.097926 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"7746fa7e-51b6-4a24-bd44-f455e06d7b79\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.101835 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.123291 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7755474f4f-2m4z8" Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.177444 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.190828 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-2kfhn" Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.353480 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da5fcc21-2130-46ca-ab19-fe735802b2af-config\") pod \"da5fcc21-2130-46ca-ab19-fe735802b2af\" (UID: \"da5fcc21-2130-46ca-ab19-fe735802b2af\") " Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.354059 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da5fcc21-2130-46ca-ab19-fe735802b2af-dns-svc\") pod \"da5fcc21-2130-46ca-ab19-fe735802b2af\" (UID: \"da5fcc21-2130-46ca-ab19-fe735802b2af\") " Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.354119 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcdhq\" (UniqueName: \"kubernetes.io/projected/da5fcc21-2130-46ca-ab19-fe735802b2af-kube-api-access-mcdhq\") pod \"da5fcc21-2130-46ca-ab19-fe735802b2af\" (UID: \"da5fcc21-2130-46ca-ab19-fe735802b2af\") " Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.354183 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da5fcc21-2130-46ca-ab19-fe735802b2af-ovsdbserver-sb\") pod \"da5fcc21-2130-46ca-ab19-fe735802b2af\" (UID: \"da5fcc21-2130-46ca-ab19-fe735802b2af\") " Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.354233 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da5fcc21-2130-46ca-ab19-fe735802b2af-dns-swift-storage-0\") pod \"da5fcc21-2130-46ca-ab19-fe735802b2af\" (UID: \"da5fcc21-2130-46ca-ab19-fe735802b2af\") " Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.354310 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da5fcc21-2130-46ca-ab19-fe735802b2af-ovsdbserver-nb\") pod \"da5fcc21-2130-46ca-ab19-fe735802b2af\" (UID: \"da5fcc21-2130-46ca-ab19-fe735802b2af\") "
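[Annotation] The "operationExecutor.UnmountVolume started" records above, like the "MountVolume started" / "MountVolume.SetUp succeeded" pairs before them, are emitted by the kubelet volume manager's reconciliation loop (reconciler_common.go): it diffs a desired state of the world against the actual state and issues mount or unmount operations for the difference. The following is a minimal, self-contained Go sketch of that pattern only; `reconcile` and the two maps are illustrative stand-ins, not kubelet APIs, and the real reconciler also verifies controller attachment and tracks per-operation state.

```go
package main

import "fmt"

// reconcile: anything in desired but not actual gets mounted; anything in
// actual but no longer desired gets unmounted. Purely illustrative.
func reconcile(desired, actual map[string]bool) {
	for v := range actual {
		if !desired[v] {
			fmt.Printf("operationExecutor.UnmountVolume started for volume %q\n", v)
			delete(actual, v) // stands in for UnmountVolume.TearDown succeeding
		}
	}
	for v := range desired {
		if !actual[v] {
			fmt.Printf("operationExecutor.MountVolume started for volume %q\n", v)
			actual[v] = true // stands in for MountVolume.SetUp succeeding
		}
	}
}

func main() {
	// Desired state: only the replacement pod's volumes remain.
	desired := map[string]bool{"config-data": true, "scripts": true}
	// Actual state: the deleted pod's volumes are still mounted.
	actual := map[string]bool{"config": true, "dns-svc": true, "scripts": true}
	reconcile(desired, actual)
}
```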
" Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.370878 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da5fcc21-2130-46ca-ab19-fe735802b2af-kube-api-access-mcdhq" (OuterVolumeSpecName: "kube-api-access-mcdhq") pod "da5fcc21-2130-46ca-ab19-fe735802b2af" (UID: "da5fcc21-2130-46ca-ab19-fe735802b2af"). InnerVolumeSpecName "kube-api-access-mcdhq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.422687 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-x4n64"] Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.423548 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-rjzgh" event={"ID":"f197ffa8-e0e7-4af7-8928-be49b26cb0d9","Type":"ContainerStarted","Data":"1c6fe09433a6a5d29841c524e4d8c34eff2083d91ea94de28d709d7e58b764ea"} Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.429196 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ch2l7" event={"ID":"98efb041-1cba-4ea8-b7ae-a84bec3ed2c1","Type":"ContainerStarted","Data":"131536829c79cde11e2eb09337215d25b28cb3b9c5353df4b28f04da4d785aed"} Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.438763 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-2kfhn" event={"ID":"da5fcc21-2130-46ca-ab19-fe735802b2af","Type":"ContainerDied","Data":"43eae4c7470a338f0c4e91a9be938c9f759239b63bd4d12749aa94b8c446dec5"} Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.438822 4954 scope.go:117] "RemoveContainer" containerID="d2d93c8a63c3684d94e550606598dfed2d64a0344ff48a6e5daab78661cb6bd6" Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.438841 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-2kfhn" Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.457715 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcdhq\" (UniqueName: \"kubernetes.io/projected/da5fcc21-2130-46ca-ab19-fe735802b2af-kube-api-access-mcdhq\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.504626 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da5fcc21-2130-46ca-ab19-fe735802b2af-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "da5fcc21-2130-46ca-ab19-fe735802b2af" (UID: "da5fcc21-2130-46ca-ab19-fe735802b2af"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.509457 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da5fcc21-2130-46ca-ab19-fe735802b2af-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "da5fcc21-2130-46ca-ab19-fe735802b2af" (UID: "da5fcc21-2130-46ca-ab19-fe735802b2af"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.520293 4954 scope.go:117] "RemoveContainer" containerID="be82d262028294176e62f3e3193cb076742798b1062866305b1d7decc0511685" Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.541306 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da5fcc21-2130-46ca-ab19-fe735802b2af-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "da5fcc21-2130-46ca-ab19-fe735802b2af" (UID: "da5fcc21-2130-46ca-ab19-fe735802b2af"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.548278 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da5fcc21-2130-46ca-ab19-fe735802b2af-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "da5fcc21-2130-46ca-ab19-fe735802b2af" (UID: "da5fcc21-2130-46ca-ab19-fe735802b2af"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.560008 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da5fcc21-2130-46ca-ab19-fe735802b2af-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.560044 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da5fcc21-2130-46ca-ab19-fe735802b2af-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.560058 4954 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da5fcc21-2130-46ca-ab19-fe735802b2af-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.560069 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da5fcc21-2130-46ca-ab19-fe735802b2af-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.564617 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da5fcc21-2130-46ca-ab19-fe735802b2af-config" (OuterVolumeSpecName: "config") pod "da5fcc21-2130-46ca-ab19-fe735802b2af" (UID: "da5fcc21-2130-46ca-ab19-fe735802b2af"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.667301 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da5fcc21-2130-46ca-ab19-fe735802b2af-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.680251 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.727501 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-hwpt7"] Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.736676 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-855db5c9c7-gpqq9"] Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.756208 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-6vl85"] Nov 27 16:59:04 crc kubenswrapper[4954]: I1127 16:59:04.767253 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-5zrkl"] Nov 27 16:59:05 crc kubenswrapper[4954]: I1127 16:59:04.991664 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-2kfhn"] Nov 27 16:59:05 crc kubenswrapper[4954]: I1127 16:59:05.034545 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-2kfhn"] Nov 27 16:59:05 crc kubenswrapper[4954]: I1127 16:59:05.177552 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7755474f4f-2m4z8"] Nov 27 16:59:05 crc kubenswrapper[4954]: I1127 16:59:05.210814 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-fcrnt"] Nov 27 16:59:05 crc kubenswrapper[4954]: I1127 16:59:05.321424 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 16:59:05 crc kubenswrapper[4954]: I1127 16:59:05.457435 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fcrnt" event={"ID":"50892b2e-4e6f-4794-bb8d-e649a9b223fc","Type":"ContainerStarted","Data":"a8e68d9e08b302521405e33205e88197c3552400ccc449a55bc9391074169cce"} Nov 27 16:59:05 crc kubenswrapper[4954]: I1127 16:59:05.459617 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ch2l7" event={"ID":"98efb041-1cba-4ea8-b7ae-a84bec3ed2c1","Type":"ContainerStarted","Data":"53c366354a5cb400bc91690989142c15151c801937a32af334adb3754f67e604"} Nov 27 16:59:05 crc kubenswrapper[4954]: I1127 16:59:05.461435 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-5zrkl" event={"ID":"49b7b3ea-3919-4d95-9fc8-138aef12ee08","Type":"ContainerStarted","Data":"6dd53b7bf588b71cbbdb00b24280019d871d3f6087230ce5f1af0a1572857d0e"} Nov 27 16:59:05 crc kubenswrapper[4954]: I1127 16:59:05.462861 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-x4n64" event={"ID":"58c181b9-bc11-4747-84ad-5302f1265507","Type":"ContainerStarted","Data":"2131a8bd011c3498f4cef8b62b48ce44f4d909ffb59489c654c82c7153ea56a7"} Nov 27 16:59:05 crc kubenswrapper[4954]: I1127 16:59:05.464320 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-6vl85" event={"ID":"0123682b-b80c-436f-bf07-6252dc3df9bc","Type":"ContainerStarted","Data":"7d21d5643ed5890c6c848998060045b79a5d446d078556c7f8ca543a69003d1b"} Nov 27 16:59:05 crc kubenswrapper[4954]: I1127 16:59:05.465989 4954 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-rjzgh" event={"ID":"f197ffa8-e0e7-4af7-8928-be49b26cb0d9","Type":"ContainerStarted","Data":"08a720fa4d026858cc3f8097fde00aea39fa8077dd77605d9879f12a3e6bae14"} Nov 27 16:59:05 crc kubenswrapper[4954]: I1127 16:59:05.467126 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"99048c42-ed0d-4cb5-9dce-927cb0d99722","Type":"ContainerStarted","Data":"57903b95e99a85acaac662cc20c0c92d309e22e6b76fa0fc78eeb1be03e27191"} Nov 27 16:59:05 crc kubenswrapper[4954]: I1127 16:59:05.468168 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-855db5c9c7-gpqq9" event={"ID":"685e0c55-4605-4b5b-9d32-89d0e92fe52a","Type":"ContainerStarted","Data":"11ad4323a5e416ca719b04d6c01019c65e9ffa62165f9484cd91acb38d2f9534"} Nov 27 16:59:05 crc kubenswrapper[4954]: I1127 16:59:05.470034 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-hwpt7" event={"ID":"1bce3669-a584-4f00-8043-90be729c9fa7","Type":"ContainerStarted","Data":"0817d5cad9fbde486b69c154cc6a3499182ec885314c7f82f9a0854246a51daf"} Nov 27 16:59:05 crc kubenswrapper[4954]: I1127 16:59:05.471500 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70a1a927-b24a-4da3-93f1-9dc67f75c4ba","Type":"ContainerStarted","Data":"aa71c878602ead6424f853ebde04dab25d210b4621eaaccc7043e43ef149d74e"} Nov 27 16:59:05 crc kubenswrapper[4954]: I1127 16:59:05.473563 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7755474f4f-2m4z8" event={"ID":"774fb5a2-9809-4297-9ad1-f68e130747bd","Type":"ContainerStarted","Data":"893e4b6b9f0db46d5f301dd77b2142ff779c7899525a862710260fac030a505b"} Nov 27 16:59:05 crc kubenswrapper[4954]: I1127 16:59:05.680627 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 16:59:05 crc kubenswrapper[4954]: I1127 16:59:05.726380 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-855db5c9c7-gpqq9"] Nov 27 16:59:05 crc kubenswrapper[4954]: I1127 16:59:05.777933 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:59:05 crc kubenswrapper[4954]: I1127 16:59:05.797270 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6d65d5b797-gbgfp"] Nov 27 16:59:05 crc kubenswrapper[4954]: E1127 16:59:05.798060 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da5fcc21-2130-46ca-ab19-fe735802b2af" containerName="init" Nov 27 16:59:05 crc kubenswrapper[4954]: I1127 16:59:05.798099 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="da5fcc21-2130-46ca-ab19-fe735802b2af" containerName="init" Nov 27 16:59:05 crc kubenswrapper[4954]: E1127 16:59:05.798136 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da5fcc21-2130-46ca-ab19-fe735802b2af" containerName="dnsmasq-dns" Nov 27 16:59:05 crc kubenswrapper[4954]: I1127 16:59:05.798144 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="da5fcc21-2130-46ca-ab19-fe735802b2af" containerName="dnsmasq-dns" Nov 27 16:59:05 crc kubenswrapper[4954]: I1127 16:59:05.798373 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="da5fcc21-2130-46ca-ab19-fe735802b2af" containerName="dnsmasq-dns" Nov 27 16:59:05 crc kubenswrapper[4954]: I1127 16:59:05.799626 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6d65d5b797-gbgfp" Nov 27 16:59:05 crc kubenswrapper[4954]: I1127 16:59:05.821174 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6d65d5b797-gbgfp"] Nov 27 16:59:05 crc kubenswrapper[4954]: I1127 16:59:05.828946 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 27 16:59:05 crc kubenswrapper[4954]: I1127 16:59:05.900968 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c11e3407-a026-4236-97b2-e2afbcd50035-config-data\") pod \"horizon-6d65d5b797-gbgfp\" (UID: \"c11e3407-a026-4236-97b2-e2afbcd50035\") " pod="openstack/horizon-6d65d5b797-gbgfp" Nov 27 16:59:05 crc kubenswrapper[4954]: I1127 16:59:05.901204 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c11e3407-a026-4236-97b2-e2afbcd50035-logs\") pod \"horizon-6d65d5b797-gbgfp\" (UID: \"c11e3407-a026-4236-97b2-e2afbcd50035\") " pod="openstack/horizon-6d65d5b797-gbgfp" Nov 27 16:59:05 crc kubenswrapper[4954]: I1127 16:59:05.901290 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74lwt\" (UniqueName: \"kubernetes.io/projected/c11e3407-a026-4236-97b2-e2afbcd50035-kube-api-access-74lwt\") pod \"horizon-6d65d5b797-gbgfp\" (UID: \"c11e3407-a026-4236-97b2-e2afbcd50035\") " pod="openstack/horizon-6d65d5b797-gbgfp" Nov 27 16:59:05 crc kubenswrapper[4954]: I1127 16:59:05.901406 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c11e3407-a026-4236-97b2-e2afbcd50035-horizon-secret-key\") pod \"horizon-6d65d5b797-gbgfp\" (UID: \"c11e3407-a026-4236-97b2-e2afbcd50035\") " pod="openstack/horizon-6d65d5b797-gbgfp" Nov 27 16:59:05 crc kubenswrapper[4954]: I1127 16:59:05.901542 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c11e3407-a026-4236-97b2-e2afbcd50035-scripts\") pod \"horizon-6d65d5b797-gbgfp\" (UID: \"c11e3407-a026-4236-97b2-e2afbcd50035\") " pod="openstack/horizon-6d65d5b797-gbgfp" Nov 27 16:59:06 crc kubenswrapper[4954]: I1127 16:59:06.005406 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c11e3407-a026-4236-97b2-e2afbcd50035-scripts\") pod \"horizon-6d65d5b797-gbgfp\" (UID: \"c11e3407-a026-4236-97b2-e2afbcd50035\") " pod="openstack/horizon-6d65d5b797-gbgfp" Nov 27 16:59:06 crc kubenswrapper[4954]: I1127 16:59:06.005503 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c11e3407-a026-4236-97b2-e2afbcd50035-config-data\") pod \"horizon-6d65d5b797-gbgfp\" (UID: \"c11e3407-a026-4236-97b2-e2afbcd50035\") " pod="openstack/horizon-6d65d5b797-gbgfp" Nov 27 16:59:06 crc kubenswrapper[4954]: I1127 16:59:06.005560 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c11e3407-a026-4236-97b2-e2afbcd50035-logs\") pod \"horizon-6d65d5b797-gbgfp\" (UID: \"c11e3407-a026-4236-97b2-e2afbcd50035\") " pod="openstack/horizon-6d65d5b797-gbgfp" Nov 27 16:59:06 crc kubenswrapper[4954]: I1127 16:59:06.005606 4954 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74lwt\" (UniqueName: \"kubernetes.io/projected/c11e3407-a026-4236-97b2-e2afbcd50035-kube-api-access-74lwt\") pod \"horizon-6d65d5b797-gbgfp\" (UID: \"c11e3407-a026-4236-97b2-e2afbcd50035\") " pod="openstack/horizon-6d65d5b797-gbgfp" Nov 27 16:59:06 crc kubenswrapper[4954]: I1127 16:59:06.005639 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c11e3407-a026-4236-97b2-e2afbcd50035-horizon-secret-key\") pod \"horizon-6d65d5b797-gbgfp\" (UID: \"c11e3407-a026-4236-97b2-e2afbcd50035\") " pod="openstack/horizon-6d65d5b797-gbgfp" Nov 27 16:59:06 crc kubenswrapper[4954]: I1127 16:59:06.007303 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c11e3407-a026-4236-97b2-e2afbcd50035-scripts\") pod \"horizon-6d65d5b797-gbgfp\" (UID: \"c11e3407-a026-4236-97b2-e2afbcd50035\") " pod="openstack/horizon-6d65d5b797-gbgfp" Nov 27 16:59:06 crc kubenswrapper[4954]: I1127 16:59:06.009058 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c11e3407-a026-4236-97b2-e2afbcd50035-logs\") pod \"horizon-6d65d5b797-gbgfp\" (UID: \"c11e3407-a026-4236-97b2-e2afbcd50035\") " pod="openstack/horizon-6d65d5b797-gbgfp" Nov 27 16:59:06 crc kubenswrapper[4954]: I1127 16:59:06.012744 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c11e3407-a026-4236-97b2-e2afbcd50035-config-data\") pod \"horizon-6d65d5b797-gbgfp\" (UID: \"c11e3407-a026-4236-97b2-e2afbcd50035\") " pod="openstack/horizon-6d65d5b797-gbgfp" Nov 27 16:59:06 crc kubenswrapper[4954]: I1127 16:59:06.013798 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c11e3407-a026-4236-97b2-e2afbcd50035-horizon-secret-key\") pod \"horizon-6d65d5b797-gbgfp\" (UID: \"c11e3407-a026-4236-97b2-e2afbcd50035\") " pod="openstack/horizon-6d65d5b797-gbgfp" Nov 27 16:59:06 crc kubenswrapper[4954]: I1127 16:59:06.027709 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74lwt\" (UniqueName: \"kubernetes.io/projected/c11e3407-a026-4236-97b2-e2afbcd50035-kube-api-access-74lwt\") pod \"horizon-6d65d5b797-gbgfp\" (UID: \"c11e3407-a026-4236-97b2-e2afbcd50035\") " pod="openstack/horizon-6d65d5b797-gbgfp" Nov 27 16:59:06 crc kubenswrapper[4954]: I1127 16:59:06.130139 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6d65d5b797-gbgfp" Nov 27 16:59:06 crc kubenswrapper[4954]: I1127 16:59:06.233153 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 27 16:59:06 crc kubenswrapper[4954]: I1127 16:59:06.496454 4954 generic.go:334] "Generic (PLEG): container finished" podID="f197ffa8-e0e7-4af7-8928-be49b26cb0d9" containerID="08a720fa4d026858cc3f8097fde00aea39fa8077dd77605d9879f12a3e6bae14" exitCode=0 Nov 27 16:59:06 crc kubenswrapper[4954]: I1127 16:59:06.496508 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-rjzgh" event={"ID":"f197ffa8-e0e7-4af7-8928-be49b26cb0d9","Type":"ContainerDied","Data":"08a720fa4d026858cc3f8097fde00aea39fa8077dd77605d9879f12a3e6bae14"} Nov 27 16:59:06 crc kubenswrapper[4954]: I1127 16:59:06.502399 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fcrnt" event={"ID":"50892b2e-4e6f-4794-bb8d-e649a9b223fc","Type":"ContainerStarted","Data":"5a552d795c4a9f561604e4aa4659efec65258503d374da8b32c15c2f8f7c5d4b"} Nov 27 16:59:06 crc kubenswrapper[4954]: I1127 16:59:06.539911 4954 generic.go:334] "Generic (PLEG): container finished" podID="49b7b3ea-3919-4d95-9fc8-138aef12ee08" containerID="1a5f37be12affa844bec7c198093f616df1ebd9cfe7acaf1a6c1b5c5e1f6f4b2" exitCode=0 Nov 27 16:59:06 crc kubenswrapper[4954]: I1127 16:59:06.539985 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-5zrkl" event={"ID":"49b7b3ea-3919-4d95-9fc8-138aef12ee08","Type":"ContainerDied","Data":"1a5f37be12affa844bec7c198093f616df1ebd9cfe7acaf1a6c1b5c5e1f6f4b2"} Nov 27 16:59:06 crc kubenswrapper[4954]: I1127 16:59:06.550764 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7746fa7e-51b6-4a24-bd44-f455e06d7b79","Type":"ContainerStarted","Data":"306daab18b0c32ed664b70befcb8916f80006291c564625f8e2347e039125a38"} Nov 27 16:59:06 crc kubenswrapper[4954]: I1127 16:59:06.557903 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-fcrnt" podStartSLOduration=3.5578837979999998 podStartE2EDuration="3.557883798s" podCreationTimestamp="2025-11-27 16:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:59:06.555181412 +0000 UTC m=+1258.572621712" watchObservedRunningTime="2025-11-27 16:59:06.557883798 +0000 UTC m=+1258.575324098" Nov 27 16:59:06 crc kubenswrapper[4954]: I1127 16:59:06.578552 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-ch2l7" podStartSLOduration=4.576564711 podStartE2EDuration="4.576564711s" podCreationTimestamp="2025-11-27 16:59:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:59:06.576066669 +0000 UTC m=+1258.593506969" watchObservedRunningTime="2025-11-27 16:59:06.576564711 +0000 UTC m=+1258.594005011" Nov 27 16:59:06 crc kubenswrapper[4954]: I1127 16:59:06.658236 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6d65d5b797-gbgfp"] Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:06.694892 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da5fcc21-2130-46ca-ab19-fe735802b2af" path="/var/lib/kubelet/pods/da5fcc21-2130-46ca-ab19-fe735802b2af/volumes" Nov 27 16:59:14 
crc kubenswrapper[4954]: I1127 16:59:06.981325 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-rjzgh" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:07.132116 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpmns\" (UniqueName: \"kubernetes.io/projected/f197ffa8-e0e7-4af7-8928-be49b26cb0d9-kube-api-access-dpmns\") pod \"f197ffa8-e0e7-4af7-8928-be49b26cb0d9\" (UID: \"f197ffa8-e0e7-4af7-8928-be49b26cb0d9\") " Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:07.132200 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f197ffa8-e0e7-4af7-8928-be49b26cb0d9-dns-svc\") pod \"f197ffa8-e0e7-4af7-8928-be49b26cb0d9\" (UID: \"f197ffa8-e0e7-4af7-8928-be49b26cb0d9\") " Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:07.132244 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f197ffa8-e0e7-4af7-8928-be49b26cb0d9-ovsdbserver-sb\") pod \"f197ffa8-e0e7-4af7-8928-be49b26cb0d9\" (UID: \"f197ffa8-e0e7-4af7-8928-be49b26cb0d9\") " Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:07.132347 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f197ffa8-e0e7-4af7-8928-be49b26cb0d9-dns-swift-storage-0\") pod \"f197ffa8-e0e7-4af7-8928-be49b26cb0d9\" (UID: \"f197ffa8-e0e7-4af7-8928-be49b26cb0d9\") " Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:07.132411 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f197ffa8-e0e7-4af7-8928-be49b26cb0d9-config\") pod \"f197ffa8-e0e7-4af7-8928-be49b26cb0d9\" (UID: \"f197ffa8-e0e7-4af7-8928-be49b26cb0d9\") " Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:07.132498 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f197ffa8-e0e7-4af7-8928-be49b26cb0d9-ovsdbserver-nb\") pod \"f197ffa8-e0e7-4af7-8928-be49b26cb0d9\" (UID: \"f197ffa8-e0e7-4af7-8928-be49b26cb0d9\") " Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:07.143363 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f197ffa8-e0e7-4af7-8928-be49b26cb0d9-kube-api-access-dpmns" (OuterVolumeSpecName: "kube-api-access-dpmns") pod "f197ffa8-e0e7-4af7-8928-be49b26cb0d9" (UID: "f197ffa8-e0e7-4af7-8928-be49b26cb0d9"). InnerVolumeSpecName "kube-api-access-dpmns". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:07.162384 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f197ffa8-e0e7-4af7-8928-be49b26cb0d9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f197ffa8-e0e7-4af7-8928-be49b26cb0d9" (UID: "f197ffa8-e0e7-4af7-8928-be49b26cb0d9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:07.166382 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f197ffa8-e0e7-4af7-8928-be49b26cb0d9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f197ffa8-e0e7-4af7-8928-be49b26cb0d9" (UID: "f197ffa8-e0e7-4af7-8928-be49b26cb0d9"). 
InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:07.167742 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f197ffa8-e0e7-4af7-8928-be49b26cb0d9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f197ffa8-e0e7-4af7-8928-be49b26cb0d9" (UID: "f197ffa8-e0e7-4af7-8928-be49b26cb0d9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:07.171448 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f197ffa8-e0e7-4af7-8928-be49b26cb0d9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f197ffa8-e0e7-4af7-8928-be49b26cb0d9" (UID: "f197ffa8-e0e7-4af7-8928-be49b26cb0d9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:07.176358 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f197ffa8-e0e7-4af7-8928-be49b26cb0d9-config" (OuterVolumeSpecName: "config") pod "f197ffa8-e0e7-4af7-8928-be49b26cb0d9" (UID: "f197ffa8-e0e7-4af7-8928-be49b26cb0d9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:07.235387 4954 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f197ffa8-e0e7-4af7-8928-be49b26cb0d9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:07.235422 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f197ffa8-e0e7-4af7-8928-be49b26cb0d9-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:07.235432 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f197ffa8-e0e7-4af7-8928-be49b26cb0d9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:07.235442 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpmns\" (UniqueName: \"kubernetes.io/projected/f197ffa8-e0e7-4af7-8928-be49b26cb0d9-kube-api-access-dpmns\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:07.235452 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f197ffa8-e0e7-4af7-8928-be49b26cb0d9-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:07.235461 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f197ffa8-e0e7-4af7-8928-be49b26cb0d9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:07.573355 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-5zrkl" event={"ID":"49b7b3ea-3919-4d95-9fc8-138aef12ee08","Type":"ContainerStarted","Data":"879f25bd4ee51940fc5b66b39fcc4ea7f2469e04728e79af3fff58c5b95a762c"} Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:07.573816 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57c957c4ff-5zrkl" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:07.584109 4954 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7746fa7e-51b6-4a24-bd44-f455e06d7b79","Type":"ContainerStarted","Data":"cb1a44616c475c69ee1fc69126fbdc2f38f8fd6b71246ee808a7299571e3eb1b"} Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:07.593082 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-rjzgh" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:07.593687 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-rjzgh" event={"ID":"f197ffa8-e0e7-4af7-8928-be49b26cb0d9","Type":"ContainerDied","Data":"1c6fe09433a6a5d29841c524e4d8c34eff2083d91ea94de28d709d7e58b764ea"} Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:07.593768 4954 scope.go:117] "RemoveContainer" containerID="08a720fa4d026858cc3f8097fde00aea39fa8077dd77605d9879f12a3e6bae14" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:07.604651 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d65d5b797-gbgfp" event={"ID":"c11e3407-a026-4236-97b2-e2afbcd50035","Type":"ContainerStarted","Data":"176e892ba92bc2b4935c93bac62e5c457994b02c4c2c742e0a0a4e60e3da3400"} Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:07.608875 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"99048c42-ed0d-4cb5-9dce-927cb0d99722","Type":"ContainerStarted","Data":"83dab37ab68eb7d46fd711aada3dcc209fb1c3ddb6eda650646f0d61c5986edc"} Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:07.611668 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57c957c4ff-5zrkl" podStartSLOduration=5.611619884 podStartE2EDuration="5.611619884s" podCreationTimestamp="2025-11-27 16:59:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:59:07.608478807 +0000 UTC m=+1259.625919107" watchObservedRunningTime="2025-11-27 16:59:07.611619884 +0000 UTC m=+1259.629060184" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:07.705341 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-rjzgh"] Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:07.716345 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-rjzgh"] Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:08.630039 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7746fa7e-51b6-4a24-bd44-f455e06d7b79","Type":"ContainerStarted","Data":"351efef5a5685c17294c51913270c83186600756acd928d1dc933af87a449277"} Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:08.634103 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"99048c42-ed0d-4cb5-9dce-927cb0d99722","Type":"ContainerStarted","Data":"ae748553ab7db97b310cf20f562828ba5f390feb1c02d698361198f8d8f985d0"} Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:08.634200 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="99048c42-ed0d-4cb5-9dce-927cb0d99722" containerName="glance-log" containerID="cri-o://83dab37ab68eb7d46fd711aada3dcc209fb1c3ddb6eda650646f0d61c5986edc" gracePeriod=30 Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:08.634321 4954 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/glance-default-external-api-0" podUID="99048c42-ed0d-4cb5-9dce-927cb0d99722" containerName="glance-httpd" containerID="cri-o://ae748553ab7db97b310cf20f562828ba5f390feb1c02d698361198f8d8f985d0" gracePeriod=30 Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:08.665694 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.665661136 podStartE2EDuration="5.665661136s" podCreationTimestamp="2025-11-27 16:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:59:08.664494568 +0000 UTC m=+1260.681934888" watchObservedRunningTime="2025-11-27 16:59:08.665661136 +0000 UTC m=+1260.683101436" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:08.694860 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f197ffa8-e0e7-4af7-8928-be49b26cb0d9" path="/var/lib/kubelet/pods/f197ffa8-e0e7-4af7-8928-be49b26cb0d9/volumes" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:09.651341 4954 generic.go:334] "Generic (PLEG): container finished" podID="99048c42-ed0d-4cb5-9dce-927cb0d99722" containerID="83dab37ab68eb7d46fd711aada3dcc209fb1c3ddb6eda650646f0d61c5986edc" exitCode=143 Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:09.651680 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"99048c42-ed0d-4cb5-9dce-927cb0d99722","Type":"ContainerDied","Data":"83dab37ab68eb7d46fd711aada3dcc209fb1c3ddb6eda650646f0d61c5986edc"} Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:09.651832 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7746fa7e-51b6-4a24-bd44-f455e06d7b79" containerName="glance-log" containerID="cri-o://cb1a44616c475c69ee1fc69126fbdc2f38f8fd6b71246ee808a7299571e3eb1b" gracePeriod=30 Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:09.652272 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7746fa7e-51b6-4a24-bd44-f455e06d7b79" containerName="glance-httpd" containerID="cri-o://351efef5a5685c17294c51913270c83186600756acd928d1dc933af87a449277" gracePeriod=30 Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:09.688471 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.68844607 podStartE2EDuration="6.68844607s" podCreationTimestamp="2025-11-27 16:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:59:09.68307917 +0000 UTC m=+1261.700519470" watchObservedRunningTime="2025-11-27 16:59:09.68844607 +0000 UTC m=+1261.705886370" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:10.665078 4954 generic.go:334] "Generic (PLEG): container finished" podID="99048c42-ed0d-4cb5-9dce-927cb0d99722" containerID="ae748553ab7db97b310cf20f562828ba5f390feb1c02d698361198f8d8f985d0" exitCode=0 Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:10.668423 4954 generic.go:334] "Generic (PLEG): container finished" podID="7746fa7e-51b6-4a24-bd44-f455e06d7b79" containerID="cb1a44616c475c69ee1fc69126fbdc2f38f8fd6b71246ee808a7299571e3eb1b" exitCode=143 Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:10.673730 4954 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"99048c42-ed0d-4cb5-9dce-927cb0d99722","Type":"ContainerDied","Data":"ae748553ab7db97b310cf20f562828ba5f390feb1c02d698361198f8d8f985d0"} Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:10.673763 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7746fa7e-51b6-4a24-bd44-f455e06d7b79","Type":"ContainerDied","Data":"cb1a44616c475c69ee1fc69126fbdc2f38f8fd6b71246ee808a7299571e3eb1b"} Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:11.686568 4954 generic.go:334] "Generic (PLEG): container finished" podID="7746fa7e-51b6-4a24-bd44-f455e06d7b79" containerID="351efef5a5685c17294c51913270c83186600756acd928d1dc933af87a449277" exitCode=0 Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:11.686648 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7746fa7e-51b6-4a24-bd44-f455e06d7b79","Type":"ContainerDied","Data":"351efef5a5685c17294c51913270c83186600756acd928d1dc933af87a449277"} Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.177800 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7755474f4f-2m4z8"] Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.218400 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6549c6cdd4-szxmh"] Nov 27 16:59:14 crc kubenswrapper[4954]: E1127 16:59:12.218953 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f197ffa8-e0e7-4af7-8928-be49b26cb0d9" containerName="init" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.218970 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f197ffa8-e0e7-4af7-8928-be49b26cb0d9" containerName="init" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.219186 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="f197ffa8-e0e7-4af7-8928-be49b26cb0d9" containerName="init" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.222055 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6549c6cdd4-szxmh" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.224596 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.235905 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6549c6cdd4-szxmh"] Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.265917 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6d65d5b797-gbgfp"] Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.302821 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-b5c6d8894-l7bzv"] Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.304437 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-b5c6d8894-l7bzv" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.337276 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-b5c6d8894-l7bzv"] Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.362137 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a9e455d-383c-460b-897e-2234c0611a83-logs\") pod \"horizon-6549c6cdd4-szxmh\" (UID: \"8a9e455d-383c-460b-897e-2234c0611a83\") " pod="openstack/horizon-6549c6cdd4-szxmh" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.362211 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a9e455d-383c-460b-897e-2234c0611a83-combined-ca-bundle\") pod \"horizon-6549c6cdd4-szxmh\" (UID: \"8a9e455d-383c-460b-897e-2234c0611a83\") " pod="openstack/horizon-6549c6cdd4-szxmh" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.362249 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8a9e455d-383c-460b-897e-2234c0611a83-config-data\") pod \"horizon-6549c6cdd4-szxmh\" (UID: \"8a9e455d-383c-460b-897e-2234c0611a83\") " pod="openstack/horizon-6549c6cdd4-szxmh" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.362380 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhgz8\" (UniqueName: \"kubernetes.io/projected/8a9e455d-383c-460b-897e-2234c0611a83-kube-api-access-lhgz8\") pod \"horizon-6549c6cdd4-szxmh\" (UID: \"8a9e455d-383c-460b-897e-2234c0611a83\") " pod="openstack/horizon-6549c6cdd4-szxmh" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.362508 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a9e455d-383c-460b-897e-2234c0611a83-horizon-tls-certs\") pod \"horizon-6549c6cdd4-szxmh\" (UID: \"8a9e455d-383c-460b-897e-2234c0611a83\") " pod="openstack/horizon-6549c6cdd4-szxmh" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.362593 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8a9e455d-383c-460b-897e-2234c0611a83-horizon-secret-key\") pod \"horizon-6549c6cdd4-szxmh\" (UID: \"8a9e455d-383c-460b-897e-2234c0611a83\") " pod="openstack/horizon-6549c6cdd4-szxmh" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.362669 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a9e455d-383c-460b-897e-2234c0611a83-scripts\") pod \"horizon-6549c6cdd4-szxmh\" (UID: \"8a9e455d-383c-460b-897e-2234c0611a83\") " pod="openstack/horizon-6549c6cdd4-szxmh" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.464449 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11ddebaa-610a-410a-a161-a5b89d87eb75-combined-ca-bundle\") pod \"horizon-b5c6d8894-l7bzv\" (UID: \"11ddebaa-610a-410a-a161-a5b89d87eb75\") " pod="openstack/horizon-b5c6d8894-l7bzv" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.464506 4954 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/11ddebaa-610a-410a-a161-a5b89d87eb75-config-data\") pod \"horizon-b5c6d8894-l7bzv\" (UID: \"11ddebaa-610a-410a-a161-a5b89d87eb75\") " pod="openstack/horizon-b5c6d8894-l7bzv" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.464885 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a9e455d-383c-460b-897e-2234c0611a83-logs\") pod \"horizon-6549c6cdd4-szxmh\" (UID: \"8a9e455d-383c-460b-897e-2234c0611a83\") " pod="openstack/horizon-6549c6cdd4-szxmh" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.464955 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a9e455d-383c-460b-897e-2234c0611a83-combined-ca-bundle\") pod \"horizon-6549c6cdd4-szxmh\" (UID: \"8a9e455d-383c-460b-897e-2234c0611a83\") " pod="openstack/horizon-6549c6cdd4-szxmh" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.465056 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8a9e455d-383c-460b-897e-2234c0611a83-config-data\") pod \"horizon-6549c6cdd4-szxmh\" (UID: \"8a9e455d-383c-460b-897e-2234c0611a83\") " pod="openstack/horizon-6549c6cdd4-szxmh" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.465150 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/11ddebaa-610a-410a-a161-a5b89d87eb75-horizon-secret-key\") pod \"horizon-b5c6d8894-l7bzv\" (UID: \"11ddebaa-610a-410a-a161-a5b89d87eb75\") " pod="openstack/horizon-b5c6d8894-l7bzv" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.465380 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhgz8\" (UniqueName: \"kubernetes.io/projected/8a9e455d-383c-460b-897e-2234c0611a83-kube-api-access-lhgz8\") pod \"horizon-6549c6cdd4-szxmh\" (UID: \"8a9e455d-383c-460b-897e-2234c0611a83\") " pod="openstack/horizon-6549c6cdd4-szxmh" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.465455 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/11ddebaa-610a-410a-a161-a5b89d87eb75-scripts\") pod \"horizon-b5c6d8894-l7bzv\" (UID: \"11ddebaa-610a-410a-a161-a5b89d87eb75\") " pod="openstack/horizon-b5c6d8894-l7bzv" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.465592 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11ddebaa-610a-410a-a161-a5b89d87eb75-logs\") pod \"horizon-b5c6d8894-l7bzv\" (UID: \"11ddebaa-610a-410a-a161-a5b89d87eb75\") " pod="openstack/horizon-b5c6d8894-l7bzv" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.465640 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a9e455d-383c-460b-897e-2234c0611a83-horizon-tls-certs\") pod \"horizon-6549c6cdd4-szxmh\" (UID: \"8a9e455d-383c-460b-897e-2234c0611a83\") " pod="openstack/horizon-6549c6cdd4-szxmh" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.465642 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/8a9e455d-383c-460b-897e-2234c0611a83-logs\") pod \"horizon-6549c6cdd4-szxmh\" (UID: \"8a9e455d-383c-460b-897e-2234c0611a83\") " pod="openstack/horizon-6549c6cdd4-szxmh" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.465665 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72dxp\" (UniqueName: \"kubernetes.io/projected/11ddebaa-610a-410a-a161-a5b89d87eb75-kube-api-access-72dxp\") pod \"horizon-b5c6d8894-l7bzv\" (UID: \"11ddebaa-610a-410a-a161-a5b89d87eb75\") " pod="openstack/horizon-b5c6d8894-l7bzv" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.465873 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8a9e455d-383c-460b-897e-2234c0611a83-horizon-secret-key\") pod \"horizon-6549c6cdd4-szxmh\" (UID: \"8a9e455d-383c-460b-897e-2234c0611a83\") " pod="openstack/horizon-6549c6cdd4-szxmh" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.467133 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8a9e455d-383c-460b-897e-2234c0611a83-config-data\") pod \"horizon-6549c6cdd4-szxmh\" (UID: \"8a9e455d-383c-460b-897e-2234c0611a83\") " pod="openstack/horizon-6549c6cdd4-szxmh" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.470400 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a9e455d-383c-460b-897e-2234c0611a83-scripts\") pod \"horizon-6549c6cdd4-szxmh\" (UID: \"8a9e455d-383c-460b-897e-2234c0611a83\") " pod="openstack/horizon-6549c6cdd4-szxmh" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.470912 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/11ddebaa-610a-410a-a161-a5b89d87eb75-horizon-tls-certs\") pod \"horizon-b5c6d8894-l7bzv\" (UID: \"11ddebaa-610a-410a-a161-a5b89d87eb75\") " pod="openstack/horizon-b5c6d8894-l7bzv" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.470990 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a9e455d-383c-460b-897e-2234c0611a83-scripts\") pod \"horizon-6549c6cdd4-szxmh\" (UID: \"8a9e455d-383c-460b-897e-2234c0611a83\") " pod="openstack/horizon-6549c6cdd4-szxmh" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.481762 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8a9e455d-383c-460b-897e-2234c0611a83-horizon-secret-key\") pod \"horizon-6549c6cdd4-szxmh\" (UID: \"8a9e455d-383c-460b-897e-2234c0611a83\") " pod="openstack/horizon-6549c6cdd4-szxmh" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.485186 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhgz8\" (UniqueName: \"kubernetes.io/projected/8a9e455d-383c-460b-897e-2234c0611a83-kube-api-access-lhgz8\") pod \"horizon-6549c6cdd4-szxmh\" (UID: \"8a9e455d-383c-460b-897e-2234c0611a83\") " pod="openstack/horizon-6549c6cdd4-szxmh" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.485721 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a9e455d-383c-460b-897e-2234c0611a83-horizon-tls-certs\") pod \"horizon-6549c6cdd4-szxmh\" (UID: 
\"8a9e455d-383c-460b-897e-2234c0611a83\") " pod="openstack/horizon-6549c6cdd4-szxmh" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.484167 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a9e455d-383c-460b-897e-2234c0611a83-combined-ca-bundle\") pod \"horizon-6549c6cdd4-szxmh\" (UID: \"8a9e455d-383c-460b-897e-2234c0611a83\") " pod="openstack/horizon-6549c6cdd4-szxmh" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.542293 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6549c6cdd4-szxmh" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.572822 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/11ddebaa-610a-410a-a161-a5b89d87eb75-horizon-tls-certs\") pod \"horizon-b5c6d8894-l7bzv\" (UID: \"11ddebaa-610a-410a-a161-a5b89d87eb75\") " pod="openstack/horizon-b5c6d8894-l7bzv" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.572863 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11ddebaa-610a-410a-a161-a5b89d87eb75-combined-ca-bundle\") pod \"horizon-b5c6d8894-l7bzv\" (UID: \"11ddebaa-610a-410a-a161-a5b89d87eb75\") " pod="openstack/horizon-b5c6d8894-l7bzv" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.572890 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/11ddebaa-610a-410a-a161-a5b89d87eb75-config-data\") pod \"horizon-b5c6d8894-l7bzv\" (UID: \"11ddebaa-610a-410a-a161-a5b89d87eb75\") " pod="openstack/horizon-b5c6d8894-l7bzv" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.572950 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/11ddebaa-610a-410a-a161-a5b89d87eb75-horizon-secret-key\") pod \"horizon-b5c6d8894-l7bzv\" (UID: \"11ddebaa-610a-410a-a161-a5b89d87eb75\") " pod="openstack/horizon-b5c6d8894-l7bzv" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.572982 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/11ddebaa-610a-410a-a161-a5b89d87eb75-scripts\") pod \"horizon-b5c6d8894-l7bzv\" (UID: \"11ddebaa-610a-410a-a161-a5b89d87eb75\") " pod="openstack/horizon-b5c6d8894-l7bzv" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.573008 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11ddebaa-610a-410a-a161-a5b89d87eb75-logs\") pod \"horizon-b5c6d8894-l7bzv\" (UID: \"11ddebaa-610a-410a-a161-a5b89d87eb75\") " pod="openstack/horizon-b5c6d8894-l7bzv" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.573026 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72dxp\" (UniqueName: \"kubernetes.io/projected/11ddebaa-610a-410a-a161-a5b89d87eb75-kube-api-access-72dxp\") pod \"horizon-b5c6d8894-l7bzv\" (UID: \"11ddebaa-610a-410a-a161-a5b89d87eb75\") " pod="openstack/horizon-b5c6d8894-l7bzv" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.573859 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11ddebaa-610a-410a-a161-a5b89d87eb75-logs\") pod \"horizon-b5c6d8894-l7bzv\" (UID: 
\"11ddebaa-610a-410a-a161-a5b89d87eb75\") " pod="openstack/horizon-b5c6d8894-l7bzv" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.574611 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/11ddebaa-610a-410a-a161-a5b89d87eb75-scripts\") pod \"horizon-b5c6d8894-l7bzv\" (UID: \"11ddebaa-610a-410a-a161-a5b89d87eb75\") " pod="openstack/horizon-b5c6d8894-l7bzv" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.574765 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/11ddebaa-610a-410a-a161-a5b89d87eb75-config-data\") pod \"horizon-b5c6d8894-l7bzv\" (UID: \"11ddebaa-610a-410a-a161-a5b89d87eb75\") " pod="openstack/horizon-b5c6d8894-l7bzv" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.577335 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/11ddebaa-610a-410a-a161-a5b89d87eb75-horizon-tls-certs\") pod \"horizon-b5c6d8894-l7bzv\" (UID: \"11ddebaa-610a-410a-a161-a5b89d87eb75\") " pod="openstack/horizon-b5c6d8894-l7bzv" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.577794 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11ddebaa-610a-410a-a161-a5b89d87eb75-combined-ca-bundle\") pod \"horizon-b5c6d8894-l7bzv\" (UID: \"11ddebaa-610a-410a-a161-a5b89d87eb75\") " pod="openstack/horizon-b5c6d8894-l7bzv" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.577983 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/11ddebaa-610a-410a-a161-a5b89d87eb75-horizon-secret-key\") pod \"horizon-b5c6d8894-l7bzv\" (UID: \"11ddebaa-610a-410a-a161-a5b89d87eb75\") " pod="openstack/horizon-b5c6d8894-l7bzv" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.593070 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72dxp\" (UniqueName: \"kubernetes.io/projected/11ddebaa-610a-410a-a161-a5b89d87eb75-kube-api-access-72dxp\") pod \"horizon-b5c6d8894-l7bzv\" (UID: \"11ddebaa-610a-410a-a161-a5b89d87eb75\") " pod="openstack/horizon-b5c6d8894-l7bzv" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:12.633744 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-b5c6d8894-l7bzv" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:13.588101 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57c957c4ff-5zrkl" Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:13.655552 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-2dqhs"] Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:13.655934 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d5b6d6b67-2dqhs" podUID="54d94327-e1e4-4a52-89c2-d698ded5706f" containerName="dnsmasq-dns" containerID="cri-o://308a103851221e724cf470d87e60057e2aca64bf35d734eef3d3f8c91b3c939b" gracePeriod=10 Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:14.729497 4954 generic.go:334] "Generic (PLEG): container finished" podID="54d94327-e1e4-4a52-89c2-d698ded5706f" containerID="308a103851221e724cf470d87e60057e2aca64bf35d734eef3d3f8c91b3c939b" exitCode=0 Nov 27 16:59:14 crc kubenswrapper[4954]: I1127 16:59:14.729952 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-2dqhs" event={"ID":"54d94327-e1e4-4a52-89c2-d698ded5706f","Type":"ContainerDied","Data":"308a103851221e724cf470d87e60057e2aca64bf35d734eef3d3f8c91b3c939b"} Nov 27 16:59:15 crc kubenswrapper[4954]: I1127 16:59:15.710227 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6549c6cdd4-szxmh"] Nov 27 16:59:15 crc kubenswrapper[4954]: I1127 16:59:15.719050 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-b5c6d8894-l7bzv"] Nov 27 16:59:16 crc kubenswrapper[4954]: I1127 16:59:16.755991 4954 generic.go:334] "Generic (PLEG): container finished" podID="98efb041-1cba-4ea8-b7ae-a84bec3ed2c1" containerID="53c366354a5cb400bc91690989142c15151c801937a32af334adb3754f67e604" exitCode=0 Nov 27 16:59:16 crc kubenswrapper[4954]: I1127 16:59:16.756223 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ch2l7" event={"ID":"98efb041-1cba-4ea8-b7ae-a84bec3ed2c1","Type":"ContainerDied","Data":"53c366354a5cb400bc91690989142c15151c801937a32af334adb3754f67e604"} Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.105406 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ch2l7" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.113483 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.299598 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7746fa7e-51b6-4a24-bd44-f455e06d7b79-config-data\") pod \"7746fa7e-51b6-4a24-bd44-f455e06d7b79\" (UID: \"7746fa7e-51b6-4a24-bd44-f455e06d7b79\") " Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.299684 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7746fa7e-51b6-4a24-bd44-f455e06d7b79-internal-tls-certs\") pod \"7746fa7e-51b6-4a24-bd44-f455e06d7b79\" (UID: \"7746fa7e-51b6-4a24-bd44-f455e06d7b79\") " Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.299796 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7746fa7e-51b6-4a24-bd44-f455e06d7b79-combined-ca-bundle\") pod \"7746fa7e-51b6-4a24-bd44-f455e06d7b79\" (UID: \"7746fa7e-51b6-4a24-bd44-f455e06d7b79\") " Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.299868 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98efb041-1cba-4ea8-b7ae-a84bec3ed2c1-combined-ca-bundle\") pod \"98efb041-1cba-4ea8-b7ae-a84bec3ed2c1\" (UID: \"98efb041-1cba-4ea8-b7ae-a84bec3ed2c1\") " Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.299908 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42nlj\" (UniqueName: \"kubernetes.io/projected/7746fa7e-51b6-4a24-bd44-f455e06d7b79-kube-api-access-42nlj\") pod \"7746fa7e-51b6-4a24-bd44-f455e06d7b79\" (UID: \"7746fa7e-51b6-4a24-bd44-f455e06d7b79\") " Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.299956 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/98efb041-1cba-4ea8-b7ae-a84bec3ed2c1-credential-keys\") pod \"98efb041-1cba-4ea8-b7ae-a84bec3ed2c1\" (UID: \"98efb041-1cba-4ea8-b7ae-a84bec3ed2c1\") " Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.300056 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7746fa7e-51b6-4a24-bd44-f455e06d7b79-httpd-run\") pod \"7746fa7e-51b6-4a24-bd44-f455e06d7b79\" (UID: \"7746fa7e-51b6-4a24-bd44-f455e06d7b79\") " Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.300127 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/98efb041-1cba-4ea8-b7ae-a84bec3ed2c1-fernet-keys\") pod \"98efb041-1cba-4ea8-b7ae-a84bec3ed2c1\" (UID: \"98efb041-1cba-4ea8-b7ae-a84bec3ed2c1\") " Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.300197 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98efb041-1cba-4ea8-b7ae-a84bec3ed2c1-scripts\") pod \"98efb041-1cba-4ea8-b7ae-a84bec3ed2c1\" (UID: \"98efb041-1cba-4ea8-b7ae-a84bec3ed2c1\") " Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.300251 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7746fa7e-51b6-4a24-bd44-f455e06d7b79-scripts\") pod \"7746fa7e-51b6-4a24-bd44-f455e06d7b79\" (UID: 
\"7746fa7e-51b6-4a24-bd44-f455e06d7b79\") " Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.300287 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7746fa7e-51b6-4a24-bd44-f455e06d7b79-logs\") pod \"7746fa7e-51b6-4a24-bd44-f455e06d7b79\" (UID: \"7746fa7e-51b6-4a24-bd44-f455e06d7b79\") " Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.300316 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98efb041-1cba-4ea8-b7ae-a84bec3ed2c1-config-data\") pod \"98efb041-1cba-4ea8-b7ae-a84bec3ed2c1\" (UID: \"98efb041-1cba-4ea8-b7ae-a84bec3ed2c1\") " Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.300351 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmllb\" (UniqueName: \"kubernetes.io/projected/98efb041-1cba-4ea8-b7ae-a84bec3ed2c1-kube-api-access-fmllb\") pod \"98efb041-1cba-4ea8-b7ae-a84bec3ed2c1\" (UID: \"98efb041-1cba-4ea8-b7ae-a84bec3ed2c1\") " Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.300422 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"7746fa7e-51b6-4a24-bd44-f455e06d7b79\" (UID: \"7746fa7e-51b6-4a24-bd44-f455e06d7b79\") " Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.301472 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7746fa7e-51b6-4a24-bd44-f455e06d7b79-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7746fa7e-51b6-4a24-bd44-f455e06d7b79" (UID: "7746fa7e-51b6-4a24-bd44-f455e06d7b79"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.301500 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7746fa7e-51b6-4a24-bd44-f455e06d7b79-logs" (OuterVolumeSpecName: "logs") pod "7746fa7e-51b6-4a24-bd44-f455e06d7b79" (UID: "7746fa7e-51b6-4a24-bd44-f455e06d7b79"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.307085 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "7746fa7e-51b6-4a24-bd44-f455e06d7b79" (UID: "7746fa7e-51b6-4a24-bd44-f455e06d7b79"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.307385 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7746fa7e-51b6-4a24-bd44-f455e06d7b79-kube-api-access-42nlj" (OuterVolumeSpecName: "kube-api-access-42nlj") pod "7746fa7e-51b6-4a24-bd44-f455e06d7b79" (UID: "7746fa7e-51b6-4a24-bd44-f455e06d7b79"). InnerVolumeSpecName "kube-api-access-42nlj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.309650 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98efb041-1cba-4ea8-b7ae-a84bec3ed2c1-scripts" (OuterVolumeSpecName: "scripts") pod "98efb041-1cba-4ea8-b7ae-a84bec3ed2c1" (UID: "98efb041-1cba-4ea8-b7ae-a84bec3ed2c1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.313189 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7746fa7e-51b6-4a24-bd44-f455e06d7b79-scripts" (OuterVolumeSpecName: "scripts") pod "7746fa7e-51b6-4a24-bd44-f455e06d7b79" (UID: "7746fa7e-51b6-4a24-bd44-f455e06d7b79"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.313168 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98efb041-1cba-4ea8-b7ae-a84bec3ed2c1-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "98efb041-1cba-4ea8-b7ae-a84bec3ed2c1" (UID: "98efb041-1cba-4ea8-b7ae-a84bec3ed2c1"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.313869 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98efb041-1cba-4ea8-b7ae-a84bec3ed2c1-kube-api-access-fmllb" (OuterVolumeSpecName: "kube-api-access-fmllb") pod "98efb041-1cba-4ea8-b7ae-a84bec3ed2c1" (UID: "98efb041-1cba-4ea8-b7ae-a84bec3ed2c1"). InnerVolumeSpecName "kube-api-access-fmllb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.315166 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98efb041-1cba-4ea8-b7ae-a84bec3ed2c1-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "98efb041-1cba-4ea8-b7ae-a84bec3ed2c1" (UID: "98efb041-1cba-4ea8-b7ae-a84bec3ed2c1"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.336201 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98efb041-1cba-4ea8-b7ae-a84bec3ed2c1-config-data" (OuterVolumeSpecName: "config-data") pod "98efb041-1cba-4ea8-b7ae-a84bec3ed2c1" (UID: "98efb041-1cba-4ea8-b7ae-a84bec3ed2c1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.359809 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98efb041-1cba-4ea8-b7ae-a84bec3ed2c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "98efb041-1cba-4ea8-b7ae-a84bec3ed2c1" (UID: "98efb041-1cba-4ea8-b7ae-a84bec3ed2c1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.360097 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7746fa7e-51b6-4a24-bd44-f455e06d7b79-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7746fa7e-51b6-4a24-bd44-f455e06d7b79" (UID: "7746fa7e-51b6-4a24-bd44-f455e06d7b79"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.365652 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7746fa7e-51b6-4a24-bd44-f455e06d7b79-config-data" (OuterVolumeSpecName: "config-data") pod "7746fa7e-51b6-4a24-bd44-f455e06d7b79" (UID: "7746fa7e-51b6-4a24-bd44-f455e06d7b79"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.394861 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7746fa7e-51b6-4a24-bd44-f455e06d7b79-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7746fa7e-51b6-4a24-bd44-f455e06d7b79" (UID: "7746fa7e-51b6-4a24-bd44-f455e06d7b79"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.409167 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6d5b6d6b67-2dqhs" podUID="54d94327-e1e4-4a52-89c2-d698ded5706f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.121:5353: i/o timeout" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.418135 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98efb041-1cba-4ea8-b7ae-a84bec3ed2c1-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.418251 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7746fa7e-51b6-4a24-bd44-f455e06d7b79-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.418275 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7746fa7e-51b6-4a24-bd44-f455e06d7b79-logs\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.418299 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98efb041-1cba-4ea8-b7ae-a84bec3ed2c1-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.418325 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmllb\" (UniqueName: \"kubernetes.io/projected/98efb041-1cba-4ea8-b7ae-a84bec3ed2c1-kube-api-access-fmllb\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.418454 4954 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.418477 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7746fa7e-51b6-4a24-bd44-f455e06d7b79-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.418496 4954 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7746fa7e-51b6-4a24-bd44-f455e06d7b79-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.418514 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7746fa7e-51b6-4a24-bd44-f455e06d7b79-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.418531 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98efb041-1cba-4ea8-b7ae-a84bec3ed2c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.418549 4954 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-42nlj\" (UniqueName: \"kubernetes.io/projected/7746fa7e-51b6-4a24-bd44-f455e06d7b79-kube-api-access-42nlj\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.418567 4954 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/98efb041-1cba-4ea8-b7ae-a84bec3ed2c1-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.418610 4954 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7746fa7e-51b6-4a24-bd44-f455e06d7b79-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.418632 4954 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/98efb041-1cba-4ea8-b7ae-a84bec3ed2c1-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.438756 4954 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.520925 4954 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.818082 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ch2l7" event={"ID":"98efb041-1cba-4ea8-b7ae-a84bec3ed2c1","Type":"ContainerDied","Data":"131536829c79cde11e2eb09337215d25b28cb3b9c5353df4b28f04da4d785aed"} Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.818165 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="131536829c79cde11e2eb09337215d25b28cb3b9c5353df4b28f04da4d785aed" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.818181 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ch2l7" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.824660 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7746fa7e-51b6-4a24-bd44-f455e06d7b79","Type":"ContainerDied","Data":"306daab18b0c32ed664b70befcb8916f80006291c564625f8e2347e039125a38"} Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.824745 4954 scope.go:117] "RemoveContainer" containerID="351efef5a5685c17294c51913270c83186600756acd928d1dc933af87a449277" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.825027 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.874700 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.889449 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.909397 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 27 16:59:21 crc kubenswrapper[4954]: E1127 16:59:21.910050 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7746fa7e-51b6-4a24-bd44-f455e06d7b79" containerName="glance-log" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.910070 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="7746fa7e-51b6-4a24-bd44-f455e06d7b79" containerName="glance-log" Nov 27 16:59:21 crc kubenswrapper[4954]: E1127 16:59:21.910117 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98efb041-1cba-4ea8-b7ae-a84bec3ed2c1" containerName="keystone-bootstrap" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.910125 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="98efb041-1cba-4ea8-b7ae-a84bec3ed2c1" containerName="keystone-bootstrap" Nov 27 16:59:21 crc kubenswrapper[4954]: E1127 16:59:21.910140 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7746fa7e-51b6-4a24-bd44-f455e06d7b79" containerName="glance-httpd" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.910147 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="7746fa7e-51b6-4a24-bd44-f455e06d7b79" containerName="glance-httpd" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.910389 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="7746fa7e-51b6-4a24-bd44-f455e06d7b79" containerName="glance-httpd" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.910412 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="98efb041-1cba-4ea8-b7ae-a84bec3ed2c1" containerName="keystone-bootstrap" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.910426 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="7746fa7e-51b6-4a24-bd44-f455e06d7b79" containerName="glance-log" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.911698 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.914620 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.914858 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.936135 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0db2964c-faef-4154-b502-1231f6762e37-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0db2964c-faef-4154-b502-1231f6762e37\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.936190 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0db2964c-faef-4154-b502-1231f6762e37-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0db2964c-faef-4154-b502-1231f6762e37\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.936233 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"0db2964c-faef-4154-b502-1231f6762e37\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.936266 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0db2964c-faef-4154-b502-1231f6762e37-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0db2964c-faef-4154-b502-1231f6762e37\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.936308 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9wk5\" (UniqueName: \"kubernetes.io/projected/0db2964c-faef-4154-b502-1231f6762e37-kube-api-access-f9wk5\") pod \"glance-default-internal-api-0\" (UID: \"0db2964c-faef-4154-b502-1231f6762e37\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.936353 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0db2964c-faef-4154-b502-1231f6762e37-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0db2964c-faef-4154-b502-1231f6762e37\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.936498 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0db2964c-faef-4154-b502-1231f6762e37-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0db2964c-faef-4154-b502-1231f6762e37\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.936541 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0db2964c-faef-4154-b502-1231f6762e37-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"0db2964c-faef-4154-b502-1231f6762e37\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:21 crc kubenswrapper[4954]: I1127 16:59:21.943158 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.039365 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"0db2964c-faef-4154-b502-1231f6762e37\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.039419 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0db2964c-faef-4154-b502-1231f6762e37-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0db2964c-faef-4154-b502-1231f6762e37\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.039452 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9wk5\" (UniqueName: \"kubernetes.io/projected/0db2964c-faef-4154-b502-1231f6762e37-kube-api-access-f9wk5\") pod \"glance-default-internal-api-0\" (UID: \"0db2964c-faef-4154-b502-1231f6762e37\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.039495 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0db2964c-faef-4154-b502-1231f6762e37-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0db2964c-faef-4154-b502-1231f6762e37\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.040202 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"0db2964c-faef-4154-b502-1231f6762e37\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.040251 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0db2964c-faef-4154-b502-1231f6762e37-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0db2964c-faef-4154-b502-1231f6762e37\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.040445 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0db2964c-faef-4154-b502-1231f6762e37-logs\") pod \"glance-default-internal-api-0\" (UID: \"0db2964c-faef-4154-b502-1231f6762e37\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.040974 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0db2964c-faef-4154-b502-1231f6762e37-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0db2964c-faef-4154-b502-1231f6762e37\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.046019 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/0db2964c-faef-4154-b502-1231f6762e37-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0db2964c-faef-4154-b502-1231f6762e37\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.046866 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0db2964c-faef-4154-b502-1231f6762e37-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0db2964c-faef-4154-b502-1231f6762e37\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.040927 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0db2964c-faef-4154-b502-1231f6762e37-logs\") pod \"glance-default-internal-api-0\" (UID: \"0db2964c-faef-4154-b502-1231f6762e37\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.052148 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0db2964c-faef-4154-b502-1231f6762e37-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0db2964c-faef-4154-b502-1231f6762e37\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.052347 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0db2964c-faef-4154-b502-1231f6762e37-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0db2964c-faef-4154-b502-1231f6762e37\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.080160 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0db2964c-faef-4154-b502-1231f6762e37-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0db2964c-faef-4154-b502-1231f6762e37\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.087337 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0db2964c-faef-4154-b502-1231f6762e37-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0db2964c-faef-4154-b502-1231f6762e37\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.094428 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9wk5\" (UniqueName: \"kubernetes.io/projected/0db2964c-faef-4154-b502-1231f6762e37-kube-api-access-f9wk5\") pod \"glance-default-internal-api-0\" (UID: \"0db2964c-faef-4154-b502-1231f6762e37\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.130276 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"0db2964c-faef-4154-b502-1231f6762e37\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.233569 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-ch2l7"] Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.239798 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.243210 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-ch2l7"] Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.326985 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-cs55t"] Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.329143 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-cs55t" Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.331496 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.331752 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.332204 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.332891 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-cdxsk" Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.334573 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.338255 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-cs55t"] Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.367925 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9758394-0bfc-487a-99b4-a3583a2c97b0-combined-ca-bundle\") pod \"keystone-bootstrap-cs55t\" (UID: \"b9758394-0bfc-487a-99b4-a3583a2c97b0\") " pod="openstack/keystone-bootstrap-cs55t" Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.367991 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b9758394-0bfc-487a-99b4-a3583a2c97b0-fernet-keys\") pod \"keystone-bootstrap-cs55t\" (UID: \"b9758394-0bfc-487a-99b4-a3583a2c97b0\") " pod="openstack/keystone-bootstrap-cs55t" Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.368469 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8s22\" (UniqueName: \"kubernetes.io/projected/b9758394-0bfc-487a-99b4-a3583a2c97b0-kube-api-access-x8s22\") pod \"keystone-bootstrap-cs55t\" (UID: \"b9758394-0bfc-487a-99b4-a3583a2c97b0\") " pod="openstack/keystone-bootstrap-cs55t" Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.368608 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9758394-0bfc-487a-99b4-a3583a2c97b0-config-data\") pod \"keystone-bootstrap-cs55t\" (UID: \"b9758394-0bfc-487a-99b4-a3583a2c97b0\") " pod="openstack/keystone-bootstrap-cs55t" Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.368674 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9758394-0bfc-487a-99b4-a3583a2c97b0-scripts\") pod \"keystone-bootstrap-cs55t\" (UID: \"b9758394-0bfc-487a-99b4-a3583a2c97b0\") " pod="openstack/keystone-bootstrap-cs55t" Nov 27 16:59:22 crc 
kubenswrapper[4954]: I1127 16:59:22.368843 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b9758394-0bfc-487a-99b4-a3583a2c97b0-credential-keys\") pod \"keystone-bootstrap-cs55t\" (UID: \"b9758394-0bfc-487a-99b4-a3583a2c97b0\") " pod="openstack/keystone-bootstrap-cs55t" Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.471113 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8s22\" (UniqueName: \"kubernetes.io/projected/b9758394-0bfc-487a-99b4-a3583a2c97b0-kube-api-access-x8s22\") pod \"keystone-bootstrap-cs55t\" (UID: \"b9758394-0bfc-487a-99b4-a3583a2c97b0\") " pod="openstack/keystone-bootstrap-cs55t" Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.471184 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9758394-0bfc-487a-99b4-a3583a2c97b0-config-data\") pod \"keystone-bootstrap-cs55t\" (UID: \"b9758394-0bfc-487a-99b4-a3583a2c97b0\") " pod="openstack/keystone-bootstrap-cs55t" Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.471213 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9758394-0bfc-487a-99b4-a3583a2c97b0-scripts\") pod \"keystone-bootstrap-cs55t\" (UID: \"b9758394-0bfc-487a-99b4-a3583a2c97b0\") " pod="openstack/keystone-bootstrap-cs55t" Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.471265 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b9758394-0bfc-487a-99b4-a3583a2c97b0-credential-keys\") pod \"keystone-bootstrap-cs55t\" (UID: \"b9758394-0bfc-487a-99b4-a3583a2c97b0\") " pod="openstack/keystone-bootstrap-cs55t" Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.471288 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9758394-0bfc-487a-99b4-a3583a2c97b0-combined-ca-bundle\") pod \"keystone-bootstrap-cs55t\" (UID: \"b9758394-0bfc-487a-99b4-a3583a2c97b0\") " pod="openstack/keystone-bootstrap-cs55t" Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.471318 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b9758394-0bfc-487a-99b4-a3583a2c97b0-fernet-keys\") pod \"keystone-bootstrap-cs55t\" (UID: \"b9758394-0bfc-487a-99b4-a3583a2c97b0\") " pod="openstack/keystone-bootstrap-cs55t" Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.480411 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b9758394-0bfc-487a-99b4-a3583a2c97b0-credential-keys\") pod \"keystone-bootstrap-cs55t\" (UID: \"b9758394-0bfc-487a-99b4-a3583a2c97b0\") " pod="openstack/keystone-bootstrap-cs55t" Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.480615 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9758394-0bfc-487a-99b4-a3583a2c97b0-config-data\") pod \"keystone-bootstrap-cs55t\" (UID: \"b9758394-0bfc-487a-99b4-a3583a2c97b0\") " pod="openstack/keystone-bootstrap-cs55t" Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.481038 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/b9758394-0bfc-487a-99b4-a3583a2c97b0-scripts\") pod \"keystone-bootstrap-cs55t\" (UID: \"b9758394-0bfc-487a-99b4-a3583a2c97b0\") " pod="openstack/keystone-bootstrap-cs55t" Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.481047 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b9758394-0bfc-487a-99b4-a3583a2c97b0-fernet-keys\") pod \"keystone-bootstrap-cs55t\" (UID: \"b9758394-0bfc-487a-99b4-a3583a2c97b0\") " pod="openstack/keystone-bootstrap-cs55t" Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.490287 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9758394-0bfc-487a-99b4-a3583a2c97b0-combined-ca-bundle\") pod \"keystone-bootstrap-cs55t\" (UID: \"b9758394-0bfc-487a-99b4-a3583a2c97b0\") " pod="openstack/keystone-bootstrap-cs55t" Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.494488 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8s22\" (UniqueName: \"kubernetes.io/projected/b9758394-0bfc-487a-99b4-a3583a2c97b0-kube-api-access-x8s22\") pod \"keystone-bootstrap-cs55t\" (UID: \"b9758394-0bfc-487a-99b4-a3583a2c97b0\") " pod="openstack/keystone-bootstrap-cs55t" Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.660250 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-cs55t" Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.675890 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7746fa7e-51b6-4a24-bd44-f455e06d7b79" path="/var/lib/kubelet/pods/7746fa7e-51b6-4a24-bd44-f455e06d7b79/volumes" Nov 27 16:59:22 crc kubenswrapper[4954]: I1127 16:59:22.677097 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98efb041-1cba-4ea8-b7ae-a84bec3ed2c1" path="/var/lib/kubelet/pods/98efb041-1cba-4ea8-b7ae-a84bec3ed2c1/volumes" Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.718361 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-2dqhs" Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.727462 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.802369 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/54d94327-e1e4-4a52-89c2-d698ded5706f-ovsdbserver-nb\") pod \"54d94327-e1e4-4a52-89c2-d698ded5706f\" (UID: \"54d94327-e1e4-4a52-89c2-d698ded5706f\") " Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.802497 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99048c42-ed0d-4cb5-9dce-927cb0d99722-scripts\") pod \"99048c42-ed0d-4cb5-9dce-927cb0d99722\" (UID: \"99048c42-ed0d-4cb5-9dce-927cb0d99722\") " Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.802529 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99048c42-ed0d-4cb5-9dce-927cb0d99722-combined-ca-bundle\") pod \"99048c42-ed0d-4cb5-9dce-927cb0d99722\" (UID: \"99048c42-ed0d-4cb5-9dce-927cb0d99722\") " Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.802658 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/99048c42-ed0d-4cb5-9dce-927cb0d99722-httpd-run\") pod \"99048c42-ed0d-4cb5-9dce-927cb0d99722\" (UID: \"99048c42-ed0d-4cb5-9dce-927cb0d99722\") " Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.802723 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/99048c42-ed0d-4cb5-9dce-927cb0d99722-public-tls-certs\") pod \"99048c42-ed0d-4cb5-9dce-927cb0d99722\" (UID: \"99048c42-ed0d-4cb5-9dce-927cb0d99722\") " Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.802784 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tn8zt\" (UniqueName: \"kubernetes.io/projected/99048c42-ed0d-4cb5-9dce-927cb0d99722-kube-api-access-tn8zt\") pod \"99048c42-ed0d-4cb5-9dce-927cb0d99722\" (UID: \"99048c42-ed0d-4cb5-9dce-927cb0d99722\") " Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.802814 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"99048c42-ed0d-4cb5-9dce-927cb0d99722\" (UID: \"99048c42-ed0d-4cb5-9dce-927cb0d99722\") " Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.802893 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m55wv\" (UniqueName: \"kubernetes.io/projected/54d94327-e1e4-4a52-89c2-d698ded5706f-kube-api-access-m55wv\") pod \"54d94327-e1e4-4a52-89c2-d698ded5706f\" (UID: \"54d94327-e1e4-4a52-89c2-d698ded5706f\") " Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.802973 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/54d94327-e1e4-4a52-89c2-d698ded5706f-ovsdbserver-sb\") pod \"54d94327-e1e4-4a52-89c2-d698ded5706f\" (UID: \"54d94327-e1e4-4a52-89c2-d698ded5706f\") " Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.803098 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/54d94327-e1e4-4a52-89c2-d698ded5706f-dns-swift-storage-0\") pod \"54d94327-e1e4-4a52-89c2-d698ded5706f\" (UID: 
\"54d94327-e1e4-4a52-89c2-d698ded5706f\") " Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.803211 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54d94327-e1e4-4a52-89c2-d698ded5706f-config\") pod \"54d94327-e1e4-4a52-89c2-d698ded5706f\" (UID: \"54d94327-e1e4-4a52-89c2-d698ded5706f\") " Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.803218 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99048c42-ed0d-4cb5-9dce-927cb0d99722-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "99048c42-ed0d-4cb5-9dce-927cb0d99722" (UID: "99048c42-ed0d-4cb5-9dce-927cb0d99722"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.803354 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/54d94327-e1e4-4a52-89c2-d698ded5706f-dns-svc\") pod \"54d94327-e1e4-4a52-89c2-d698ded5706f\" (UID: \"54d94327-e1e4-4a52-89c2-d698ded5706f\") " Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.803425 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99048c42-ed0d-4cb5-9dce-927cb0d99722-logs\") pod \"99048c42-ed0d-4cb5-9dce-927cb0d99722\" (UID: \"99048c42-ed0d-4cb5-9dce-927cb0d99722\") " Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.803922 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99048c42-ed0d-4cb5-9dce-927cb0d99722-logs" (OuterVolumeSpecName: "logs") pod "99048c42-ed0d-4cb5-9dce-927cb0d99722" (UID: "99048c42-ed0d-4cb5-9dce-927cb0d99722"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.803471 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99048c42-ed0d-4cb5-9dce-927cb0d99722-config-data\") pod \"99048c42-ed0d-4cb5-9dce-927cb0d99722\" (UID: \"99048c42-ed0d-4cb5-9dce-927cb0d99722\") " Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.804707 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99048c42-ed0d-4cb5-9dce-927cb0d99722-logs\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.804730 4954 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/99048c42-ed0d-4cb5-9dce-927cb0d99722-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.814652 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99048c42-ed0d-4cb5-9dce-927cb0d99722-kube-api-access-tn8zt" (OuterVolumeSpecName: "kube-api-access-tn8zt") pod "99048c42-ed0d-4cb5-9dce-927cb0d99722" (UID: "99048c42-ed0d-4cb5-9dce-927cb0d99722"). InnerVolumeSpecName "kube-api-access-tn8zt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.814925 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54d94327-e1e4-4a52-89c2-d698ded5706f-kube-api-access-m55wv" (OuterVolumeSpecName: "kube-api-access-m55wv") pod "54d94327-e1e4-4a52-89c2-d698ded5706f" (UID: "54d94327-e1e4-4a52-89c2-d698ded5706f"). InnerVolumeSpecName "kube-api-access-m55wv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.816347 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "99048c42-ed0d-4cb5-9dce-927cb0d99722" (UID: "99048c42-ed0d-4cb5-9dce-927cb0d99722"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.818280 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99048c42-ed0d-4cb5-9dce-927cb0d99722-scripts" (OuterVolumeSpecName: "scripts") pod "99048c42-ed0d-4cb5-9dce-927cb0d99722" (UID: "99048c42-ed0d-4cb5-9dce-927cb0d99722"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.850617 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99048c42-ed0d-4cb5-9dce-927cb0d99722-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "99048c42-ed0d-4cb5-9dce-927cb0d99722" (UID: "99048c42-ed0d-4cb5-9dce-927cb0d99722"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.860188 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54d94327-e1e4-4a52-89c2-d698ded5706f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "54d94327-e1e4-4a52-89c2-d698ded5706f" (UID: "54d94327-e1e4-4a52-89c2-d698ded5706f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.859918 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54d94327-e1e4-4a52-89c2-d698ded5706f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "54d94327-e1e4-4a52-89c2-d698ded5706f" (UID: "54d94327-e1e4-4a52-89c2-d698ded5706f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.861839 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.861831 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"99048c42-ed0d-4cb5-9dce-927cb0d99722","Type":"ContainerDied","Data":"57903b95e99a85acaac662cc20c0c92d309e22e6b76fa0fc78eeb1be03e27191"} Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.866959 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-2dqhs" event={"ID":"54d94327-e1e4-4a52-89c2-d698ded5706f","Type":"ContainerDied","Data":"ba9ac8fe3f92d04f4385bd972bda359d4ed7bcd94a9fa254f216de7ec31ae5b6"} Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.866971 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54d94327-e1e4-4a52-89c2-d698ded5706f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "54d94327-e1e4-4a52-89c2-d698ded5706f" (UID: "54d94327-e1e4-4a52-89c2-d698ded5706f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.867025 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-2dqhs" Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.881145 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54d94327-e1e4-4a52-89c2-d698ded5706f-config" (OuterVolumeSpecName: "config") pod "54d94327-e1e4-4a52-89c2-d698ded5706f" (UID: "54d94327-e1e4-4a52-89c2-d698ded5706f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.893107 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99048c42-ed0d-4cb5-9dce-927cb0d99722-config-data" (OuterVolumeSpecName: "config-data") pod "99048c42-ed0d-4cb5-9dce-927cb0d99722" (UID: "99048c42-ed0d-4cb5-9dce-927cb0d99722"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.893261 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99048c42-ed0d-4cb5-9dce-927cb0d99722-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "99048c42-ed0d-4cb5-9dce-927cb0d99722" (UID: "99048c42-ed0d-4cb5-9dce-927cb0d99722"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.906745 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99048c42-ed0d-4cb5-9dce-927cb0d99722-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.906779 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/54d94327-e1e4-4a52-89c2-d698ded5706f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.906796 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99048c42-ed0d-4cb5-9dce-927cb0d99722-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.906809 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99048c42-ed0d-4cb5-9dce-927cb0d99722-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.906824 4954 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/99048c42-ed0d-4cb5-9dce-927cb0d99722-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.906838 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tn8zt\" (UniqueName: \"kubernetes.io/projected/99048c42-ed0d-4cb5-9dce-927cb0d99722-kube-api-access-tn8zt\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.906887 4954 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.906902 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m55wv\" (UniqueName: \"kubernetes.io/projected/54d94327-e1e4-4a52-89c2-d698ded5706f-kube-api-access-m55wv\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.906916 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/54d94327-e1e4-4a52-89c2-d698ded5706f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.906927 4954 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/54d94327-e1e4-4a52-89c2-d698ded5706f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.906940 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54d94327-e1e4-4a52-89c2-d698ded5706f-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.911069 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54d94327-e1e4-4a52-89c2-d698ded5706f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "54d94327-e1e4-4a52-89c2-d698ded5706f" (UID: "54d94327-e1e4-4a52-89c2-d698ded5706f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:59:23 crc kubenswrapper[4954]: I1127 16:59:23.932298 4954 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Nov 27 16:59:24 crc kubenswrapper[4954]: I1127 16:59:24.009213 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/54d94327-e1e4-4a52-89c2-d698ded5706f-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:24 crc kubenswrapper[4954]: I1127 16:59:24.009253 4954 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:24 crc kubenswrapper[4954]: I1127 16:59:24.213472 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 16:59:24 crc kubenswrapper[4954]: I1127 16:59:24.229997 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 16:59:24 crc kubenswrapper[4954]: I1127 16:59:24.238950 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-2dqhs"] Nov 27 16:59:24 crc kubenswrapper[4954]: I1127 16:59:24.247202 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-2dqhs"] Nov 27 16:59:24 crc kubenswrapper[4954]: I1127 16:59:24.256875 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 16:59:24 crc kubenswrapper[4954]: E1127 16:59:24.257403 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54d94327-e1e4-4a52-89c2-d698ded5706f" containerName="dnsmasq-dns" Nov 27 16:59:24 crc kubenswrapper[4954]: I1127 16:59:24.257417 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="54d94327-e1e4-4a52-89c2-d698ded5706f" containerName="dnsmasq-dns" Nov 27 16:59:24 crc kubenswrapper[4954]: E1127 16:59:24.257434 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54d94327-e1e4-4a52-89c2-d698ded5706f" containerName="init" Nov 27 16:59:24 crc kubenswrapper[4954]: I1127 16:59:24.257441 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="54d94327-e1e4-4a52-89c2-d698ded5706f" containerName="init" Nov 27 16:59:24 crc kubenswrapper[4954]: E1127 16:59:24.257455 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99048c42-ed0d-4cb5-9dce-927cb0d99722" containerName="glance-httpd" Nov 27 16:59:24 crc kubenswrapper[4954]: I1127 16:59:24.257462 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="99048c42-ed0d-4cb5-9dce-927cb0d99722" containerName="glance-httpd" Nov 27 16:59:24 crc kubenswrapper[4954]: E1127 16:59:24.257487 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99048c42-ed0d-4cb5-9dce-927cb0d99722" containerName="glance-log" Nov 27 16:59:24 crc kubenswrapper[4954]: I1127 16:59:24.257494 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="99048c42-ed0d-4cb5-9dce-927cb0d99722" containerName="glance-log" Nov 27 16:59:24 crc kubenswrapper[4954]: I1127 16:59:24.257735 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="54d94327-e1e4-4a52-89c2-d698ded5706f" containerName="dnsmasq-dns" Nov 27 16:59:24 crc kubenswrapper[4954]: I1127 16:59:24.257750 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="99048c42-ed0d-4cb5-9dce-927cb0d99722" containerName="glance-httpd" Nov 27 
16:59:24 crc kubenswrapper[4954]: I1127 16:59:24.257764 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="99048c42-ed0d-4cb5-9dce-927cb0d99722" containerName="glance-log" Nov 27 16:59:24 crc kubenswrapper[4954]: I1127 16:59:24.258930 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 27 16:59:24 crc kubenswrapper[4954]: I1127 16:59:24.262169 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 27 16:59:24 crc kubenswrapper[4954]: I1127 16:59:24.269400 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Nov 27 16:59:24 crc kubenswrapper[4954]: I1127 16:59:24.273979 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 16:59:24 crc kubenswrapper[4954]: I1127 16:59:24.316631 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"631c9c91-60a4-48e3-aa9a-6333ae35bcf9\") " pod="openstack/glance-default-external-api-0" Nov 27 16:59:24 crc kubenswrapper[4954]: I1127 16:59:24.316978 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/631c9c91-60a4-48e3-aa9a-6333ae35bcf9-scripts\") pod \"glance-default-external-api-0\" (UID: \"631c9c91-60a4-48e3-aa9a-6333ae35bcf9\") " pod="openstack/glance-default-external-api-0" Nov 27 16:59:24 crc kubenswrapper[4954]: I1127 16:59:24.317084 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/631c9c91-60a4-48e3-aa9a-6333ae35bcf9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"631c9c91-60a4-48e3-aa9a-6333ae35bcf9\") " pod="openstack/glance-default-external-api-0" Nov 27 16:59:24 crc kubenswrapper[4954]: I1127 16:59:24.317179 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/631c9c91-60a4-48e3-aa9a-6333ae35bcf9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"631c9c91-60a4-48e3-aa9a-6333ae35bcf9\") " pod="openstack/glance-default-external-api-0" Nov 27 16:59:24 crc kubenswrapper[4954]: I1127 16:59:24.317271 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/631c9c91-60a4-48e3-aa9a-6333ae35bcf9-config-data\") pod \"glance-default-external-api-0\" (UID: \"631c9c91-60a4-48e3-aa9a-6333ae35bcf9\") " pod="openstack/glance-default-external-api-0" Nov 27 16:59:24 crc kubenswrapper[4954]: I1127 16:59:24.317509 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-547tq\" (UniqueName: \"kubernetes.io/projected/631c9c91-60a4-48e3-aa9a-6333ae35bcf9-kube-api-access-547tq\") pod \"glance-default-external-api-0\" (UID: \"631c9c91-60a4-48e3-aa9a-6333ae35bcf9\") " pod="openstack/glance-default-external-api-0" Nov 27 16:59:24 crc kubenswrapper[4954]: I1127 16:59:24.317681 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/631c9c91-60a4-48e3-aa9a-6333ae35bcf9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"631c9c91-60a4-48e3-aa9a-6333ae35bcf9\") " pod="openstack/glance-default-external-api-0" Nov 27 16:59:24 crc kubenswrapper[4954]: I1127 16:59:24.317768 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/631c9c91-60a4-48e3-aa9a-6333ae35bcf9-logs\") pod \"glance-default-external-api-0\" (UID: \"631c9c91-60a4-48e3-aa9a-6333ae35bcf9\") " pod="openstack/glance-default-external-api-0" Nov 27 16:59:24 crc kubenswrapper[4954]: I1127 16:59:24.420038 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/631c9c91-60a4-48e3-aa9a-6333ae35bcf9-logs\") pod \"glance-default-external-api-0\" (UID: \"631c9c91-60a4-48e3-aa9a-6333ae35bcf9\") " pod="openstack/glance-default-external-api-0" Nov 27 16:59:24 crc kubenswrapper[4954]: I1127 16:59:24.420189 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"631c9c91-60a4-48e3-aa9a-6333ae35bcf9\") " pod="openstack/glance-default-external-api-0" Nov 27 16:59:24 crc kubenswrapper[4954]: I1127 16:59:24.420208 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/631c9c91-60a4-48e3-aa9a-6333ae35bcf9-scripts\") pod \"glance-default-external-api-0\" (UID: \"631c9c91-60a4-48e3-aa9a-6333ae35bcf9\") " pod="openstack/glance-default-external-api-0" Nov 27 16:59:24 crc kubenswrapper[4954]: I1127 16:59:24.420249 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/631c9c91-60a4-48e3-aa9a-6333ae35bcf9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"631c9c91-60a4-48e3-aa9a-6333ae35bcf9\") " pod="openstack/glance-default-external-api-0" Nov 27 16:59:24 crc kubenswrapper[4954]: I1127 16:59:24.420275 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/631c9c91-60a4-48e3-aa9a-6333ae35bcf9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"631c9c91-60a4-48e3-aa9a-6333ae35bcf9\") " pod="openstack/glance-default-external-api-0" Nov 27 16:59:24 crc kubenswrapper[4954]: I1127 16:59:24.420301 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/631c9c91-60a4-48e3-aa9a-6333ae35bcf9-config-data\") pod \"glance-default-external-api-0\" (UID: \"631c9c91-60a4-48e3-aa9a-6333ae35bcf9\") " pod="openstack/glance-default-external-api-0" Nov 27 16:59:24 crc kubenswrapper[4954]: I1127 16:59:24.420357 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-547tq\" (UniqueName: \"kubernetes.io/projected/631c9c91-60a4-48e3-aa9a-6333ae35bcf9-kube-api-access-547tq\") pod \"glance-default-external-api-0\" (UID: \"631c9c91-60a4-48e3-aa9a-6333ae35bcf9\") " pod="openstack/glance-default-external-api-0" Nov 27 16:59:24 crc kubenswrapper[4954]: I1127 16:59:24.420391 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/631c9c91-60a4-48e3-aa9a-6333ae35bcf9-httpd-run\") pod 
\"glance-default-external-api-0\" (UID: \"631c9c91-60a4-48e3-aa9a-6333ae35bcf9\") " pod="openstack/glance-default-external-api-0" Nov 27 16:59:24 crc kubenswrapper[4954]: I1127 16:59:24.420975 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/631c9c91-60a4-48e3-aa9a-6333ae35bcf9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"631c9c91-60a4-48e3-aa9a-6333ae35bcf9\") " pod="openstack/glance-default-external-api-0" Nov 27 16:59:24 crc kubenswrapper[4954]: I1127 16:59:24.421207 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/631c9c91-60a4-48e3-aa9a-6333ae35bcf9-logs\") pod \"glance-default-external-api-0\" (UID: \"631c9c91-60a4-48e3-aa9a-6333ae35bcf9\") " pod="openstack/glance-default-external-api-0" Nov 27 16:59:24 crc kubenswrapper[4954]: I1127 16:59:24.421450 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"631c9c91-60a4-48e3-aa9a-6333ae35bcf9\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Nov 27 16:59:24 crc kubenswrapper[4954]: I1127 16:59:24.428402 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/631c9c91-60a4-48e3-aa9a-6333ae35bcf9-scripts\") pod \"glance-default-external-api-0\" (UID: \"631c9c91-60a4-48e3-aa9a-6333ae35bcf9\") " pod="openstack/glance-default-external-api-0" Nov 27 16:59:24 crc kubenswrapper[4954]: I1127 16:59:24.429179 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/631c9c91-60a4-48e3-aa9a-6333ae35bcf9-config-data\") pod \"glance-default-external-api-0\" (UID: \"631c9c91-60a4-48e3-aa9a-6333ae35bcf9\") " pod="openstack/glance-default-external-api-0" Nov 27 16:59:24 crc kubenswrapper[4954]: I1127 16:59:24.430442 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/631c9c91-60a4-48e3-aa9a-6333ae35bcf9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"631c9c91-60a4-48e3-aa9a-6333ae35bcf9\") " pod="openstack/glance-default-external-api-0" Nov 27 16:59:24 crc kubenswrapper[4954]: I1127 16:59:24.432218 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/631c9c91-60a4-48e3-aa9a-6333ae35bcf9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"631c9c91-60a4-48e3-aa9a-6333ae35bcf9\") " pod="openstack/glance-default-external-api-0" Nov 27 16:59:24 crc kubenswrapper[4954]: I1127 16:59:24.463611 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"631c9c91-60a4-48e3-aa9a-6333ae35bcf9\") " pod="openstack/glance-default-external-api-0" Nov 27 16:59:24 crc kubenswrapper[4954]: I1127 16:59:24.464011 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-547tq\" (UniqueName: \"kubernetes.io/projected/631c9c91-60a4-48e3-aa9a-6333ae35bcf9-kube-api-access-547tq\") pod \"glance-default-external-api-0\" (UID: \"631c9c91-60a4-48e3-aa9a-6333ae35bcf9\") " pod="openstack/glance-default-external-api-0" Nov 27 
16:59:24 crc kubenswrapper[4954]: I1127 16:59:24.584480 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 27 16:59:24 crc kubenswrapper[4954]: I1127 16:59:24.676244 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54d94327-e1e4-4a52-89c2-d698ded5706f" path="/var/lib/kubelet/pods/54d94327-e1e4-4a52-89c2-d698ded5706f/volumes" Nov 27 16:59:24 crc kubenswrapper[4954]: I1127 16:59:24.677211 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99048c42-ed0d-4cb5-9dce-927cb0d99722" path="/var/lib/kubelet/pods/99048c42-ed0d-4cb5-9dce-927cb0d99722/volumes" Nov 27 16:59:26 crc kubenswrapper[4954]: I1127 16:59:26.412239 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6d5b6d6b67-2dqhs" podUID="54d94327-e1e4-4a52-89c2-d698ded5706f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.121:5353: i/o timeout" Nov 27 16:59:27 crc kubenswrapper[4954]: I1127 16:59:27.910355 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6549c6cdd4-szxmh" event={"ID":"8a9e455d-383c-460b-897e-2234c0611a83","Type":"ContainerStarted","Data":"486d8412009a68bdc35ae95e26ee40d6f77d4a5c03e7e5b470ef0632abe3bea0"} Nov 27 16:59:34 crc kubenswrapper[4954]: I1127 16:59:34.930228 4954 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podda5fcc21-2130-46ca-ab19-fe735802b2af"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podda5fcc21-2130-46ca-ab19-fe735802b2af] : Timed out while waiting for systemd to remove kubepods-besteffort-podda5fcc21_2130_46ca_ab19_fe735802b2af.slice" Nov 27 16:59:37 crc kubenswrapper[4954]: E1127 16:59:37.194628 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Nov 27 16:59:37 crc kubenswrapper[4954]: E1127 16:59:37.195235 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5gmpt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-6vl85_openstack(0123682b-b80c-436f-bf07-6252dc3df9bc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 27 16:59:37 crc kubenswrapper[4954]: E1127 16:59:37.196511 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-6vl85" podUID="0123682b-b80c-436f-bf07-6252dc3df9bc" Nov 27 16:59:37 crc kubenswrapper[4954]: E1127 16:59:37.644720 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Nov 27 16:59:37 crc kubenswrapper[4954]: E1127 16:59:37.645072 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n8ch655hdbh5ffhd6h65ch575h679hfdh8bh5d4h569h5cbh87h666h5d5h5bhcch75h658h688hf7h65bh597h99h699hfbh56ch658h557h57dh699q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gwvpn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(70a1a927-b24a-4da3-93f1-9dc67f75c4ba): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 27 16:59:38 crc kubenswrapper[4954]: I1127 16:59:38.006227 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b5c6d8894-l7bzv" event={"ID":"11ddebaa-610a-410a-a161-a5b89d87eb75","Type":"ContainerStarted","Data":"4fea449e29c9d2ad2d8a85217ad1f631e65db81e663306f5126a85008e1b0d82"} Nov 27 16:59:38 crc kubenswrapper[4954]: E1127 16:59:38.007213 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-6vl85" podUID="0123682b-b80c-436f-bf07-6252dc3df9bc" Nov 27 16:59:39 crc kubenswrapper[4954]: E1127 16:59:39.915084 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Nov 27 16:59:39 crc kubenswrapper[4954]: E1127 16:59:39.916055 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t62km,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-x4n64_openstack(58c181b9-bc11-4747-84ad-5302f1265507): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 27 16:59:39 crc kubenswrapper[4954]: E1127 16:59:39.917231 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-x4n64" podUID="58c181b9-bc11-4747-84ad-5302f1265507" Nov 27 16:59:40 crc kubenswrapper[4954]: E1127 16:59:40.023952 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-x4n64" podUID="58c181b9-bc11-4747-84ad-5302f1265507" Nov 27 16:59:40 crc kubenswrapper[4954]: E1127 16:59:40.770367 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Nov 27 16:59:40 crc kubenswrapper[4954]: E1127 16:59:40.770727 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lnljs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-hwpt7_openstack(1bce3669-a584-4f00-8043-90be729c9fa7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 27 16:59:40 crc kubenswrapper[4954]: E1127 16:59:40.771931 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-hwpt7" podUID="1bce3669-a584-4f00-8043-90be729c9fa7" Nov 27 16:59:40 crc kubenswrapper[4954]: I1127 16:59:40.830896 4954 scope.go:117] "RemoveContainer" containerID="cb1a44616c475c69ee1fc69126fbdc2f38f8fd6b71246ee808a7299571e3eb1b" Nov 27 16:59:41 crc kubenswrapper[4954]: E1127 16:59:41.043746 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-hwpt7" podUID="1bce3669-a584-4f00-8043-90be729c9fa7" Nov 27 16:59:41 crc kubenswrapper[4954]: I1127 16:59:41.248278 4954 scope.go:117] "RemoveContainer" containerID="ae748553ab7db97b310cf20f562828ba5f390feb1c02d698361198f8d8f985d0" Nov 27 16:59:41 crc kubenswrapper[4954]: I1127 16:59:41.416953 4954 scope.go:117] "RemoveContainer" containerID="83dab37ab68eb7d46fd711aada3dcc209fb1c3ddb6eda650646f0d61c5986edc" Nov 27 16:59:41 crc kubenswrapper[4954]: I1127 16:59:41.466613 4954 scope.go:117] "RemoveContainer" containerID="308a103851221e724cf470d87e60057e2aca64bf35d734eef3d3f8c91b3c939b" Nov 27 16:59:41 crc kubenswrapper[4954]: I1127 
16:59:41.566934 4954 scope.go:117] "RemoveContainer" containerID="284ffcb58f2793927b7cfbeadc47ceddc51b61f20b2d224e333c92d3f400ae96" Nov 27 16:59:41 crc kubenswrapper[4954]: I1127 16:59:41.712177 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-cs55t"] Nov 27 16:59:41 crc kubenswrapper[4954]: W1127 16:59:41.715314 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9758394_0bfc_487a_99b4_a3583a2c97b0.slice/crio-a2925113b84499a2c4d58646871dbee25e13bce49f8258c60f18ac334d9334b8 WatchSource:0}: Error finding container a2925113b84499a2c4d58646871dbee25e13bce49f8258c60f18ac334d9334b8: Status 404 returned error can't find the container with id a2925113b84499a2c4d58646871dbee25e13bce49f8258c60f18ac334d9334b8 Nov 27 16:59:41 crc kubenswrapper[4954]: W1127 16:59:41.944756 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0db2964c_faef_4154_b502_1231f6762e37.slice/crio-89817459e4827cf5bb3e3f4f3fe112bb8c315811261e793a3eb92bf884d3fdef WatchSource:0}: Error finding container 89817459e4827cf5bb3e3f4f3fe112bb8c315811261e793a3eb92bf884d3fdef: Status 404 returned error can't find the container with id 89817459e4827cf5bb3e3f4f3fe112bb8c315811261e793a3eb92bf884d3fdef Nov 27 16:59:41 crc kubenswrapper[4954]: I1127 16:59:41.946893 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 27 16:59:42 crc kubenswrapper[4954]: I1127 16:59:42.062849 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d65d5b797-gbgfp" event={"ID":"c11e3407-a026-4236-97b2-e2afbcd50035","Type":"ContainerStarted","Data":"eab4355348599e86835e5067f14f768d3df623dd5f12f4c2410564ae15ea6da0"} Nov 27 16:59:42 crc kubenswrapper[4954]: I1127 16:59:42.062895 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d65d5b797-gbgfp" event={"ID":"c11e3407-a026-4236-97b2-e2afbcd50035","Type":"ContainerStarted","Data":"5df76f4334bd357eb6cebc103462a801082c650d9b107a62dfe11a9c72d9b5f1"} Nov 27 16:59:42 crc kubenswrapper[4954]: I1127 16:59:42.063026 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6d65d5b797-gbgfp" podUID="c11e3407-a026-4236-97b2-e2afbcd50035" containerName="horizon-log" containerID="cri-o://5df76f4334bd357eb6cebc103462a801082c650d9b107a62dfe11a9c72d9b5f1" gracePeriod=30 Nov 27 16:59:42 crc kubenswrapper[4954]: I1127 16:59:42.064733 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6d65d5b797-gbgfp" podUID="c11e3407-a026-4236-97b2-e2afbcd50035" containerName="horizon" containerID="cri-o://eab4355348599e86835e5067f14f768d3df623dd5f12f4c2410564ae15ea6da0" gracePeriod=30 Nov 27 16:59:42 crc kubenswrapper[4954]: I1127 16:59:42.068472 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0db2964c-faef-4154-b502-1231f6762e37","Type":"ContainerStarted","Data":"89817459e4827cf5bb3e3f4f3fe112bb8c315811261e793a3eb92bf884d3fdef"} Nov 27 16:59:42 crc kubenswrapper[4954]: I1127 16:59:42.070504 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-855db5c9c7-gpqq9" event={"ID":"685e0c55-4605-4b5b-9d32-89d0e92fe52a","Type":"ContainerStarted","Data":"42b4af116bb7855f59b6f7cd5f94af60a6c4e0e4c80d89405354db59051977a9"} Nov 27 16:59:42 crc kubenswrapper[4954]: I1127 
16:59:42.070545 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-855db5c9c7-gpqq9" event={"ID":"685e0c55-4605-4b5b-9d32-89d0e92fe52a","Type":"ContainerStarted","Data":"4e00fbbe3e64fddcf3145b8df712fac0de2cfddc23e1b49acdc7eeb62b736806"} Nov 27 16:59:42 crc kubenswrapper[4954]: I1127 16:59:42.071597 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-855db5c9c7-gpqq9" podUID="685e0c55-4605-4b5b-9d32-89d0e92fe52a" containerName="horizon-log" containerID="cri-o://4e00fbbe3e64fddcf3145b8df712fac0de2cfddc23e1b49acdc7eeb62b736806" gracePeriod=30 Nov 27 16:59:42 crc kubenswrapper[4954]: I1127 16:59:42.071715 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-855db5c9c7-gpqq9" podUID="685e0c55-4605-4b5b-9d32-89d0e92fe52a" containerName="horizon" containerID="cri-o://42b4af116bb7855f59b6f7cd5f94af60a6c4e0e4c80d89405354db59051977a9" gracePeriod=30 Nov 27 16:59:42 crc kubenswrapper[4954]: I1127 16:59:42.083174 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b5c6d8894-l7bzv" event={"ID":"11ddebaa-610a-410a-a161-a5b89d87eb75","Type":"ContainerStarted","Data":"80fc09e0a51ef20023087c30d3f49f4b4c10e19ee608105601503406096f69ba"} Nov 27 16:59:42 crc kubenswrapper[4954]: I1127 16:59:42.090990 4954 generic.go:334] "Generic (PLEG): container finished" podID="50892b2e-4e6f-4794-bb8d-e649a9b223fc" containerID="5a552d795c4a9f561604e4aa4659efec65258503d374da8b32c15c2f8f7c5d4b" exitCode=0 Nov 27 16:59:42 crc kubenswrapper[4954]: I1127 16:59:42.091062 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fcrnt" event={"ID":"50892b2e-4e6f-4794-bb8d-e649a9b223fc","Type":"ContainerDied","Data":"5a552d795c4a9f561604e4aa4659efec65258503d374da8b32c15c2f8f7c5d4b"} Nov 27 16:59:42 crc kubenswrapper[4954]: I1127 16:59:42.093520 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6d65d5b797-gbgfp" podStartSLOduration=2.939441682 podStartE2EDuration="37.093495337s" podCreationTimestamp="2025-11-27 16:59:05 +0000 UTC" firstStartedPulling="2025-11-27 16:59:06.677318737 +0000 UTC m=+1258.694759047" lastFinishedPulling="2025-11-27 16:59:40.831372362 +0000 UTC m=+1292.848812702" observedRunningTime="2025-11-27 16:59:42.085947614 +0000 UTC m=+1294.103387914" watchObservedRunningTime="2025-11-27 16:59:42.093495337 +0000 UTC m=+1294.110935637" Nov 27 16:59:42 crc kubenswrapper[4954]: I1127 16:59:42.105304 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70a1a927-b24a-4da3-93f1-9dc67f75c4ba","Type":"ContainerStarted","Data":"633d8889f9cefc99558e758667563a1550d81c6fe739e5cc9b4b8be68d4f31c9"} Nov 27 16:59:42 crc kubenswrapper[4954]: I1127 16:59:42.106474 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cs55t" event={"ID":"b9758394-0bfc-487a-99b4-a3583a2c97b0","Type":"ContainerStarted","Data":"96ab67ada370f118a852be06f22b4780ef4f10b62ac840007ffbf097403f3c43"} Nov 27 16:59:42 crc kubenswrapper[4954]: I1127 16:59:42.106503 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cs55t" event={"ID":"b9758394-0bfc-487a-99b4-a3583a2c97b0","Type":"ContainerStarted","Data":"a2925113b84499a2c4d58646871dbee25e13bce49f8258c60f18ac334d9334b8"} Nov 27 16:59:42 crc kubenswrapper[4954]: I1127 16:59:42.111838 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6549c6cdd4-szxmh" 
event={"ID":"8a9e455d-383c-460b-897e-2234c0611a83","Type":"ContainerStarted","Data":"4b3a9c94ec8c6148f1f0656db217d02cc6a5f9806343ef93871772f9909f3226"} Nov 27 16:59:42 crc kubenswrapper[4954]: I1127 16:59:42.119271 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7755474f4f-2m4z8" event={"ID":"774fb5a2-9809-4297-9ad1-f68e130747bd","Type":"ContainerStarted","Data":"f3a8129ba1e86d9bfbcf2021bbb6169428adbcb40d0ca9cf6b1424bf9e7b4590"} Nov 27 16:59:42 crc kubenswrapper[4954]: I1127 16:59:42.119335 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7755474f4f-2m4z8" event={"ID":"774fb5a2-9809-4297-9ad1-f68e130747bd","Type":"ContainerStarted","Data":"712680d835819aee27a83517fb33a16baafb7282ebdfef9ccc4c446e6be07c5b"} Nov 27 16:59:42 crc kubenswrapper[4954]: I1127 16:59:42.119479 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7755474f4f-2m4z8" podUID="774fb5a2-9809-4297-9ad1-f68e130747bd" containerName="horizon-log" containerID="cri-o://712680d835819aee27a83517fb33a16baafb7282ebdfef9ccc4c446e6be07c5b" gracePeriod=30 Nov 27 16:59:42 crc kubenswrapper[4954]: I1127 16:59:42.119643 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7755474f4f-2m4z8" podUID="774fb5a2-9809-4297-9ad1-f68e130747bd" containerName="horizon" containerID="cri-o://f3a8129ba1e86d9bfbcf2021bbb6169428adbcb40d0ca9cf6b1424bf9e7b4590" gracePeriod=30 Nov 27 16:59:42 crc kubenswrapper[4954]: I1127 16:59:42.130044 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-855db5c9c7-gpqq9" podStartSLOduration=3.524903387 podStartE2EDuration="40.130019254s" podCreationTimestamp="2025-11-27 16:59:02 +0000 UTC" firstStartedPulling="2025-11-27 16:59:04.772070527 +0000 UTC m=+1256.789510827" lastFinishedPulling="2025-11-27 16:59:41.377186394 +0000 UTC m=+1293.394626694" observedRunningTime="2025-11-27 16:59:42.113610525 +0000 UTC m=+1294.131050825" watchObservedRunningTime="2025-11-27 16:59:42.130019254 +0000 UTC m=+1294.147459554" Nov 27 16:59:42 crc kubenswrapper[4954]: I1127 16:59:42.161729 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-cs55t" podStartSLOduration=20.161710254 podStartE2EDuration="20.161710254s" podCreationTimestamp="2025-11-27 16:59:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:59:42.155295758 +0000 UTC m=+1294.172736058" watchObservedRunningTime="2025-11-27 16:59:42.161710254 +0000 UTC m=+1294.179150554" Nov 27 16:59:42 crc kubenswrapper[4954]: I1127 16:59:42.196141 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7755474f4f-2m4z8" podStartSLOduration=3.447786105 podStartE2EDuration="39.196112659s" podCreationTimestamp="2025-11-27 16:59:03 +0000 UTC" firstStartedPulling="2025-11-27 16:59:05.168117833 +0000 UTC m=+1257.185558133" lastFinishedPulling="2025-11-27 16:59:40.916444387 +0000 UTC m=+1292.933884687" observedRunningTime="2025-11-27 16:59:42.182007336 +0000 UTC m=+1294.199447656" watchObservedRunningTime="2025-11-27 16:59:42.196112659 +0000 UTC m=+1294.213552959" Nov 27 16:59:42 crc kubenswrapper[4954]: I1127 16:59:42.700061 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 16:59:43 crc kubenswrapper[4954]: I1127 16:59:43.136971 4954 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0db2964c-faef-4154-b502-1231f6762e37","Type":"ContainerStarted","Data":"0ab6aee1db3e5fab4616639290f436d8737ec84f6f9b45031bceb1eb2bd54c22"} Nov 27 16:59:43 crc kubenswrapper[4954]: I1127 16:59:43.146769 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b5c6d8894-l7bzv" event={"ID":"11ddebaa-610a-410a-a161-a5b89d87eb75","Type":"ContainerStarted","Data":"5692def6805698238f8e33a279f2af7daa7f8b8a5293a4047b076a829780cf2e"} Nov 27 16:59:43 crc kubenswrapper[4954]: I1127 16:59:43.157419 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6549c6cdd4-szxmh" event={"ID":"8a9e455d-383c-460b-897e-2234c0611a83","Type":"ContainerStarted","Data":"51fc083b73e2dbbfc048368e65a84834c859cc6a3b10dd95d2a2cc01a0184dbe"} Nov 27 16:59:43 crc kubenswrapper[4954]: I1127 16:59:43.172073 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-b5c6d8894-l7bzv" podStartSLOduration=27.020055573 podStartE2EDuration="31.172046435s" podCreationTimestamp="2025-11-27 16:59:12 +0000 UTC" firstStartedPulling="2025-11-27 16:59:37.220909138 +0000 UTC m=+1289.238349438" lastFinishedPulling="2025-11-27 16:59:41.37289999 +0000 UTC m=+1293.390340300" observedRunningTime="2025-11-27 16:59:43.164596064 +0000 UTC m=+1295.182036364" watchObservedRunningTime="2025-11-27 16:59:43.172046435 +0000 UTC m=+1295.189486745" Nov 27 16:59:43 crc kubenswrapper[4954]: I1127 16:59:43.174376 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"631c9c91-60a4-48e3-aa9a-6333ae35bcf9","Type":"ContainerStarted","Data":"c7cdbb42a093519a5bee22c256db4e19c57cf8f292fc41fe8f3ea328d08e4a51"} Nov 27 16:59:43 crc kubenswrapper[4954]: I1127 16:59:43.217919 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6549c6cdd4-szxmh" podStartSLOduration=17.361820644 podStartE2EDuration="31.217897819s" podCreationTimestamp="2025-11-27 16:59:12 +0000 UTC" firstStartedPulling="2025-11-27 16:59:27.516820665 +0000 UTC m=+1279.534260965" lastFinishedPulling="2025-11-27 16:59:41.37289784 +0000 UTC m=+1293.390338140" observedRunningTime="2025-11-27 16:59:43.18788271 +0000 UTC m=+1295.205323020" watchObservedRunningTime="2025-11-27 16:59:43.217897819 +0000 UTC m=+1295.235338119" Nov 27 16:59:43 crc kubenswrapper[4954]: I1127 16:59:43.277673 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-855db5c9c7-gpqq9" Nov 27 16:59:43 crc kubenswrapper[4954]: I1127 16:59:43.651160 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-fcrnt" Nov 27 16:59:43 crc kubenswrapper[4954]: I1127 16:59:43.680437 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/50892b2e-4e6f-4794-bb8d-e649a9b223fc-config\") pod \"50892b2e-4e6f-4794-bb8d-e649a9b223fc\" (UID: \"50892b2e-4e6f-4794-bb8d-e649a9b223fc\") " Nov 27 16:59:43 crc kubenswrapper[4954]: I1127 16:59:43.680822 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50892b2e-4e6f-4794-bb8d-e649a9b223fc-combined-ca-bundle\") pod \"50892b2e-4e6f-4794-bb8d-e649a9b223fc\" (UID: \"50892b2e-4e6f-4794-bb8d-e649a9b223fc\") " Nov 27 16:59:43 crc kubenswrapper[4954]: I1127 16:59:43.680891 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vrc5\" (UniqueName: \"kubernetes.io/projected/50892b2e-4e6f-4794-bb8d-e649a9b223fc-kube-api-access-8vrc5\") pod \"50892b2e-4e6f-4794-bb8d-e649a9b223fc\" (UID: \"50892b2e-4e6f-4794-bb8d-e649a9b223fc\") " Nov 27 16:59:43 crc kubenswrapper[4954]: I1127 16:59:43.703880 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50892b2e-4e6f-4794-bb8d-e649a9b223fc-kube-api-access-8vrc5" (OuterVolumeSpecName: "kube-api-access-8vrc5") pod "50892b2e-4e6f-4794-bb8d-e649a9b223fc" (UID: "50892b2e-4e6f-4794-bb8d-e649a9b223fc"). InnerVolumeSpecName "kube-api-access-8vrc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:59:43 crc kubenswrapper[4954]: I1127 16:59:43.756738 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50892b2e-4e6f-4794-bb8d-e649a9b223fc-config" (OuterVolumeSpecName: "config") pod "50892b2e-4e6f-4794-bb8d-e649a9b223fc" (UID: "50892b2e-4e6f-4794-bb8d-e649a9b223fc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:59:43 crc kubenswrapper[4954]: I1127 16:59:43.778725 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50892b2e-4e6f-4794-bb8d-e649a9b223fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "50892b2e-4e6f-4794-bb8d-e649a9b223fc" (UID: "50892b2e-4e6f-4794-bb8d-e649a9b223fc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:59:43 crc kubenswrapper[4954]: I1127 16:59:43.784064 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/50892b2e-4e6f-4794-bb8d-e649a9b223fc-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:43 crc kubenswrapper[4954]: I1127 16:59:43.784116 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50892b2e-4e6f-4794-bb8d-e649a9b223fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:43 crc kubenswrapper[4954]: I1127 16:59:43.784137 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vrc5\" (UniqueName: \"kubernetes.io/projected/50892b2e-4e6f-4794-bb8d-e649a9b223fc-kube-api-access-8vrc5\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.124589 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7755474f4f-2m4z8" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.198781 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0db2964c-faef-4154-b502-1231f6762e37","Type":"ContainerStarted","Data":"64362aa36b1fa29fc2a7979add106067232b5d5cd48dd9dcd3d2293580c21015"} Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.201160 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-fcrnt" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.201186 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fcrnt" event={"ID":"50892b2e-4e6f-4794-bb8d-e649a9b223fc","Type":"ContainerDied","Data":"a8e68d9e08b302521405e33205e88197c3552400ccc449a55bc9391074169cce"} Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.201224 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8e68d9e08b302521405e33205e88197c3552400ccc449a55bc9391074169cce" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.204473 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"631c9c91-60a4-48e3-aa9a-6333ae35bcf9","Type":"ContainerStarted","Data":"61de4b1700431f4f877eee1d3c201d33e230900ebdaea173f19e402aedb7df6f"} Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.429318 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=23.429299322 podStartE2EDuration="23.429299322s" podCreationTimestamp="2025-11-27 16:59:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:59:44.229769528 +0000 UTC m=+1296.247209838" watchObservedRunningTime="2025-11-27 16:59:44.429299322 +0000 UTC m=+1296.446739622" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.440641 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-mnxqd"] Nov 27 16:59:44 crc kubenswrapper[4954]: E1127 16:59:44.441184 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50892b2e-4e6f-4794-bb8d-e649a9b223fc" containerName="neutron-db-sync" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.441202 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="50892b2e-4e6f-4794-bb8d-e649a9b223fc" containerName="neutron-db-sync" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 
16:59:44.441412 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="50892b2e-4e6f-4794-bb8d-e649a9b223fc" containerName="neutron-db-sync" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.442647 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-mnxqd" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.463230 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-mnxqd"] Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.486498 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5657c85556-sq27w"] Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.488326 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5657c85556-sq27w" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.494137 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.494323 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-7mmjt" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.494397 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.494605 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.513463 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5657c85556-sq27w"] Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.610233 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5b3b9061-c3c7-43bb-b5bd-cafef342fde0-httpd-config\") pod \"neutron-5657c85556-sq27w\" (UID: \"5b3b9061-c3c7-43bb-b5bd-cafef342fde0\") " pod="openstack/neutron-5657c85556-sq27w" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.610546 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a242acad-7998-4797-930e-9a119e9b0e64-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-mnxqd\" (UID: \"a242acad-7998-4797-930e-9a119e9b0e64\") " pod="openstack/dnsmasq-dns-5ccc5c4795-mnxqd" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.610643 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a242acad-7998-4797-930e-9a119e9b0e64-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-mnxqd\" (UID: \"a242acad-7998-4797-930e-9a119e9b0e64\") " pod="openstack/dnsmasq-dns-5ccc5c4795-mnxqd" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.610692 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a242acad-7998-4797-930e-9a119e9b0e64-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-mnxqd\" (UID: \"a242acad-7998-4797-930e-9a119e9b0e64\") " pod="openstack/dnsmasq-dns-5ccc5c4795-mnxqd" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.610724 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a242acad-7998-4797-930e-9a119e9b0e64-config\") pod 
\"dnsmasq-dns-5ccc5c4795-mnxqd\" (UID: \"a242acad-7998-4797-930e-9a119e9b0e64\") " pod="openstack/dnsmasq-dns-5ccc5c4795-mnxqd" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.610761 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qblq4\" (UniqueName: \"kubernetes.io/projected/5b3b9061-c3c7-43bb-b5bd-cafef342fde0-kube-api-access-qblq4\") pod \"neutron-5657c85556-sq27w\" (UID: \"5b3b9061-c3c7-43bb-b5bd-cafef342fde0\") " pod="openstack/neutron-5657c85556-sq27w" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.610781 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5b3b9061-c3c7-43bb-b5bd-cafef342fde0-config\") pod \"neutron-5657c85556-sq27w\" (UID: \"5b3b9061-c3c7-43bb-b5bd-cafef342fde0\") " pod="openstack/neutron-5657c85556-sq27w" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.610801 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ts49\" (UniqueName: \"kubernetes.io/projected/a242acad-7998-4797-930e-9a119e9b0e64-kube-api-access-9ts49\") pod \"dnsmasq-dns-5ccc5c4795-mnxqd\" (UID: \"a242acad-7998-4797-930e-9a119e9b0e64\") " pod="openstack/dnsmasq-dns-5ccc5c4795-mnxqd" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.610851 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b3b9061-c3c7-43bb-b5bd-cafef342fde0-combined-ca-bundle\") pod \"neutron-5657c85556-sq27w\" (UID: \"5b3b9061-c3c7-43bb-b5bd-cafef342fde0\") " pod="openstack/neutron-5657c85556-sq27w" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.610914 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a242acad-7998-4797-930e-9a119e9b0e64-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-mnxqd\" (UID: \"a242acad-7998-4797-930e-9a119e9b0e64\") " pod="openstack/dnsmasq-dns-5ccc5c4795-mnxqd" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.610933 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b3b9061-c3c7-43bb-b5bd-cafef342fde0-ovndb-tls-certs\") pod \"neutron-5657c85556-sq27w\" (UID: \"5b3b9061-c3c7-43bb-b5bd-cafef342fde0\") " pod="openstack/neutron-5657c85556-sq27w" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.712694 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5b3b9061-c3c7-43bb-b5bd-cafef342fde0-httpd-config\") pod \"neutron-5657c85556-sq27w\" (UID: \"5b3b9061-c3c7-43bb-b5bd-cafef342fde0\") " pod="openstack/neutron-5657c85556-sq27w" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.712748 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a242acad-7998-4797-930e-9a119e9b0e64-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-mnxqd\" (UID: \"a242acad-7998-4797-930e-9a119e9b0e64\") " pod="openstack/dnsmasq-dns-5ccc5c4795-mnxqd" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.713405 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/a242acad-7998-4797-930e-9a119e9b0e64-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-mnxqd\" (UID: \"a242acad-7998-4797-930e-9a119e9b0e64\") " pod="openstack/dnsmasq-dns-5ccc5c4795-mnxqd" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.713456 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a242acad-7998-4797-930e-9a119e9b0e64-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-mnxqd\" (UID: \"a242acad-7998-4797-930e-9a119e9b0e64\") " pod="openstack/dnsmasq-dns-5ccc5c4795-mnxqd" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.713477 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a242acad-7998-4797-930e-9a119e9b0e64-config\") pod \"dnsmasq-dns-5ccc5c4795-mnxqd\" (UID: \"a242acad-7998-4797-930e-9a119e9b0e64\") " pod="openstack/dnsmasq-dns-5ccc5c4795-mnxqd" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.713518 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qblq4\" (UniqueName: \"kubernetes.io/projected/5b3b9061-c3c7-43bb-b5bd-cafef342fde0-kube-api-access-qblq4\") pod \"neutron-5657c85556-sq27w\" (UID: \"5b3b9061-c3c7-43bb-b5bd-cafef342fde0\") " pod="openstack/neutron-5657c85556-sq27w" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.713535 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5b3b9061-c3c7-43bb-b5bd-cafef342fde0-config\") pod \"neutron-5657c85556-sq27w\" (UID: \"5b3b9061-c3c7-43bb-b5bd-cafef342fde0\") " pod="openstack/neutron-5657c85556-sq27w" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.713562 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ts49\" (UniqueName: \"kubernetes.io/projected/a242acad-7998-4797-930e-9a119e9b0e64-kube-api-access-9ts49\") pod \"dnsmasq-dns-5ccc5c4795-mnxqd\" (UID: \"a242acad-7998-4797-930e-9a119e9b0e64\") " pod="openstack/dnsmasq-dns-5ccc5c4795-mnxqd" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.713593 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b3b9061-c3c7-43bb-b5bd-cafef342fde0-combined-ca-bundle\") pod \"neutron-5657c85556-sq27w\" (UID: \"5b3b9061-c3c7-43bb-b5bd-cafef342fde0\") " pod="openstack/neutron-5657c85556-sq27w" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.713661 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a242acad-7998-4797-930e-9a119e9b0e64-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-mnxqd\" (UID: \"a242acad-7998-4797-930e-9a119e9b0e64\") " pod="openstack/dnsmasq-dns-5ccc5c4795-mnxqd" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.713679 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b3b9061-c3c7-43bb-b5bd-cafef342fde0-ovndb-tls-certs\") pod \"neutron-5657c85556-sq27w\" (UID: \"5b3b9061-c3c7-43bb-b5bd-cafef342fde0\") " pod="openstack/neutron-5657c85556-sq27w" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.713742 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a242acad-7998-4797-930e-9a119e9b0e64-dns-swift-storage-0\") pod 
\"dnsmasq-dns-5ccc5c4795-mnxqd\" (UID: \"a242acad-7998-4797-930e-9a119e9b0e64\") " pod="openstack/dnsmasq-dns-5ccc5c4795-mnxqd" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.714415 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a242acad-7998-4797-930e-9a119e9b0e64-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-mnxqd\" (UID: \"a242acad-7998-4797-930e-9a119e9b0e64\") " pod="openstack/dnsmasq-dns-5ccc5c4795-mnxqd" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.716051 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a242acad-7998-4797-930e-9a119e9b0e64-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-mnxqd\" (UID: \"a242acad-7998-4797-930e-9a119e9b0e64\") " pod="openstack/dnsmasq-dns-5ccc5c4795-mnxqd" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.716555 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a242acad-7998-4797-930e-9a119e9b0e64-config\") pod \"dnsmasq-dns-5ccc5c4795-mnxqd\" (UID: \"a242acad-7998-4797-930e-9a119e9b0e64\") " pod="openstack/dnsmasq-dns-5ccc5c4795-mnxqd" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.716819 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a242acad-7998-4797-930e-9a119e9b0e64-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-mnxqd\" (UID: \"a242acad-7998-4797-930e-9a119e9b0e64\") " pod="openstack/dnsmasq-dns-5ccc5c4795-mnxqd" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.723347 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b3b9061-c3c7-43bb-b5bd-cafef342fde0-ovndb-tls-certs\") pod \"neutron-5657c85556-sq27w\" (UID: \"5b3b9061-c3c7-43bb-b5bd-cafef342fde0\") " pod="openstack/neutron-5657c85556-sq27w" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.724388 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5b3b9061-c3c7-43bb-b5bd-cafef342fde0-httpd-config\") pod \"neutron-5657c85556-sq27w\" (UID: \"5b3b9061-c3c7-43bb-b5bd-cafef342fde0\") " pod="openstack/neutron-5657c85556-sq27w" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.726872 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5b3b9061-c3c7-43bb-b5bd-cafef342fde0-config\") pod \"neutron-5657c85556-sq27w\" (UID: \"5b3b9061-c3c7-43bb-b5bd-cafef342fde0\") " pod="openstack/neutron-5657c85556-sq27w" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.730408 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b3b9061-c3c7-43bb-b5bd-cafef342fde0-combined-ca-bundle\") pod \"neutron-5657c85556-sq27w\" (UID: \"5b3b9061-c3c7-43bb-b5bd-cafef342fde0\") " pod="openstack/neutron-5657c85556-sq27w" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.738786 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ts49\" (UniqueName: \"kubernetes.io/projected/a242acad-7998-4797-930e-9a119e9b0e64-kube-api-access-9ts49\") pod \"dnsmasq-dns-5ccc5c4795-mnxqd\" (UID: \"a242acad-7998-4797-930e-9a119e9b0e64\") " pod="openstack/dnsmasq-dns-5ccc5c4795-mnxqd" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.739342 4954 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qblq4\" (UniqueName: \"kubernetes.io/projected/5b3b9061-c3c7-43bb-b5bd-cafef342fde0-kube-api-access-qblq4\") pod \"neutron-5657c85556-sq27w\" (UID: \"5b3b9061-c3c7-43bb-b5bd-cafef342fde0\") " pod="openstack/neutron-5657c85556-sq27w" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.812989 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-mnxqd" Nov 27 16:59:44 crc kubenswrapper[4954]: I1127 16:59:44.841971 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5657c85556-sq27w" Nov 27 16:59:45 crc kubenswrapper[4954]: I1127 16:59:45.244758 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"631c9c91-60a4-48e3-aa9a-6333ae35bcf9","Type":"ContainerStarted","Data":"8c2b287a97e7a4c254ce5883c5b166be57a3c4746fa5c5e10345750028c6dc33"} Nov 27 16:59:45 crc kubenswrapper[4954]: I1127 16:59:45.298378 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=21.298352904 podStartE2EDuration="21.298352904s" podCreationTimestamp="2025-11-27 16:59:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:59:45.292888431 +0000 UTC m=+1297.310328731" watchObservedRunningTime="2025-11-27 16:59:45.298352904 +0000 UTC m=+1297.315793204" Nov 27 16:59:45 crc kubenswrapper[4954]: I1127 16:59:45.600850 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-mnxqd"] Nov 27 16:59:46 crc kubenswrapper[4954]: I1127 16:59:46.131183 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6d65d5b797-gbgfp" Nov 27 16:59:46 crc kubenswrapper[4954]: I1127 16:59:46.285694 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-mnxqd" event={"ID":"a242acad-7998-4797-930e-9a119e9b0e64","Type":"ContainerStarted","Data":"f85787f4e18e6e7396428fe7d34c41066bba9fea06f15c773646e7e0912a7dff"} Nov 27 16:59:47 crc kubenswrapper[4954]: I1127 16:59:47.019946 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5657c85556-sq27w"] Nov 27 16:59:47 crc kubenswrapper[4954]: I1127 16:59:47.262019 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-f6cfb75df-7gbdb"] Nov 27 16:59:47 crc kubenswrapper[4954]: I1127 16:59:47.263443 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f6cfb75df-7gbdb" Nov 27 16:59:47 crc kubenswrapper[4954]: I1127 16:59:47.268238 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Nov 27 16:59:47 crc kubenswrapper[4954]: I1127 16:59:47.268609 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Nov 27 16:59:47 crc kubenswrapper[4954]: I1127 16:59:47.276169 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f6cfb75df-7gbdb"] Nov 27 16:59:47 crc kubenswrapper[4954]: I1127 16:59:47.312121 4954 generic.go:334] "Generic (PLEG): container finished" podID="a242acad-7998-4797-930e-9a119e9b0e64" containerID="bbbbd79913829e10336f0c4cded2ede7c078ba0b870ebc8251a7f3cc15844be6" exitCode=0 Nov 27 16:59:47 crc kubenswrapper[4954]: I1127 16:59:47.312187 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-mnxqd" event={"ID":"a242acad-7998-4797-930e-9a119e9b0e64","Type":"ContainerDied","Data":"bbbbd79913829e10336f0c4cded2ede7c078ba0b870ebc8251a7f3cc15844be6"} Nov 27 16:59:47 crc kubenswrapper[4954]: I1127 16:59:47.400771 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3c0fe668-ab8d-4bad-acdd-da6d230de548-httpd-config\") pod \"neutron-f6cfb75df-7gbdb\" (UID: \"3c0fe668-ab8d-4bad-acdd-da6d230de548\") " pod="openstack/neutron-f6cfb75df-7gbdb" Nov 27 16:59:47 crc kubenswrapper[4954]: I1127 16:59:47.400999 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c0fe668-ab8d-4bad-acdd-da6d230de548-public-tls-certs\") pod \"neutron-f6cfb75df-7gbdb\" (UID: \"3c0fe668-ab8d-4bad-acdd-da6d230de548\") " pod="openstack/neutron-f6cfb75df-7gbdb" Nov 27 16:59:47 crc kubenswrapper[4954]: I1127 16:59:47.401188 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0fe668-ab8d-4bad-acdd-da6d230de548-combined-ca-bundle\") pod \"neutron-f6cfb75df-7gbdb\" (UID: \"3c0fe668-ab8d-4bad-acdd-da6d230de548\") " pod="openstack/neutron-f6cfb75df-7gbdb" Nov 27 16:59:47 crc kubenswrapper[4954]: I1127 16:59:47.401264 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c0fe668-ab8d-4bad-acdd-da6d230de548-ovndb-tls-certs\") pod \"neutron-f6cfb75df-7gbdb\" (UID: \"3c0fe668-ab8d-4bad-acdd-da6d230de548\") " pod="openstack/neutron-f6cfb75df-7gbdb" Nov 27 16:59:47 crc kubenswrapper[4954]: I1127 16:59:47.401359 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c0fe668-ab8d-4bad-acdd-da6d230de548-internal-tls-certs\") pod \"neutron-f6cfb75df-7gbdb\" (UID: \"3c0fe668-ab8d-4bad-acdd-da6d230de548\") " pod="openstack/neutron-f6cfb75df-7gbdb" Nov 27 16:59:47 crc kubenswrapper[4954]: I1127 16:59:47.401442 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3c0fe668-ab8d-4bad-acdd-da6d230de548-config\") pod \"neutron-f6cfb75df-7gbdb\" (UID: \"3c0fe668-ab8d-4bad-acdd-da6d230de548\") " pod="openstack/neutron-f6cfb75df-7gbdb" Nov 27 16:59:47 crc kubenswrapper[4954]: I1127 
16:59:47.401507 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b9gx\" (UniqueName: \"kubernetes.io/projected/3c0fe668-ab8d-4bad-acdd-da6d230de548-kube-api-access-8b9gx\") pod \"neutron-f6cfb75df-7gbdb\" (UID: \"3c0fe668-ab8d-4bad-acdd-da6d230de548\") " pod="openstack/neutron-f6cfb75df-7gbdb" Nov 27 16:59:47 crc kubenswrapper[4954]: I1127 16:59:47.503306 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3c0fe668-ab8d-4bad-acdd-da6d230de548-httpd-config\") pod \"neutron-f6cfb75df-7gbdb\" (UID: \"3c0fe668-ab8d-4bad-acdd-da6d230de548\") " pod="openstack/neutron-f6cfb75df-7gbdb" Nov 27 16:59:47 crc kubenswrapper[4954]: I1127 16:59:47.503380 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c0fe668-ab8d-4bad-acdd-da6d230de548-public-tls-certs\") pod \"neutron-f6cfb75df-7gbdb\" (UID: \"3c0fe668-ab8d-4bad-acdd-da6d230de548\") " pod="openstack/neutron-f6cfb75df-7gbdb" Nov 27 16:59:47 crc kubenswrapper[4954]: I1127 16:59:47.503463 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0fe668-ab8d-4bad-acdd-da6d230de548-combined-ca-bundle\") pod \"neutron-f6cfb75df-7gbdb\" (UID: \"3c0fe668-ab8d-4bad-acdd-da6d230de548\") " pod="openstack/neutron-f6cfb75df-7gbdb" Nov 27 16:59:47 crc kubenswrapper[4954]: I1127 16:59:47.503489 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c0fe668-ab8d-4bad-acdd-da6d230de548-ovndb-tls-certs\") pod \"neutron-f6cfb75df-7gbdb\" (UID: \"3c0fe668-ab8d-4bad-acdd-da6d230de548\") " pod="openstack/neutron-f6cfb75df-7gbdb" Nov 27 16:59:47 crc kubenswrapper[4954]: I1127 16:59:47.503521 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c0fe668-ab8d-4bad-acdd-da6d230de548-internal-tls-certs\") pod \"neutron-f6cfb75df-7gbdb\" (UID: \"3c0fe668-ab8d-4bad-acdd-da6d230de548\") " pod="openstack/neutron-f6cfb75df-7gbdb" Nov 27 16:59:47 crc kubenswrapper[4954]: I1127 16:59:47.503553 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3c0fe668-ab8d-4bad-acdd-da6d230de548-config\") pod \"neutron-f6cfb75df-7gbdb\" (UID: \"3c0fe668-ab8d-4bad-acdd-da6d230de548\") " pod="openstack/neutron-f6cfb75df-7gbdb" Nov 27 16:59:47 crc kubenswrapper[4954]: I1127 16:59:47.503573 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b9gx\" (UniqueName: \"kubernetes.io/projected/3c0fe668-ab8d-4bad-acdd-da6d230de548-kube-api-access-8b9gx\") pod \"neutron-f6cfb75df-7gbdb\" (UID: \"3c0fe668-ab8d-4bad-acdd-da6d230de548\") " pod="openstack/neutron-f6cfb75df-7gbdb" Nov 27 16:59:47 crc kubenswrapper[4954]: I1127 16:59:47.514643 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c0fe668-ab8d-4bad-acdd-da6d230de548-ovndb-tls-certs\") pod \"neutron-f6cfb75df-7gbdb\" (UID: \"3c0fe668-ab8d-4bad-acdd-da6d230de548\") " pod="openstack/neutron-f6cfb75df-7gbdb" Nov 27 16:59:47 crc kubenswrapper[4954]: I1127 16:59:47.516388 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/3c0fe668-ab8d-4bad-acdd-da6d230de548-httpd-config\") pod \"neutron-f6cfb75df-7gbdb\" (UID: \"3c0fe668-ab8d-4bad-acdd-da6d230de548\") " pod="openstack/neutron-f6cfb75df-7gbdb" Nov 27 16:59:47 crc kubenswrapper[4954]: I1127 16:59:47.518283 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c0fe668-ab8d-4bad-acdd-da6d230de548-public-tls-certs\") pod \"neutron-f6cfb75df-7gbdb\" (UID: \"3c0fe668-ab8d-4bad-acdd-da6d230de548\") " pod="openstack/neutron-f6cfb75df-7gbdb" Nov 27 16:59:47 crc kubenswrapper[4954]: I1127 16:59:47.526548 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c0fe668-ab8d-4bad-acdd-da6d230de548-internal-tls-certs\") pod \"neutron-f6cfb75df-7gbdb\" (UID: \"3c0fe668-ab8d-4bad-acdd-da6d230de548\") " pod="openstack/neutron-f6cfb75df-7gbdb" Nov 27 16:59:47 crc kubenswrapper[4954]: I1127 16:59:47.534871 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0fe668-ab8d-4bad-acdd-da6d230de548-combined-ca-bundle\") pod \"neutron-f6cfb75df-7gbdb\" (UID: \"3c0fe668-ab8d-4bad-acdd-da6d230de548\") " pod="openstack/neutron-f6cfb75df-7gbdb" Nov 27 16:59:47 crc kubenswrapper[4954]: I1127 16:59:47.537005 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b9gx\" (UniqueName: \"kubernetes.io/projected/3c0fe668-ab8d-4bad-acdd-da6d230de548-kube-api-access-8b9gx\") pod \"neutron-f6cfb75df-7gbdb\" (UID: \"3c0fe668-ab8d-4bad-acdd-da6d230de548\") " pod="openstack/neutron-f6cfb75df-7gbdb" Nov 27 16:59:47 crc kubenswrapper[4954]: I1127 16:59:47.547834 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3c0fe668-ab8d-4bad-acdd-da6d230de548-config\") pod \"neutron-f6cfb75df-7gbdb\" (UID: \"3c0fe668-ab8d-4bad-acdd-da6d230de548\") " pod="openstack/neutron-f6cfb75df-7gbdb" Nov 27 16:59:47 crc kubenswrapper[4954]: I1127 16:59:47.593243 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f6cfb75df-7gbdb" Nov 27 16:59:48 crc kubenswrapper[4954]: I1127 16:59:48.325715 4954 generic.go:334] "Generic (PLEG): container finished" podID="b9758394-0bfc-487a-99b4-a3583a2c97b0" containerID="96ab67ada370f118a852be06f22b4780ef4f10b62ac840007ffbf097403f3c43" exitCode=0 Nov 27 16:59:48 crc kubenswrapper[4954]: I1127 16:59:48.325755 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cs55t" event={"ID":"b9758394-0bfc-487a-99b4-a3583a2c97b0","Type":"ContainerDied","Data":"96ab67ada370f118a852be06f22b4780ef4f10b62ac840007ffbf097403f3c43"} Nov 27 16:59:49 crc kubenswrapper[4954]: I1127 16:59:49.828869 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-cs55t" Nov 27 16:59:49 crc kubenswrapper[4954]: I1127 16:59:49.951696 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9758394-0bfc-487a-99b4-a3583a2c97b0-scripts\") pod \"b9758394-0bfc-487a-99b4-a3583a2c97b0\" (UID: \"b9758394-0bfc-487a-99b4-a3583a2c97b0\") " Nov 27 16:59:49 crc kubenswrapper[4954]: I1127 16:59:49.951753 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8s22\" (UniqueName: \"kubernetes.io/projected/b9758394-0bfc-487a-99b4-a3583a2c97b0-kube-api-access-x8s22\") pod \"b9758394-0bfc-487a-99b4-a3583a2c97b0\" (UID: \"b9758394-0bfc-487a-99b4-a3583a2c97b0\") " Nov 27 16:59:49 crc kubenswrapper[4954]: I1127 16:59:49.951858 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b9758394-0bfc-487a-99b4-a3583a2c97b0-fernet-keys\") pod \"b9758394-0bfc-487a-99b4-a3583a2c97b0\" (UID: \"b9758394-0bfc-487a-99b4-a3583a2c97b0\") " Nov 27 16:59:49 crc kubenswrapper[4954]: I1127 16:59:49.951877 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9758394-0bfc-487a-99b4-a3583a2c97b0-combined-ca-bundle\") pod \"b9758394-0bfc-487a-99b4-a3583a2c97b0\" (UID: \"b9758394-0bfc-487a-99b4-a3583a2c97b0\") " Nov 27 16:59:49 crc kubenswrapper[4954]: I1127 16:59:49.951953 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9758394-0bfc-487a-99b4-a3583a2c97b0-config-data\") pod \"b9758394-0bfc-487a-99b4-a3583a2c97b0\" (UID: \"b9758394-0bfc-487a-99b4-a3583a2c97b0\") " Nov 27 16:59:49 crc kubenswrapper[4954]: I1127 16:59:49.951978 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b9758394-0bfc-487a-99b4-a3583a2c97b0-credential-keys\") pod \"b9758394-0bfc-487a-99b4-a3583a2c97b0\" (UID: \"b9758394-0bfc-487a-99b4-a3583a2c97b0\") " Nov 27 16:59:49 crc kubenswrapper[4954]: I1127 16:59:49.958808 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9758394-0bfc-487a-99b4-a3583a2c97b0-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b9758394-0bfc-487a-99b4-a3583a2c97b0" (UID: "b9758394-0bfc-487a-99b4-a3583a2c97b0"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:59:49 crc kubenswrapper[4954]: I1127 16:59:49.959162 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9758394-0bfc-487a-99b4-a3583a2c97b0-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b9758394-0bfc-487a-99b4-a3583a2c97b0" (UID: "b9758394-0bfc-487a-99b4-a3583a2c97b0"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:59:49 crc kubenswrapper[4954]: I1127 16:59:49.973556 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9758394-0bfc-487a-99b4-a3583a2c97b0-kube-api-access-x8s22" (OuterVolumeSpecName: "kube-api-access-x8s22") pod "b9758394-0bfc-487a-99b4-a3583a2c97b0" (UID: "b9758394-0bfc-487a-99b4-a3583a2c97b0"). InnerVolumeSpecName "kube-api-access-x8s22". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:59:49 crc kubenswrapper[4954]: I1127 16:59:49.978646 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9758394-0bfc-487a-99b4-a3583a2c97b0-scripts" (OuterVolumeSpecName: "scripts") pod "b9758394-0bfc-487a-99b4-a3583a2c97b0" (UID: "b9758394-0bfc-487a-99b4-a3583a2c97b0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:59:49 crc kubenswrapper[4954]: I1127 16:59:49.985698 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9758394-0bfc-487a-99b4-a3583a2c97b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b9758394-0bfc-487a-99b4-a3583a2c97b0" (UID: "b9758394-0bfc-487a-99b4-a3583a2c97b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:59:49 crc kubenswrapper[4954]: I1127 16:59:49.985776 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9758394-0bfc-487a-99b4-a3583a2c97b0-config-data" (OuterVolumeSpecName: "config-data") pod "b9758394-0bfc-487a-99b4-a3583a2c97b0" (UID: "b9758394-0bfc-487a-99b4-a3583a2c97b0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:59:50 crc kubenswrapper[4954]: I1127 16:59:50.053818 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9758394-0bfc-487a-99b4-a3583a2c97b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:50 crc kubenswrapper[4954]: I1127 16:59:50.053854 4954 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b9758394-0bfc-487a-99b4-a3583a2c97b0-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:50 crc kubenswrapper[4954]: I1127 16:59:50.053863 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9758394-0bfc-487a-99b4-a3583a2c97b0-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:50 crc kubenswrapper[4954]: I1127 16:59:50.053872 4954 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b9758394-0bfc-487a-99b4-a3583a2c97b0-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:50 crc kubenswrapper[4954]: I1127 16:59:50.053882 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9758394-0bfc-487a-99b4-a3583a2c97b0-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:50 crc kubenswrapper[4954]: I1127 16:59:50.053890 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8s22\" (UniqueName: \"kubernetes.io/projected/b9758394-0bfc-487a-99b4-a3583a2c97b0-kube-api-access-x8s22\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:50 crc kubenswrapper[4954]: I1127 16:59:50.351186 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5657c85556-sq27w" event={"ID":"5b3b9061-c3c7-43bb-b5bd-cafef342fde0","Type":"ContainerStarted","Data":"38e064080c4eac7f0c9b0fda1b374b21fff521b22ee9dccf9164f69f74aa4514"} Nov 27 16:59:50 crc kubenswrapper[4954]: I1127 16:59:50.353709 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cs55t" 
event={"ID":"b9758394-0bfc-487a-99b4-a3583a2c97b0","Type":"ContainerDied","Data":"a2925113b84499a2c4d58646871dbee25e13bce49f8258c60f18ac334d9334b8"} Nov 27 16:59:50 crc kubenswrapper[4954]: I1127 16:59:50.353753 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2925113b84499a2c4d58646871dbee25e13bce49f8258c60f18ac334d9334b8" Nov 27 16:59:50 crc kubenswrapper[4954]: I1127 16:59:50.354235 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-cs55t" Nov 27 16:59:50 crc kubenswrapper[4954]: I1127 16:59:50.550916 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-68775c76df-2ppbs"] Nov 27 16:59:50 crc kubenswrapper[4954]: E1127 16:59:50.551353 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9758394-0bfc-487a-99b4-a3583a2c97b0" containerName="keystone-bootstrap" Nov 27 16:59:50 crc kubenswrapper[4954]: I1127 16:59:50.551370 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9758394-0bfc-487a-99b4-a3583a2c97b0" containerName="keystone-bootstrap" Nov 27 16:59:50 crc kubenswrapper[4954]: I1127 16:59:50.551607 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9758394-0bfc-487a-99b4-a3583a2c97b0" containerName="keystone-bootstrap" Nov 27 16:59:50 crc kubenswrapper[4954]: I1127 16:59:50.552248 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-68775c76df-2ppbs" Nov 27 16:59:50 crc kubenswrapper[4954]: I1127 16:59:50.559761 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 27 16:59:50 crc kubenswrapper[4954]: I1127 16:59:50.560250 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Nov 27 16:59:50 crc kubenswrapper[4954]: I1127 16:59:50.560453 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 27 16:59:50 crc kubenswrapper[4954]: I1127 16:59:50.560468 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Nov 27 16:59:50 crc kubenswrapper[4954]: I1127 16:59:50.560837 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 27 16:59:50 crc kubenswrapper[4954]: I1127 16:59:50.560836 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-cdxsk" Nov 27 16:59:50 crc kubenswrapper[4954]: I1127 16:59:50.596955 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-68775c76df-2ppbs"] Nov 27 16:59:50 crc kubenswrapper[4954]: I1127 16:59:50.679345 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn8qb\" (UniqueName: \"kubernetes.io/projected/a541738e-915f-413b-9b84-d57553ebc170-kube-api-access-zn8qb\") pod \"keystone-68775c76df-2ppbs\" (UID: \"a541738e-915f-413b-9b84-d57553ebc170\") " pod="openstack/keystone-68775c76df-2ppbs" Nov 27 16:59:50 crc kubenswrapper[4954]: I1127 16:59:50.679398 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a541738e-915f-413b-9b84-d57553ebc170-fernet-keys\") pod \"keystone-68775c76df-2ppbs\" (UID: \"a541738e-915f-413b-9b84-d57553ebc170\") " pod="openstack/keystone-68775c76df-2ppbs" Nov 27 16:59:50 crc kubenswrapper[4954]: I1127 16:59:50.679455 4954 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a541738e-915f-413b-9b84-d57553ebc170-internal-tls-certs\") pod \"keystone-68775c76df-2ppbs\" (UID: \"a541738e-915f-413b-9b84-d57553ebc170\") " pod="openstack/keystone-68775c76df-2ppbs" Nov 27 16:59:50 crc kubenswrapper[4954]: I1127 16:59:50.679559 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a541738e-915f-413b-9b84-d57553ebc170-scripts\") pod \"keystone-68775c76df-2ppbs\" (UID: \"a541738e-915f-413b-9b84-d57553ebc170\") " pod="openstack/keystone-68775c76df-2ppbs" Nov 27 16:59:50 crc kubenswrapper[4954]: I1127 16:59:50.679610 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a541738e-915f-413b-9b84-d57553ebc170-credential-keys\") pod \"keystone-68775c76df-2ppbs\" (UID: \"a541738e-915f-413b-9b84-d57553ebc170\") " pod="openstack/keystone-68775c76df-2ppbs" Nov 27 16:59:50 crc kubenswrapper[4954]: I1127 16:59:50.679640 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a541738e-915f-413b-9b84-d57553ebc170-public-tls-certs\") pod \"keystone-68775c76df-2ppbs\" (UID: \"a541738e-915f-413b-9b84-d57553ebc170\") " pod="openstack/keystone-68775c76df-2ppbs" Nov 27 16:59:50 crc kubenswrapper[4954]: I1127 16:59:50.679704 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a541738e-915f-413b-9b84-d57553ebc170-config-data\") pod \"keystone-68775c76df-2ppbs\" (UID: \"a541738e-915f-413b-9b84-d57553ebc170\") " pod="openstack/keystone-68775c76df-2ppbs" Nov 27 16:59:50 crc kubenswrapper[4954]: I1127 16:59:50.679734 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a541738e-915f-413b-9b84-d57553ebc170-combined-ca-bundle\") pod \"keystone-68775c76df-2ppbs\" (UID: \"a541738e-915f-413b-9b84-d57553ebc170\") " pod="openstack/keystone-68775c76df-2ppbs" Nov 27 16:59:50 crc kubenswrapper[4954]: I1127 16:59:50.781167 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a541738e-915f-413b-9b84-d57553ebc170-config-data\") pod \"keystone-68775c76df-2ppbs\" (UID: \"a541738e-915f-413b-9b84-d57553ebc170\") " pod="openstack/keystone-68775c76df-2ppbs" Nov 27 16:59:50 crc kubenswrapper[4954]: I1127 16:59:50.781226 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a541738e-915f-413b-9b84-d57553ebc170-combined-ca-bundle\") pod \"keystone-68775c76df-2ppbs\" (UID: \"a541738e-915f-413b-9b84-d57553ebc170\") " pod="openstack/keystone-68775c76df-2ppbs" Nov 27 16:59:50 crc kubenswrapper[4954]: I1127 16:59:50.781339 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zn8qb\" (UniqueName: \"kubernetes.io/projected/a541738e-915f-413b-9b84-d57553ebc170-kube-api-access-zn8qb\") pod \"keystone-68775c76df-2ppbs\" (UID: \"a541738e-915f-413b-9b84-d57553ebc170\") " pod="openstack/keystone-68775c76df-2ppbs" Nov 27 16:59:50 crc kubenswrapper[4954]: I1127 16:59:50.781363 4954 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a541738e-915f-413b-9b84-d57553ebc170-fernet-keys\") pod \"keystone-68775c76df-2ppbs\" (UID: \"a541738e-915f-413b-9b84-d57553ebc170\") " pod="openstack/keystone-68775c76df-2ppbs" Nov 27 16:59:50 crc kubenswrapper[4954]: I1127 16:59:50.781432 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a541738e-915f-413b-9b84-d57553ebc170-internal-tls-certs\") pod \"keystone-68775c76df-2ppbs\" (UID: \"a541738e-915f-413b-9b84-d57553ebc170\") " pod="openstack/keystone-68775c76df-2ppbs" Nov 27 16:59:50 crc kubenswrapper[4954]: I1127 16:59:50.781598 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a541738e-915f-413b-9b84-d57553ebc170-scripts\") pod \"keystone-68775c76df-2ppbs\" (UID: \"a541738e-915f-413b-9b84-d57553ebc170\") " pod="openstack/keystone-68775c76df-2ppbs" Nov 27 16:59:50 crc kubenswrapper[4954]: I1127 16:59:50.781632 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a541738e-915f-413b-9b84-d57553ebc170-credential-keys\") pod \"keystone-68775c76df-2ppbs\" (UID: \"a541738e-915f-413b-9b84-d57553ebc170\") " pod="openstack/keystone-68775c76df-2ppbs" Nov 27 16:59:50 crc kubenswrapper[4954]: I1127 16:59:50.781662 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a541738e-915f-413b-9b84-d57553ebc170-public-tls-certs\") pod \"keystone-68775c76df-2ppbs\" (UID: \"a541738e-915f-413b-9b84-d57553ebc170\") " pod="openstack/keystone-68775c76df-2ppbs" Nov 27 16:59:50 crc kubenswrapper[4954]: I1127 16:59:50.794527 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a541738e-915f-413b-9b84-d57553ebc170-public-tls-certs\") pod \"keystone-68775c76df-2ppbs\" (UID: \"a541738e-915f-413b-9b84-d57553ebc170\") " pod="openstack/keystone-68775c76df-2ppbs" Nov 27 16:59:50 crc kubenswrapper[4954]: I1127 16:59:50.799237 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a541738e-915f-413b-9b84-d57553ebc170-fernet-keys\") pod \"keystone-68775c76df-2ppbs\" (UID: \"a541738e-915f-413b-9b84-d57553ebc170\") " pod="openstack/keystone-68775c76df-2ppbs" Nov 27 16:59:50 crc kubenswrapper[4954]: I1127 16:59:50.799307 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a541738e-915f-413b-9b84-d57553ebc170-config-data\") pod \"keystone-68775c76df-2ppbs\" (UID: \"a541738e-915f-413b-9b84-d57553ebc170\") " pod="openstack/keystone-68775c76df-2ppbs" Nov 27 16:59:50 crc kubenswrapper[4954]: I1127 16:59:50.802899 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a541738e-915f-413b-9b84-d57553ebc170-scripts\") pod \"keystone-68775c76df-2ppbs\" (UID: \"a541738e-915f-413b-9b84-d57553ebc170\") " pod="openstack/keystone-68775c76df-2ppbs" Nov 27 16:59:50 crc kubenswrapper[4954]: I1127 16:59:50.802208 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a541738e-915f-413b-9b84-d57553ebc170-combined-ca-bundle\") pod 
\"keystone-68775c76df-2ppbs\" (UID: \"a541738e-915f-413b-9b84-d57553ebc170\") " pod="openstack/keystone-68775c76df-2ppbs" Nov 27 16:59:50 crc kubenswrapper[4954]: I1127 16:59:50.816278 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a541738e-915f-413b-9b84-d57553ebc170-credential-keys\") pod \"keystone-68775c76df-2ppbs\" (UID: \"a541738e-915f-413b-9b84-d57553ebc170\") " pod="openstack/keystone-68775c76df-2ppbs" Nov 27 16:59:50 crc kubenswrapper[4954]: I1127 16:59:50.817146 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn8qb\" (UniqueName: \"kubernetes.io/projected/a541738e-915f-413b-9b84-d57553ebc170-kube-api-access-zn8qb\") pod \"keystone-68775c76df-2ppbs\" (UID: \"a541738e-915f-413b-9b84-d57553ebc170\") " pod="openstack/keystone-68775c76df-2ppbs" Nov 27 16:59:50 crc kubenswrapper[4954]: I1127 16:59:50.817224 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a541738e-915f-413b-9b84-d57553ebc170-internal-tls-certs\") pod \"keystone-68775c76df-2ppbs\" (UID: \"a541738e-915f-413b-9b84-d57553ebc170\") " pod="openstack/keystone-68775c76df-2ppbs" Nov 27 16:59:50 crc kubenswrapper[4954]: I1127 16:59:50.894446 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-68775c76df-2ppbs" Nov 27 16:59:52 crc kubenswrapper[4954]: I1127 16:59:52.240051 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 27 16:59:52 crc kubenswrapper[4954]: I1127 16:59:52.243065 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 27 16:59:52 crc kubenswrapper[4954]: I1127 16:59:52.243111 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 27 16:59:52 crc kubenswrapper[4954]: I1127 16:59:52.243125 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 27 16:59:52 crc kubenswrapper[4954]: I1127 16:59:52.295301 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 27 16:59:52 crc kubenswrapper[4954]: I1127 16:59:52.322968 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 27 16:59:52 crc kubenswrapper[4954]: I1127 16:59:52.543594 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6549c6cdd4-szxmh" Nov 27 16:59:52 crc kubenswrapper[4954]: I1127 16:59:52.543908 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6549c6cdd4-szxmh" Nov 27 16:59:52 crc kubenswrapper[4954]: I1127 16:59:52.547765 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6549c6cdd4-szxmh" podUID="8a9e455d-383c-460b-897e-2234c0611a83" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.145:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.145:8443: connect: connection refused" Nov 27 16:59:52 crc kubenswrapper[4954]: I1127 16:59:52.641088 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-b5c6d8894-l7bzv" Nov 27 16:59:52 crc kubenswrapper[4954]: I1127 16:59:52.645099 4954 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/horizon-b5c6d8894-l7bzv" Nov 27 16:59:52 crc kubenswrapper[4954]: I1127 16:59:52.650127 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-b5c6d8894-l7bzv" podUID="11ddebaa-610a-410a-a161-a5b89d87eb75" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Nov 27 16:59:52 crc kubenswrapper[4954]: I1127 16:59:52.819139 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-68775c76df-2ppbs"] Nov 27 16:59:53 crc kubenswrapper[4954]: I1127 16:59:53.051833 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f6cfb75df-7gbdb"] Nov 27 16:59:53 crc kubenswrapper[4954]: W1127 16:59:53.062773 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c0fe668_ab8d_4bad_acdd_da6d230de548.slice/crio-77567f859b33d4e46941a4bad2bd9856e96818aa15b038d30be5a5e2db229ea9 WatchSource:0}: Error finding container 77567f859b33d4e46941a4bad2bd9856e96818aa15b038d30be5a5e2db229ea9: Status 404 returned error can't find the container with id 77567f859b33d4e46941a4bad2bd9856e96818aa15b038d30be5a5e2db229ea9 Nov 27 16:59:53 crc kubenswrapper[4954]: I1127 16:59:53.382443 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5657c85556-sq27w" event={"ID":"5b3b9061-c3c7-43bb-b5bd-cafef342fde0","Type":"ContainerStarted","Data":"4393c3ec99187d36baafd3a746d542662ad0eb0e5ceb35b60f2d4a600e291fea"} Nov 27 16:59:53 crc kubenswrapper[4954]: I1127 16:59:53.382490 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5657c85556-sq27w" event={"ID":"5b3b9061-c3c7-43bb-b5bd-cafef342fde0","Type":"ContainerStarted","Data":"48a8afd527e4cec4880cfb003cfaec9aff3c9c8c58342474a1b77e5c21366e88"} Nov 27 16:59:53 crc kubenswrapper[4954]: I1127 16:59:53.383011 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5657c85556-sq27w" Nov 27 16:59:53 crc kubenswrapper[4954]: I1127 16:59:53.383721 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-6vl85" event={"ID":"0123682b-b80c-436f-bf07-6252dc3df9bc","Type":"ContainerStarted","Data":"4dd8cb7ee521604965cffb6715b2ec94f9b2a1336df00b7533b148e731686fb0"} Nov 27 16:59:53 crc kubenswrapper[4954]: I1127 16:59:53.386975 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f6cfb75df-7gbdb" event={"ID":"3c0fe668-ab8d-4bad-acdd-da6d230de548","Type":"ContainerStarted","Data":"77567f859b33d4e46941a4bad2bd9856e96818aa15b038d30be5a5e2db229ea9"} Nov 27 16:59:53 crc kubenswrapper[4954]: I1127 16:59:53.389287 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70a1a927-b24a-4da3-93f1-9dc67f75c4ba","Type":"ContainerStarted","Data":"2f08b7ce7474b7e23ee72ce58fe803c1638962b91e1c59b50e005beb8b358208"} Nov 27 16:59:53 crc kubenswrapper[4954]: I1127 16:59:53.391299 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-68775c76df-2ppbs" event={"ID":"a541738e-915f-413b-9b84-d57553ebc170","Type":"ContainerStarted","Data":"015188f0aecb7aa7af69e24fc382279a729e7127874a2df292954e528c88dd5e"} Nov 27 16:59:53 crc kubenswrapper[4954]: I1127 16:59:53.391341 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-68775c76df-2ppbs" 
event={"ID":"a541738e-915f-413b-9b84-d57553ebc170","Type":"ContainerStarted","Data":"610e0625774e76d11b5f19ae9b0ae5f1ff4d5cd0a9dfab8e2e6f3984d54f28d5"} Nov 27 16:59:53 crc kubenswrapper[4954]: I1127 16:59:53.391977 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-68775c76df-2ppbs" Nov 27 16:59:53 crc kubenswrapper[4954]: I1127 16:59:53.394198 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-mnxqd" event={"ID":"a242acad-7998-4797-930e-9a119e9b0e64","Type":"ContainerStarted","Data":"a334c38c0087fa3fc6cd017dd11ef8c06ad22c59644c5f3d92de9e0596138ac0"} Nov 27 16:59:53 crc kubenswrapper[4954]: I1127 16:59:53.394940 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc5c4795-mnxqd" Nov 27 16:59:53 crc kubenswrapper[4954]: I1127 16:59:53.430592 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5657c85556-sq27w" podStartSLOduration=9.430556979 podStartE2EDuration="9.430556979s" podCreationTimestamp="2025-11-27 16:59:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:59:53.410883962 +0000 UTC m=+1305.428324262" watchObservedRunningTime="2025-11-27 16:59:53.430556979 +0000 UTC m=+1305.447997279" Nov 27 16:59:53 crc kubenswrapper[4954]: I1127 16:59:53.438696 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-6vl85" podStartSLOduration=2.684118243 podStartE2EDuration="50.438676237s" podCreationTimestamp="2025-11-27 16:59:03 +0000 UTC" firstStartedPulling="2025-11-27 16:59:04.778407971 +0000 UTC m=+1256.795848271" lastFinishedPulling="2025-11-27 16:59:52.532965965 +0000 UTC m=+1304.550406265" observedRunningTime="2025-11-27 16:59:53.424408701 +0000 UTC m=+1305.441849001" watchObservedRunningTime="2025-11-27 16:59:53.438676237 +0000 UTC m=+1305.456116537" Nov 27 16:59:53 crc kubenswrapper[4954]: I1127 16:59:53.443766 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc5c4795-mnxqd" podStartSLOduration=9.443748 podStartE2EDuration="9.443748s" podCreationTimestamp="2025-11-27 16:59:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:59:53.439193249 +0000 UTC m=+1305.456633549" watchObservedRunningTime="2025-11-27 16:59:53.443748 +0000 UTC m=+1305.461188300" Nov 27 16:59:53 crc kubenswrapper[4954]: I1127 16:59:53.470032 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-68775c76df-2ppbs" podStartSLOduration=3.470007207 podStartE2EDuration="3.470007207s" podCreationTimestamp="2025-11-27 16:59:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:59:53.462142457 +0000 UTC m=+1305.479582757" watchObservedRunningTime="2025-11-27 16:59:53.470007207 +0000 UTC m=+1305.487447507" Nov 27 16:59:54 crc kubenswrapper[4954]: I1127 16:59:54.410891 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f6cfb75df-7gbdb" event={"ID":"3c0fe668-ab8d-4bad-acdd-da6d230de548","Type":"ContainerStarted","Data":"719697f4a4aa24acdfc1d70820fa59c83bb94e35186fbb73371d9b0360b5360e"} Nov 27 16:59:54 crc kubenswrapper[4954]: I1127 16:59:54.412666 4954 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/neutron-f6cfb75df-7gbdb" event={"ID":"3c0fe668-ab8d-4bad-acdd-da6d230de548","Type":"ContainerStarted","Data":"26074fd4bd9d6d23a483bdf4fdfe221a7ceb6be6e8266b5c948fa5bf81faed32"} Nov 27 16:59:54 crc kubenswrapper[4954]: I1127 16:59:54.412755 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-f6cfb75df-7gbdb" Nov 27 16:59:54 crc kubenswrapper[4954]: I1127 16:59:54.423358 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-x4n64" event={"ID":"58c181b9-bc11-4747-84ad-5302f1265507","Type":"ContainerStarted","Data":"c7f389f6069feb0c78353dad9ae7b9a0245dfcd17c6f4f3ea3f1ab0fbba286e8"} Nov 27 16:59:54 crc kubenswrapper[4954]: I1127 16:59:54.426865 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-hwpt7" event={"ID":"1bce3669-a584-4f00-8043-90be729c9fa7","Type":"ContainerStarted","Data":"901d635a5ed3dc985c4adf2144e2377826394d07154039a33c23b126755f620f"} Nov 27 16:59:54 crc kubenswrapper[4954]: I1127 16:59:54.465283 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-f6cfb75df-7gbdb" podStartSLOduration=7.465260693 podStartE2EDuration="7.465260693s" podCreationTimestamp="2025-11-27 16:59:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:59:54.464860743 +0000 UTC m=+1306.482301043" watchObservedRunningTime="2025-11-27 16:59:54.465260693 +0000 UTC m=+1306.482700993" Nov 27 16:59:54 crc kubenswrapper[4954]: I1127 16:59:54.488027 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-hwpt7" podStartSLOduration=3.013285454 podStartE2EDuration="52.488006345s" podCreationTimestamp="2025-11-27 16:59:02 +0000 UTC" firstStartedPulling="2025-11-27 16:59:04.701142565 +0000 UTC m=+1256.718582865" lastFinishedPulling="2025-11-27 16:59:54.175863456 +0000 UTC m=+1306.193303756" observedRunningTime="2025-11-27 16:59:54.481495247 +0000 UTC m=+1306.498935547" watchObservedRunningTime="2025-11-27 16:59:54.488006345 +0000 UTC m=+1306.505446645" Nov 27 16:59:54 crc kubenswrapper[4954]: I1127 16:59:54.521610 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-x4n64" podStartSLOduration=4.527903129 podStartE2EDuration="52.52157026s" podCreationTimestamp="2025-11-27 16:59:02 +0000 UTC" firstStartedPulling="2025-11-27 16:59:04.484253638 +0000 UTC m=+1256.501693938" lastFinishedPulling="2025-11-27 16:59:52.477920759 +0000 UTC m=+1304.495361069" observedRunningTime="2025-11-27 16:59:54.508969154 +0000 UTC m=+1306.526409454" watchObservedRunningTime="2025-11-27 16:59:54.52157026 +0000 UTC m=+1306.539010560" Nov 27 16:59:54 crc kubenswrapper[4954]: I1127 16:59:54.585845 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 27 16:59:54 crc kubenswrapper[4954]: I1127 16:59:54.586184 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 27 16:59:54 crc kubenswrapper[4954]: I1127 16:59:54.586249 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 27 16:59:54 crc kubenswrapper[4954]: I1127 16:59:54.586307 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 27 16:59:54 crc 
kubenswrapper[4954]: I1127 16:59:54.651897 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 27 16:59:54 crc kubenswrapper[4954]: I1127 16:59:54.743018 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 27 16:59:55 crc kubenswrapper[4954]: I1127 16:59:55.210800 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 27 16:59:55 crc kubenswrapper[4954]: I1127 16:59:55.210985 4954 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 27 16:59:55 crc kubenswrapper[4954]: I1127 16:59:55.214336 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 27 16:59:56 crc kubenswrapper[4954]: I1127 16:59:56.451638 4954 generic.go:334] "Generic (PLEG): container finished" podID="0123682b-b80c-436f-bf07-6252dc3df9bc" containerID="4dd8cb7ee521604965cffb6715b2ec94f9b2a1336df00b7533b148e731686fb0" exitCode=0 Nov 27 16:59:56 crc kubenswrapper[4954]: I1127 16:59:56.451734 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-6vl85" event={"ID":"0123682b-b80c-436f-bf07-6252dc3df9bc","Type":"ContainerDied","Data":"4dd8cb7ee521604965cffb6715b2ec94f9b2a1336df00b7533b148e731686fb0"} Nov 27 16:59:57 crc kubenswrapper[4954]: I1127 16:59:57.916506 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 27 16:59:57 crc kubenswrapper[4954]: I1127 16:59:57.917040 4954 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 27 16:59:57 crc kubenswrapper[4954]: I1127 16:59:57.950661 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 27 16:59:59 crc kubenswrapper[4954]: I1127 16:59:59.815743 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ccc5c4795-mnxqd" Nov 27 16:59:59 crc kubenswrapper[4954]: I1127 16:59:59.962652 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-5zrkl"] Nov 27 16:59:59 crc kubenswrapper[4954]: I1127 16:59:59.962992 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57c957c4ff-5zrkl" podUID="49b7b3ea-3919-4d95-9fc8-138aef12ee08" containerName="dnsmasq-dns" containerID="cri-o://879f25bd4ee51940fc5b66b39fcc4ea7f2469e04728e79af3fff58c5b95a762c" gracePeriod=10 Nov 27 17:00:00 crc kubenswrapper[4954]: I1127 17:00:00.187708 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404380-tn87q"] Nov 27 17:00:00 crc kubenswrapper[4954]: I1127 17:00:00.188936 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404380-tn87q" Nov 27 17:00:00 crc kubenswrapper[4954]: I1127 17:00:00.201509 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 27 17:00:00 crc kubenswrapper[4954]: I1127 17:00:00.201776 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 27 17:00:00 crc kubenswrapper[4954]: I1127 17:00:00.241699 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404380-tn87q"] Nov 27 17:00:00 crc kubenswrapper[4954]: I1127 17:00:00.317010 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/35048694-881a-428c-b2c8-27e53edd4e5b-config-volume\") pod \"collect-profiles-29404380-tn87q\" (UID: \"35048694-881a-428c-b2c8-27e53edd4e5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404380-tn87q" Nov 27 17:00:00 crc kubenswrapper[4954]: I1127 17:00:00.317119 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/35048694-881a-428c-b2c8-27e53edd4e5b-secret-volume\") pod \"collect-profiles-29404380-tn87q\" (UID: \"35048694-881a-428c-b2c8-27e53edd4e5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404380-tn87q" Nov 27 17:00:00 crc kubenswrapper[4954]: I1127 17:00:00.317137 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k78lp\" (UniqueName: \"kubernetes.io/projected/35048694-881a-428c-b2c8-27e53edd4e5b-kube-api-access-k78lp\") pod \"collect-profiles-29404380-tn87q\" (UID: \"35048694-881a-428c-b2c8-27e53edd4e5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404380-tn87q" Nov 27 17:00:00 crc kubenswrapper[4954]: I1127 17:00:00.418653 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/35048694-881a-428c-b2c8-27e53edd4e5b-config-volume\") pod \"collect-profiles-29404380-tn87q\" (UID: \"35048694-881a-428c-b2c8-27e53edd4e5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404380-tn87q" Nov 27 17:00:00 crc kubenswrapper[4954]: I1127 17:00:00.418754 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/35048694-881a-428c-b2c8-27e53edd4e5b-secret-volume\") pod \"collect-profiles-29404380-tn87q\" (UID: \"35048694-881a-428c-b2c8-27e53edd4e5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404380-tn87q" Nov 27 17:00:00 crc kubenswrapper[4954]: I1127 17:00:00.418780 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k78lp\" (UniqueName: \"kubernetes.io/projected/35048694-881a-428c-b2c8-27e53edd4e5b-kube-api-access-k78lp\") pod \"collect-profiles-29404380-tn87q\" (UID: \"35048694-881a-428c-b2c8-27e53edd4e5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404380-tn87q" Nov 27 17:00:00 crc kubenswrapper[4954]: I1127 17:00:00.419901 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/35048694-881a-428c-b2c8-27e53edd4e5b-config-volume\") pod 
\"collect-profiles-29404380-tn87q\" (UID: \"35048694-881a-428c-b2c8-27e53edd4e5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404380-tn87q" Nov 27 17:00:00 crc kubenswrapper[4954]: I1127 17:00:00.440509 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/35048694-881a-428c-b2c8-27e53edd4e5b-secret-volume\") pod \"collect-profiles-29404380-tn87q\" (UID: \"35048694-881a-428c-b2c8-27e53edd4e5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404380-tn87q" Nov 27 17:00:00 crc kubenswrapper[4954]: I1127 17:00:00.471294 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k78lp\" (UniqueName: \"kubernetes.io/projected/35048694-881a-428c-b2c8-27e53edd4e5b-kube-api-access-k78lp\") pod \"collect-profiles-29404380-tn87q\" (UID: \"35048694-881a-428c-b2c8-27e53edd4e5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404380-tn87q" Nov 27 17:00:00 crc kubenswrapper[4954]: I1127 17:00:00.497736 4954 generic.go:334] "Generic (PLEG): container finished" podID="1bce3669-a584-4f00-8043-90be729c9fa7" containerID="901d635a5ed3dc985c4adf2144e2377826394d07154039a33c23b126755f620f" exitCode=0 Nov 27 17:00:00 crc kubenswrapper[4954]: I1127 17:00:00.497823 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-hwpt7" event={"ID":"1bce3669-a584-4f00-8043-90be729c9fa7","Type":"ContainerDied","Data":"901d635a5ed3dc985c4adf2144e2377826394d07154039a33c23b126755f620f"} Nov 27 17:00:00 crc kubenswrapper[4954]: I1127 17:00:00.507938 4954 generic.go:334] "Generic (PLEG): container finished" podID="49b7b3ea-3919-4d95-9fc8-138aef12ee08" containerID="879f25bd4ee51940fc5b66b39fcc4ea7f2469e04728e79af3fff58c5b95a762c" exitCode=0 Nov 27 17:00:00 crc kubenswrapper[4954]: I1127 17:00:00.507998 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-5zrkl" event={"ID":"49b7b3ea-3919-4d95-9fc8-138aef12ee08","Type":"ContainerDied","Data":"879f25bd4ee51940fc5b66b39fcc4ea7f2469e04728e79af3fff58c5b95a762c"} Nov 27 17:00:00 crc kubenswrapper[4954]: I1127 17:00:00.522503 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404380-tn87q" Nov 27 17:00:02 crc kubenswrapper[4954]: I1127 17:00:02.543783 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6549c6cdd4-szxmh" podUID="8a9e455d-383c-460b-897e-2234c0611a83" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.145:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.145:8443: connect: connection refused" Nov 27 17:00:02 crc kubenswrapper[4954]: I1127 17:00:02.635359 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-b5c6d8894-l7bzv" podUID="11ddebaa-610a-410a-a161-a5b89d87eb75" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Nov 27 17:00:03 crc kubenswrapper[4954]: I1127 17:00:03.078517 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-6vl85" Nov 27 17:00:03 crc kubenswrapper[4954]: I1127 17:00:03.176612 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0123682b-b80c-436f-bf07-6252dc3df9bc-combined-ca-bundle\") pod \"0123682b-b80c-436f-bf07-6252dc3df9bc\" (UID: \"0123682b-b80c-436f-bf07-6252dc3df9bc\") " Nov 27 17:00:03 crc kubenswrapper[4954]: I1127 17:00:03.176730 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0123682b-b80c-436f-bf07-6252dc3df9bc-scripts\") pod \"0123682b-b80c-436f-bf07-6252dc3df9bc\" (UID: \"0123682b-b80c-436f-bf07-6252dc3df9bc\") " Nov 27 17:00:03 crc kubenswrapper[4954]: I1127 17:00:03.176830 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0123682b-b80c-436f-bf07-6252dc3df9bc-config-data\") pod \"0123682b-b80c-436f-bf07-6252dc3df9bc\" (UID: \"0123682b-b80c-436f-bf07-6252dc3df9bc\") " Nov 27 17:00:03 crc kubenswrapper[4954]: I1127 17:00:03.176887 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0123682b-b80c-436f-bf07-6252dc3df9bc-logs\") pod \"0123682b-b80c-436f-bf07-6252dc3df9bc\" (UID: \"0123682b-b80c-436f-bf07-6252dc3df9bc\") " Nov 27 17:00:03 crc kubenswrapper[4954]: I1127 17:00:03.176918 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gmpt\" (UniqueName: \"kubernetes.io/projected/0123682b-b80c-436f-bf07-6252dc3df9bc-kube-api-access-5gmpt\") pod \"0123682b-b80c-436f-bf07-6252dc3df9bc\" (UID: \"0123682b-b80c-436f-bf07-6252dc3df9bc\") " Nov 27 17:00:03 crc kubenswrapper[4954]: I1127 17:00:03.178489 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0123682b-b80c-436f-bf07-6252dc3df9bc-logs" (OuterVolumeSpecName: "logs") pod "0123682b-b80c-436f-bf07-6252dc3df9bc" (UID: "0123682b-b80c-436f-bf07-6252dc3df9bc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:00:03 crc kubenswrapper[4954]: I1127 17:00:03.179620 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0123682b-b80c-436f-bf07-6252dc3df9bc-logs\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:03 crc kubenswrapper[4954]: I1127 17:00:03.184726 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0123682b-b80c-436f-bf07-6252dc3df9bc-scripts" (OuterVolumeSpecName: "scripts") pod "0123682b-b80c-436f-bf07-6252dc3df9bc" (UID: "0123682b-b80c-436f-bf07-6252dc3df9bc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:00:03 crc kubenswrapper[4954]: I1127 17:00:03.196813 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0123682b-b80c-436f-bf07-6252dc3df9bc-kube-api-access-5gmpt" (OuterVolumeSpecName: "kube-api-access-5gmpt") pod "0123682b-b80c-436f-bf07-6252dc3df9bc" (UID: "0123682b-b80c-436f-bf07-6252dc3df9bc"). InnerVolumeSpecName "kube-api-access-5gmpt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:00:03 crc kubenswrapper[4954]: I1127 17:00:03.221074 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0123682b-b80c-436f-bf07-6252dc3df9bc-config-data" (OuterVolumeSpecName: "config-data") pod "0123682b-b80c-436f-bf07-6252dc3df9bc" (UID: "0123682b-b80c-436f-bf07-6252dc3df9bc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:00:03 crc kubenswrapper[4954]: I1127 17:00:03.261473 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0123682b-b80c-436f-bf07-6252dc3df9bc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0123682b-b80c-436f-bf07-6252dc3df9bc" (UID: "0123682b-b80c-436f-bf07-6252dc3df9bc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:00:03 crc kubenswrapper[4954]: I1127 17:00:03.281473 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0123682b-b80c-436f-bf07-6252dc3df9bc-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:03 crc kubenswrapper[4954]: I1127 17:00:03.281532 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gmpt\" (UniqueName: \"kubernetes.io/projected/0123682b-b80c-436f-bf07-6252dc3df9bc-kube-api-access-5gmpt\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:03 crc kubenswrapper[4954]: I1127 17:00:03.281547 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0123682b-b80c-436f-bf07-6252dc3df9bc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:03 crc kubenswrapper[4954]: I1127 17:00:03.281560 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0123682b-b80c-436f-bf07-6252dc3df9bc-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:03 crc kubenswrapper[4954]: I1127 17:00:03.549431 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-6vl85" event={"ID":"0123682b-b80c-436f-bf07-6252dc3df9bc","Type":"ContainerDied","Data":"7d21d5643ed5890c6c848998060045b79a5d446d078556c7f8ca543a69003d1b"} Nov 27 17:00:03 crc kubenswrapper[4954]: I1127 17:00:03.549484 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d21d5643ed5890c6c848998060045b79a5d446d078556c7f8ca543a69003d1b" Nov 27 17:00:03 crc kubenswrapper[4954]: I1127 17:00:03.549450 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-6vl85" Nov 27 17:00:03 crc kubenswrapper[4954]: I1127 17:00:03.551726 4954 generic.go:334] "Generic (PLEG): container finished" podID="58c181b9-bc11-4747-84ad-5302f1265507" containerID="c7f389f6069feb0c78353dad9ae7b9a0245dfcd17c6f4f3ea3f1ab0fbba286e8" exitCode=0 Nov 27 17:00:03 crc kubenswrapper[4954]: I1127 17:00:03.551779 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-x4n64" event={"ID":"58c181b9-bc11-4747-84ad-5302f1265507","Type":"ContainerDied","Data":"c7f389f6069feb0c78353dad9ae7b9a0245dfcd17c6f4f3ea3f1ab0fbba286e8"} Nov 27 17:00:04 crc kubenswrapper[4954]: I1127 17:00:04.213920 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-d8d4694bd-z9zk4"] Nov 27 17:00:04 crc kubenswrapper[4954]: E1127 17:00:04.214621 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0123682b-b80c-436f-bf07-6252dc3df9bc" containerName="placement-db-sync" Nov 27 17:00:04 crc kubenswrapper[4954]: I1127 17:00:04.214638 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="0123682b-b80c-436f-bf07-6252dc3df9bc" containerName="placement-db-sync" Nov 27 17:00:04 crc kubenswrapper[4954]: I1127 17:00:04.214802 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="0123682b-b80c-436f-bf07-6252dc3df9bc" containerName="placement-db-sync" Nov 27 17:00:04 crc kubenswrapper[4954]: I1127 17:00:04.216422 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d8d4694bd-z9zk4" Nov 27 17:00:04 crc kubenswrapper[4954]: I1127 17:00:04.220270 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Nov 27 17:00:04 crc kubenswrapper[4954]: I1127 17:00:04.220521 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Nov 27 17:00:04 crc kubenswrapper[4954]: I1127 17:00:04.220672 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Nov 27 17:00:04 crc kubenswrapper[4954]: I1127 17:00:04.222638 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-nlnhq" Nov 27 17:00:04 crc kubenswrapper[4954]: I1127 17:00:04.229327 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Nov 27 17:00:04 crc kubenswrapper[4954]: I1127 17:00:04.233136 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d8d4694bd-z9zk4"] Nov 27 17:00:04 crc kubenswrapper[4954]: I1127 17:00:04.300740 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4617263-6b9f-4f0c-af69-9d589143eb12-scripts\") pod \"placement-d8d4694bd-z9zk4\" (UID: \"a4617263-6b9f-4f0c-af69-9d589143eb12\") " pod="openstack/placement-d8d4694bd-z9zk4" Nov 27 17:00:04 crc kubenswrapper[4954]: I1127 17:00:04.300829 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4617263-6b9f-4f0c-af69-9d589143eb12-internal-tls-certs\") pod \"placement-d8d4694bd-z9zk4\" (UID: \"a4617263-6b9f-4f0c-af69-9d589143eb12\") " pod="openstack/placement-d8d4694bd-z9zk4" Nov 27 17:00:04 crc kubenswrapper[4954]: I1127 17:00:04.300869 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/a4617263-6b9f-4f0c-af69-9d589143eb12-logs\") pod \"placement-d8d4694bd-z9zk4\" (UID: \"a4617263-6b9f-4f0c-af69-9d589143eb12\") " pod="openstack/placement-d8d4694bd-z9zk4" Nov 27 17:00:04 crc kubenswrapper[4954]: I1127 17:00:04.301026 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4617263-6b9f-4f0c-af69-9d589143eb12-public-tls-certs\") pod \"placement-d8d4694bd-z9zk4\" (UID: \"a4617263-6b9f-4f0c-af69-9d589143eb12\") " pod="openstack/placement-d8d4694bd-z9zk4" Nov 27 17:00:04 crc kubenswrapper[4954]: I1127 17:00:04.301076 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4617263-6b9f-4f0c-af69-9d589143eb12-config-data\") pod \"placement-d8d4694bd-z9zk4\" (UID: \"a4617263-6b9f-4f0c-af69-9d589143eb12\") " pod="openstack/placement-d8d4694bd-z9zk4" Nov 27 17:00:04 crc kubenswrapper[4954]: I1127 17:00:04.301140 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4617263-6b9f-4f0c-af69-9d589143eb12-combined-ca-bundle\") pod \"placement-d8d4694bd-z9zk4\" (UID: \"a4617263-6b9f-4f0c-af69-9d589143eb12\") " pod="openstack/placement-d8d4694bd-z9zk4" Nov 27 17:00:04 crc kubenswrapper[4954]: I1127 17:00:04.301170 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl8n5\" (UniqueName: \"kubernetes.io/projected/a4617263-6b9f-4f0c-af69-9d589143eb12-kube-api-access-xl8n5\") pod \"placement-d8d4694bd-z9zk4\" (UID: \"a4617263-6b9f-4f0c-af69-9d589143eb12\") " pod="openstack/placement-d8d4694bd-z9zk4" Nov 27 17:00:04 crc kubenswrapper[4954]: I1127 17:00:04.402923 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4617263-6b9f-4f0c-af69-9d589143eb12-logs\") pod \"placement-d8d4694bd-z9zk4\" (UID: \"a4617263-6b9f-4f0c-af69-9d589143eb12\") " pod="openstack/placement-d8d4694bd-z9zk4" Nov 27 17:00:04 crc kubenswrapper[4954]: I1127 17:00:04.403050 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4617263-6b9f-4f0c-af69-9d589143eb12-public-tls-certs\") pod \"placement-d8d4694bd-z9zk4\" (UID: \"a4617263-6b9f-4f0c-af69-9d589143eb12\") " pod="openstack/placement-d8d4694bd-z9zk4" Nov 27 17:00:04 crc kubenswrapper[4954]: I1127 17:00:04.403078 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4617263-6b9f-4f0c-af69-9d589143eb12-config-data\") pod \"placement-d8d4694bd-z9zk4\" (UID: \"a4617263-6b9f-4f0c-af69-9d589143eb12\") " pod="openstack/placement-d8d4694bd-z9zk4" Nov 27 17:00:04 crc kubenswrapper[4954]: I1127 17:00:04.403116 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4617263-6b9f-4f0c-af69-9d589143eb12-combined-ca-bundle\") pod \"placement-d8d4694bd-z9zk4\" (UID: \"a4617263-6b9f-4f0c-af69-9d589143eb12\") " pod="openstack/placement-d8d4694bd-z9zk4" Nov 27 17:00:04 crc kubenswrapper[4954]: I1127 17:00:04.403139 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl8n5\" (UniqueName: 
\"kubernetes.io/projected/a4617263-6b9f-4f0c-af69-9d589143eb12-kube-api-access-xl8n5\") pod \"placement-d8d4694bd-z9zk4\" (UID: \"a4617263-6b9f-4f0c-af69-9d589143eb12\") " pod="openstack/placement-d8d4694bd-z9zk4" Nov 27 17:00:04 crc kubenswrapper[4954]: I1127 17:00:04.403178 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4617263-6b9f-4f0c-af69-9d589143eb12-scripts\") pod \"placement-d8d4694bd-z9zk4\" (UID: \"a4617263-6b9f-4f0c-af69-9d589143eb12\") " pod="openstack/placement-d8d4694bd-z9zk4" Nov 27 17:00:04 crc kubenswrapper[4954]: I1127 17:00:04.403194 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4617263-6b9f-4f0c-af69-9d589143eb12-internal-tls-certs\") pod \"placement-d8d4694bd-z9zk4\" (UID: \"a4617263-6b9f-4f0c-af69-9d589143eb12\") " pod="openstack/placement-d8d4694bd-z9zk4" Nov 27 17:00:04 crc kubenswrapper[4954]: I1127 17:00:04.416655 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4617263-6b9f-4f0c-af69-9d589143eb12-logs\") pod \"placement-d8d4694bd-z9zk4\" (UID: \"a4617263-6b9f-4f0c-af69-9d589143eb12\") " pod="openstack/placement-d8d4694bd-z9zk4" Nov 27 17:00:04 crc kubenswrapper[4954]: I1127 17:00:04.424924 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4617263-6b9f-4f0c-af69-9d589143eb12-scripts\") pod \"placement-d8d4694bd-z9zk4\" (UID: \"a4617263-6b9f-4f0c-af69-9d589143eb12\") " pod="openstack/placement-d8d4694bd-z9zk4" Nov 27 17:00:04 crc kubenswrapper[4954]: I1127 17:00:04.434078 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl8n5\" (UniqueName: \"kubernetes.io/projected/a4617263-6b9f-4f0c-af69-9d589143eb12-kube-api-access-xl8n5\") pod \"placement-d8d4694bd-z9zk4\" (UID: \"a4617263-6b9f-4f0c-af69-9d589143eb12\") " pod="openstack/placement-d8d4694bd-z9zk4" Nov 27 17:00:04 crc kubenswrapper[4954]: I1127 17:00:04.434101 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4617263-6b9f-4f0c-af69-9d589143eb12-public-tls-certs\") pod \"placement-d8d4694bd-z9zk4\" (UID: \"a4617263-6b9f-4f0c-af69-9d589143eb12\") " pod="openstack/placement-d8d4694bd-z9zk4" Nov 27 17:00:04 crc kubenswrapper[4954]: I1127 17:00:04.434111 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4617263-6b9f-4f0c-af69-9d589143eb12-config-data\") pod \"placement-d8d4694bd-z9zk4\" (UID: \"a4617263-6b9f-4f0c-af69-9d589143eb12\") " pod="openstack/placement-d8d4694bd-z9zk4" Nov 27 17:00:04 crc kubenswrapper[4954]: I1127 17:00:04.434302 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4617263-6b9f-4f0c-af69-9d589143eb12-internal-tls-certs\") pod \"placement-d8d4694bd-z9zk4\" (UID: \"a4617263-6b9f-4f0c-af69-9d589143eb12\") " pod="openstack/placement-d8d4694bd-z9zk4" Nov 27 17:00:04 crc kubenswrapper[4954]: I1127 17:00:04.435020 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4617263-6b9f-4f0c-af69-9d589143eb12-combined-ca-bundle\") pod \"placement-d8d4694bd-z9zk4\" (UID: \"a4617263-6b9f-4f0c-af69-9d589143eb12\") " 
pod="openstack/placement-d8d4694bd-z9zk4" Nov 27 17:00:04 crc kubenswrapper[4954]: I1127 17:00:04.549923 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d8d4694bd-z9zk4" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.039222 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-x4n64" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.068003 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-5zrkl" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.092283 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-hwpt7" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.119459 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49b7b3ea-3919-4d95-9fc8-138aef12ee08-ovsdbserver-sb\") pod \"49b7b3ea-3919-4d95-9fc8-138aef12ee08\" (UID: \"49b7b3ea-3919-4d95-9fc8-138aef12ee08\") " Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.119519 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/58c181b9-bc11-4747-84ad-5302f1265507-db-sync-config-data\") pod \"58c181b9-bc11-4747-84ad-5302f1265507\" (UID: \"58c181b9-bc11-4747-84ad-5302f1265507\") " Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.119541 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/49b7b3ea-3919-4d95-9fc8-138aef12ee08-dns-swift-storage-0\") pod \"49b7b3ea-3919-4d95-9fc8-138aef12ee08\" (UID: \"49b7b3ea-3919-4d95-9fc8-138aef12ee08\") " Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.119568 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49b7b3ea-3919-4d95-9fc8-138aef12ee08-config\") pod \"49b7b3ea-3919-4d95-9fc8-138aef12ee08\" (UID: \"49b7b3ea-3919-4d95-9fc8-138aef12ee08\") " Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.119601 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58c181b9-bc11-4747-84ad-5302f1265507-config-data\") pod \"58c181b9-bc11-4747-84ad-5302f1265507\" (UID: \"58c181b9-bc11-4747-84ad-5302f1265507\") " Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.119623 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/58c181b9-bc11-4747-84ad-5302f1265507-etc-machine-id\") pod \"58c181b9-bc11-4747-84ad-5302f1265507\" (UID: \"58c181b9-bc11-4747-84ad-5302f1265507\") " Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.119644 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49b7b3ea-3919-4d95-9fc8-138aef12ee08-dns-svc\") pod \"49b7b3ea-3919-4d95-9fc8-138aef12ee08\" (UID: \"49b7b3ea-3919-4d95-9fc8-138aef12ee08\") " Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.119688 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49b7b3ea-3919-4d95-9fc8-138aef12ee08-ovsdbserver-nb\") pod \"49b7b3ea-3919-4d95-9fc8-138aef12ee08\" (UID: 
\"49b7b3ea-3919-4d95-9fc8-138aef12ee08\") " Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.119712 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t62km\" (UniqueName: \"kubernetes.io/projected/58c181b9-bc11-4747-84ad-5302f1265507-kube-api-access-t62km\") pod \"58c181b9-bc11-4747-84ad-5302f1265507\" (UID: \"58c181b9-bc11-4747-84ad-5302f1265507\") " Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.119799 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58c181b9-bc11-4747-84ad-5302f1265507-combined-ca-bundle\") pod \"58c181b9-bc11-4747-84ad-5302f1265507\" (UID: \"58c181b9-bc11-4747-84ad-5302f1265507\") " Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.119825 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58c181b9-bc11-4747-84ad-5302f1265507-scripts\") pod \"58c181b9-bc11-4747-84ad-5302f1265507\" (UID: \"58c181b9-bc11-4747-84ad-5302f1265507\") " Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.119862 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7f9pp\" (UniqueName: \"kubernetes.io/projected/49b7b3ea-3919-4d95-9fc8-138aef12ee08-kube-api-access-7f9pp\") pod \"49b7b3ea-3919-4d95-9fc8-138aef12ee08\" (UID: \"49b7b3ea-3919-4d95-9fc8-138aef12ee08\") " Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.124735 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/58c181b9-bc11-4747-84ad-5302f1265507-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "58c181b9-bc11-4747-84ad-5302f1265507" (UID: "58c181b9-bc11-4747-84ad-5302f1265507"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.133672 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49b7b3ea-3919-4d95-9fc8-138aef12ee08-kube-api-access-7f9pp" (OuterVolumeSpecName: "kube-api-access-7f9pp") pod "49b7b3ea-3919-4d95-9fc8-138aef12ee08" (UID: "49b7b3ea-3919-4d95-9fc8-138aef12ee08"). InnerVolumeSpecName "kube-api-access-7f9pp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.134879 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58c181b9-bc11-4747-84ad-5302f1265507-kube-api-access-t62km" (OuterVolumeSpecName: "kube-api-access-t62km") pod "58c181b9-bc11-4747-84ad-5302f1265507" (UID: "58c181b9-bc11-4747-84ad-5302f1265507"). InnerVolumeSpecName "kube-api-access-t62km". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.135153 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58c181b9-bc11-4747-84ad-5302f1265507-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "58c181b9-bc11-4747-84ad-5302f1265507" (UID: "58c181b9-bc11-4747-84ad-5302f1265507"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.141805 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58c181b9-bc11-4747-84ad-5302f1265507-scripts" (OuterVolumeSpecName: "scripts") pod "58c181b9-bc11-4747-84ad-5302f1265507" (UID: "58c181b9-bc11-4747-84ad-5302f1265507"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.211416 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58c181b9-bc11-4747-84ad-5302f1265507-config-data" (OuterVolumeSpecName: "config-data") pod "58c181b9-bc11-4747-84ad-5302f1265507" (UID: "58c181b9-bc11-4747-84ad-5302f1265507"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.221843 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1bce3669-a584-4f00-8043-90be729c9fa7-db-sync-config-data\") pod \"1bce3669-a584-4f00-8043-90be729c9fa7\" (UID: \"1bce3669-a584-4f00-8043-90be729c9fa7\") " Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.221945 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bce3669-a584-4f00-8043-90be729c9fa7-combined-ca-bundle\") pod \"1bce3669-a584-4f00-8043-90be729c9fa7\" (UID: \"1bce3669-a584-4f00-8043-90be729c9fa7\") " Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.222136 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnljs\" (UniqueName: \"kubernetes.io/projected/1bce3669-a584-4f00-8043-90be729c9fa7-kube-api-access-lnljs\") pod \"1bce3669-a584-4f00-8043-90be729c9fa7\" (UID: \"1bce3669-a584-4f00-8043-90be729c9fa7\") " Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.222596 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49b7b3ea-3919-4d95-9fc8-138aef12ee08-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "49b7b3ea-3919-4d95-9fc8-138aef12ee08" (UID: "49b7b3ea-3919-4d95-9fc8-138aef12ee08"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.222639 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58c181b9-bc11-4747-84ad-5302f1265507-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.223185 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7f9pp\" (UniqueName: \"kubernetes.io/projected/49b7b3ea-3919-4d95-9fc8-138aef12ee08-kube-api-access-7f9pp\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.223271 4954 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/58c181b9-bc11-4747-84ad-5302f1265507-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.223340 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58c181b9-bc11-4747-84ad-5302f1265507-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.223453 4954 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/58c181b9-bc11-4747-84ad-5302f1265507-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.223523 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t62km\" (UniqueName: \"kubernetes.io/projected/58c181b9-bc11-4747-84ad-5302f1265507-kube-api-access-t62km\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.226766 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bce3669-a584-4f00-8043-90be729c9fa7-kube-api-access-lnljs" (OuterVolumeSpecName: "kube-api-access-lnljs") pod "1bce3669-a584-4f00-8043-90be729c9fa7" (UID: "1bce3669-a584-4f00-8043-90be729c9fa7"). InnerVolumeSpecName "kube-api-access-lnljs". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.228763 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bce3669-a584-4f00-8043-90be729c9fa7-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "1bce3669-a584-4f00-8043-90be729c9fa7" (UID: "1bce3669-a584-4f00-8043-90be729c9fa7"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.235206 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49b7b3ea-3919-4d95-9fc8-138aef12ee08-config" (OuterVolumeSpecName: "config") pod "49b7b3ea-3919-4d95-9fc8-138aef12ee08" (UID: "49b7b3ea-3919-4d95-9fc8-138aef12ee08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.237232 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58c181b9-bc11-4747-84ad-5302f1265507-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58c181b9-bc11-4747-84ad-5302f1265507" (UID: "58c181b9-bc11-4747-84ad-5302f1265507"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.250197 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49b7b3ea-3919-4d95-9fc8-138aef12ee08-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "49b7b3ea-3919-4d95-9fc8-138aef12ee08" (UID: "49b7b3ea-3919-4d95-9fc8-138aef12ee08"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.256665 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49b7b3ea-3919-4d95-9fc8-138aef12ee08-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "49b7b3ea-3919-4d95-9fc8-138aef12ee08" (UID: "49b7b3ea-3919-4d95-9fc8-138aef12ee08"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.263063 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bce3669-a584-4f00-8043-90be729c9fa7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1bce3669-a584-4f00-8043-90be729c9fa7" (UID: "1bce3669-a584-4f00-8043-90be729c9fa7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.277172 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49b7b3ea-3919-4d95-9fc8-138aef12ee08-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "49b7b3ea-3919-4d95-9fc8-138aef12ee08" (UID: "49b7b3ea-3919-4d95-9fc8-138aef12ee08"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.325861 4954 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/49b7b3ea-3919-4d95-9fc8-138aef12ee08-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.325890 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49b7b3ea-3919-4d95-9fc8-138aef12ee08-config\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.325900 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49b7b3ea-3919-4d95-9fc8-138aef12ee08-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.325909 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnljs\" (UniqueName: \"kubernetes.io/projected/1bce3669-a584-4f00-8043-90be729c9fa7-kube-api-access-lnljs\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.325917 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49b7b3ea-3919-4d95-9fc8-138aef12ee08-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.325926 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58c181b9-bc11-4747-84ad-5302f1265507-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.325934 4954 reconciler_common.go:293] "Volume detached for volume 
\"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1bce3669-a584-4f00-8043-90be729c9fa7-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.325942 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bce3669-a584-4f00-8043-90be729c9fa7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.325950 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49b7b3ea-3919-4d95-9fc8-138aef12ee08-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.575345 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-5zrkl" event={"ID":"49b7b3ea-3919-4d95-9fc8-138aef12ee08","Type":"ContainerDied","Data":"6dd53b7bf588b71cbbdb00b24280019d871d3f6087230ce5f1af0a1572857d0e"} Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.575409 4954 scope.go:117] "RemoveContainer" containerID="879f25bd4ee51940fc5b66b39fcc4ea7f2469e04728e79af3fff58c5b95a762c" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.575935 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-5zrkl" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.580493 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-x4n64" event={"ID":"58c181b9-bc11-4747-84ad-5302f1265507","Type":"ContainerDied","Data":"2131a8bd011c3498f4cef8b62b48ce44f4d909ffb59489c654c82c7153ea56a7"} Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.580533 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2131a8bd011c3498f4cef8b62b48ce44f4d909ffb59489c654c82c7153ea56a7" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.580509 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-x4n64" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.589857 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-hwpt7" event={"ID":"1bce3669-a584-4f00-8043-90be729c9fa7","Type":"ContainerDied","Data":"0817d5cad9fbde486b69c154cc6a3499182ec885314c7f82f9a0854246a51daf"} Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.589913 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0817d5cad9fbde486b69c154cc6a3499182ec885314c7f82f9a0854246a51daf" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.589942 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-hwpt7" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.638483 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-5zrkl"] Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.673408 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-5zrkl"] Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.818328 4954 scope.go:117] "RemoveContainer" containerID="1a5f37be12affa844bec7c198093f616df1ebd9cfe7acaf1a6c1b5c5e1f6f4b2" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.872725 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Nov 27 17:00:05 crc kubenswrapper[4954]: E1127 17:00:05.887033 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49b7b3ea-3919-4d95-9fc8-138aef12ee08" containerName="init" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.887071 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="49b7b3ea-3919-4d95-9fc8-138aef12ee08" containerName="init" Nov 27 17:00:05 crc kubenswrapper[4954]: E1127 17:00:05.887090 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58c181b9-bc11-4747-84ad-5302f1265507" containerName="cinder-db-sync" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.887109 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="58c181b9-bc11-4747-84ad-5302f1265507" containerName="cinder-db-sync" Nov 27 17:00:05 crc kubenswrapper[4954]: E1127 17:00:05.887134 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49b7b3ea-3919-4d95-9fc8-138aef12ee08" containerName="dnsmasq-dns" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.887141 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="49b7b3ea-3919-4d95-9fc8-138aef12ee08" containerName="dnsmasq-dns" Nov 27 17:00:05 crc kubenswrapper[4954]: E1127 17:00:05.887155 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bce3669-a584-4f00-8043-90be729c9fa7" containerName="barbican-db-sync" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.887161 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bce3669-a584-4f00-8043-90be729c9fa7" containerName="barbican-db-sync" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.887344 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="58c181b9-bc11-4747-84ad-5302f1265507" containerName="cinder-db-sync" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.887363 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="49b7b3ea-3919-4d95-9fc8-138aef12ee08" containerName="dnsmasq-dns" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.887374 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bce3669-a584-4f00-8043-90be729c9fa7" containerName="barbican-db-sync" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.889746 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.891854 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.898236 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.898673 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.898793 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-nt8xt" Nov 27 17:00:05 crc kubenswrapper[4954]: I1127 17:00:05.898935 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.019833 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8775748c9-fwtgk"] Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.021629 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8775748c9-fwtgk" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.053048 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1263a5b8-5b99-4f6b-9a43-72532ef791da-scripts\") pod \"cinder-scheduler-0\" (UID: \"1263a5b8-5b99-4f6b-9a43-72532ef791da\") " pod="openstack/cinder-scheduler-0" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.053173 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkgfj\" (UniqueName: \"kubernetes.io/projected/1263a5b8-5b99-4f6b-9a43-72532ef791da-kube-api-access-rkgfj\") pod \"cinder-scheduler-0\" (UID: \"1263a5b8-5b99-4f6b-9a43-72532ef791da\") " pod="openstack/cinder-scheduler-0" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.053220 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1263a5b8-5b99-4f6b-9a43-72532ef791da-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1263a5b8-5b99-4f6b-9a43-72532ef791da\") " pod="openstack/cinder-scheduler-0" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.053258 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1263a5b8-5b99-4f6b-9a43-72532ef791da-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1263a5b8-5b99-4f6b-9a43-72532ef791da\") " pod="openstack/cinder-scheduler-0" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.053304 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1263a5b8-5b99-4f6b-9a43-72532ef791da-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1263a5b8-5b99-4f6b-9a43-72532ef791da\") " pod="openstack/cinder-scheduler-0" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.053329 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1263a5b8-5b99-4f6b-9a43-72532ef791da-config-data\") pod \"cinder-scheduler-0\" (UID: \"1263a5b8-5b99-4f6b-9a43-72532ef791da\") " pod="openstack/cinder-scheduler-0" 
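
Note on the pod_startup_latency_tracker entries at 16:59:53 and 16:59:54 above: they follow a fixed arithmetic that the flattened key=value output obscures. In these entries, podStartE2EDuration equals watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration subtracts from that the image-pull window, lastFinishedPulling minus firstStartedPulling. Pods whose pull timestamps are the zero value 0001-01-01 (neutron-5657c85556-sq27w, dnsmasq-dns-5ccc5c4795-mnxqd, keystone-68775c76df-2ppbs, neutron-f6cfb75df-7gbdb) therefore report identical SLO and E2E figures. The following standalone Go sketch — not kubelet code, variable names are ours — re-derives both numbers from values copied verbatim out of the openstack/placement-db-sync-6vl85 entry:

package main

import (
	"fmt"
	"time"
)

func main() {
	// kubelet prints these timestamps in Go's default time.Time.String()
	// form; time.Parse also accepts the optional fractional seconds.
	const layout = "2006-01-02 15:04:05 -0700 MST"
	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}

	created := parse("2025-11-27 16:59:03 +0000 UTC")             // podCreationTimestamp
	firstPull := parse("2025-11-27 16:59:04.778407971 +0000 UTC") // firstStartedPulling
	lastPull := parse("2025-11-27 16:59:52.532965965 +0000 UTC")  // lastFinishedPulling
	observed := parse("2025-11-27 16:59:53.438676237 +0000 UTC")  // watchObservedRunningTime

	e2e := observed.Sub(created)         // total creation-to-observed-running time
	slo := e2e - lastPull.Sub(firstPull) // the same, excluding time spent pulling images

	fmt.Println(e2e) // 50.438676237s, matching podStartE2EDuration="50.438676237s"
	fmt.Println(slo) // 2.684118243s, matching podStartSLOduration=2.684118243
}

The same subtraction reproduces barbican-db-sync-hwpt7 from its 16:59:54 entry (52.488006345s E2E, 3.013285454s SLO), which is why the long-running db-sync pods show small SLO durations despite ~50 s wall-clock startup: nearly all of it was image pulling.
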
Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.067289 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8775748c9-fwtgk"] Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.143724 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.146224 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.149199 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.154279 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.155719 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1263a5b8-5b99-4f6b-9a43-72532ef791da-scripts\") pod \"cinder-scheduler-0\" (UID: \"1263a5b8-5b99-4f6b-9a43-72532ef791da\") " pod="openstack/cinder-scheduler-0" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.155805 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75073c0f-2879-417d-a1a0-9721b37111cb-dns-svc\") pod \"dnsmasq-dns-8775748c9-fwtgk\" (UID: \"75073c0f-2879-417d-a1a0-9721b37111cb\") " pod="openstack/dnsmasq-dns-8775748c9-fwtgk" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.155853 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/75073c0f-2879-417d-a1a0-9721b37111cb-dns-swift-storage-0\") pod \"dnsmasq-dns-8775748c9-fwtgk\" (UID: \"75073c0f-2879-417d-a1a0-9721b37111cb\") " pod="openstack/dnsmasq-dns-8775748c9-fwtgk" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.155872 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75073c0f-2879-417d-a1a0-9721b37111cb-ovsdbserver-nb\") pod \"dnsmasq-dns-8775748c9-fwtgk\" (UID: \"75073c0f-2879-417d-a1a0-9721b37111cb\") " pod="openstack/dnsmasq-dns-8775748c9-fwtgk" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.155910 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkgfj\" (UniqueName: \"kubernetes.io/projected/1263a5b8-5b99-4f6b-9a43-72532ef791da-kube-api-access-rkgfj\") pod \"cinder-scheduler-0\" (UID: \"1263a5b8-5b99-4f6b-9a43-72532ef791da\") " pod="openstack/cinder-scheduler-0" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.155937 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1263a5b8-5b99-4f6b-9a43-72532ef791da-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1263a5b8-5b99-4f6b-9a43-72532ef791da\") " pod="openstack/cinder-scheduler-0" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.155974 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1263a5b8-5b99-4f6b-9a43-72532ef791da-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1263a5b8-5b99-4f6b-9a43-72532ef791da\") " pod="openstack/cinder-scheduler-0" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.156007 4954 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtcb6\" (UniqueName: \"kubernetes.io/projected/75073c0f-2879-417d-a1a0-9721b37111cb-kube-api-access-vtcb6\") pod \"dnsmasq-dns-8775748c9-fwtgk\" (UID: \"75073c0f-2879-417d-a1a0-9721b37111cb\") " pod="openstack/dnsmasq-dns-8775748c9-fwtgk" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.157251 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1263a5b8-5b99-4f6b-9a43-72532ef791da-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1263a5b8-5b99-4f6b-9a43-72532ef791da\") " pod="openstack/cinder-scheduler-0" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.157414 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1263a5b8-5b99-4f6b-9a43-72532ef791da-config-data\") pod \"cinder-scheduler-0\" (UID: \"1263a5b8-5b99-4f6b-9a43-72532ef791da\") " pod="openstack/cinder-scheduler-0" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.157442 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75073c0f-2879-417d-a1a0-9721b37111cb-ovsdbserver-sb\") pod \"dnsmasq-dns-8775748c9-fwtgk\" (UID: \"75073c0f-2879-417d-a1a0-9721b37111cb\") " pod="openstack/dnsmasq-dns-8775748c9-fwtgk" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.157709 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1263a5b8-5b99-4f6b-9a43-72532ef791da-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1263a5b8-5b99-4f6b-9a43-72532ef791da\") " pod="openstack/cinder-scheduler-0" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.164692 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75073c0f-2879-417d-a1a0-9721b37111cb-config\") pod \"dnsmasq-dns-8775748c9-fwtgk\" (UID: \"75073c0f-2879-417d-a1a0-9721b37111cb\") " pod="openstack/dnsmasq-dns-8775748c9-fwtgk" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.165062 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1263a5b8-5b99-4f6b-9a43-72532ef791da-config-data\") pod \"cinder-scheduler-0\" (UID: \"1263a5b8-5b99-4f6b-9a43-72532ef791da\") " pod="openstack/cinder-scheduler-0" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.173438 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1263a5b8-5b99-4f6b-9a43-72532ef791da-scripts\") pod \"cinder-scheduler-0\" (UID: \"1263a5b8-5b99-4f6b-9a43-72532ef791da\") " pod="openstack/cinder-scheduler-0" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.183193 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkgfj\" (UniqueName: \"kubernetes.io/projected/1263a5b8-5b99-4f6b-9a43-72532ef791da-kube-api-access-rkgfj\") pod \"cinder-scheduler-0\" (UID: \"1263a5b8-5b99-4f6b-9a43-72532ef791da\") " pod="openstack/cinder-scheduler-0" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.185108 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1263a5b8-5b99-4f6b-9a43-72532ef791da-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1263a5b8-5b99-4f6b-9a43-72532ef791da\") " pod="openstack/cinder-scheduler-0" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.215534 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1263a5b8-5b99-4f6b-9a43-72532ef791da-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1263a5b8-5b99-4f6b-9a43-72532ef791da\") " pod="openstack/cinder-scheduler-0" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.266567 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/528d738e-43f9-4b32-be5a-b557c9d94d63-scripts\") pod \"cinder-api-0\" (UID: \"528d738e-43f9-4b32-be5a-b557c9d94d63\") " pod="openstack/cinder-api-0" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.266806 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/528d738e-43f9-4b32-be5a-b557c9d94d63-config-data-custom\") pod \"cinder-api-0\" (UID: \"528d738e-43f9-4b32-be5a-b557c9d94d63\") " pod="openstack/cinder-api-0" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.266910 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtcb6\" (UniqueName: \"kubernetes.io/projected/75073c0f-2879-417d-a1a0-9721b37111cb-kube-api-access-vtcb6\") pod \"dnsmasq-dns-8775748c9-fwtgk\" (UID: \"75073c0f-2879-417d-a1a0-9721b37111cb\") " pod="openstack/dnsmasq-dns-8775748c9-fwtgk" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.266935 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/528d738e-43f9-4b32-be5a-b557c9d94d63-logs\") pod \"cinder-api-0\" (UID: \"528d738e-43f9-4b32-be5a-b557c9d94d63\") " pod="openstack/cinder-api-0" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.267000 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/528d738e-43f9-4b32-be5a-b557c9d94d63-config-data\") pod \"cinder-api-0\" (UID: \"528d738e-43f9-4b32-be5a-b557c9d94d63\") " pod="openstack/cinder-api-0" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.267030 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/528d738e-43f9-4b32-be5a-b557c9d94d63-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"528d738e-43f9-4b32-be5a-b557c9d94d63\") " pod="openstack/cinder-api-0" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.267081 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75073c0f-2879-417d-a1a0-9721b37111cb-ovsdbserver-sb\") pod \"dnsmasq-dns-8775748c9-fwtgk\" (UID: \"75073c0f-2879-417d-a1a0-9721b37111cb\") " pod="openstack/dnsmasq-dns-8775748c9-fwtgk" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.267121 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/528d738e-43f9-4b32-be5a-b557c9d94d63-etc-machine-id\") pod \"cinder-api-0\" (UID: \"528d738e-43f9-4b32-be5a-b557c9d94d63\") " 
pod="openstack/cinder-api-0" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.267187 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75073c0f-2879-417d-a1a0-9721b37111cb-config\") pod \"dnsmasq-dns-8775748c9-fwtgk\" (UID: \"75073c0f-2879-417d-a1a0-9721b37111cb\") " pod="openstack/dnsmasq-dns-8775748c9-fwtgk" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.267288 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlzqm\" (UniqueName: \"kubernetes.io/projected/528d738e-43f9-4b32-be5a-b557c9d94d63-kube-api-access-qlzqm\") pod \"cinder-api-0\" (UID: \"528d738e-43f9-4b32-be5a-b557c9d94d63\") " pod="openstack/cinder-api-0" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.267384 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75073c0f-2879-417d-a1a0-9721b37111cb-dns-svc\") pod \"dnsmasq-dns-8775748c9-fwtgk\" (UID: \"75073c0f-2879-417d-a1a0-9721b37111cb\") " pod="openstack/dnsmasq-dns-8775748c9-fwtgk" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.267428 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/75073c0f-2879-417d-a1a0-9721b37111cb-dns-swift-storage-0\") pod \"dnsmasq-dns-8775748c9-fwtgk\" (UID: \"75073c0f-2879-417d-a1a0-9721b37111cb\") " pod="openstack/dnsmasq-dns-8775748c9-fwtgk" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.267442 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75073c0f-2879-417d-a1a0-9721b37111cb-ovsdbserver-nb\") pod \"dnsmasq-dns-8775748c9-fwtgk\" (UID: \"75073c0f-2879-417d-a1a0-9721b37111cb\") " pod="openstack/dnsmasq-dns-8775748c9-fwtgk" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.268310 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75073c0f-2879-417d-a1a0-9721b37111cb-ovsdbserver-nb\") pod \"dnsmasq-dns-8775748c9-fwtgk\" (UID: \"75073c0f-2879-417d-a1a0-9721b37111cb\") " pod="openstack/dnsmasq-dns-8775748c9-fwtgk" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.269138 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75073c0f-2879-417d-a1a0-9721b37111cb-ovsdbserver-sb\") pod \"dnsmasq-dns-8775748c9-fwtgk\" (UID: \"75073c0f-2879-417d-a1a0-9721b37111cb\") " pod="openstack/dnsmasq-dns-8775748c9-fwtgk" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.269638 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75073c0f-2879-417d-a1a0-9721b37111cb-config\") pod \"dnsmasq-dns-8775748c9-fwtgk\" (UID: \"75073c0f-2879-417d-a1a0-9721b37111cb\") " pod="openstack/dnsmasq-dns-8775748c9-fwtgk" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.270154 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75073c0f-2879-417d-a1a0-9721b37111cb-dns-svc\") pod \"dnsmasq-dns-8775748c9-fwtgk\" (UID: \"75073c0f-2879-417d-a1a0-9721b37111cb\") " pod="openstack/dnsmasq-dns-8775748c9-fwtgk" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.270652 4954 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/75073c0f-2879-417d-a1a0-9721b37111cb-dns-swift-storage-0\") pod \"dnsmasq-dns-8775748c9-fwtgk\" (UID: \"75073c0f-2879-417d-a1a0-9721b37111cb\") " pod="openstack/dnsmasq-dns-8775748c9-fwtgk" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.299023 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtcb6\" (UniqueName: \"kubernetes.io/projected/75073c0f-2879-417d-a1a0-9721b37111cb-kube-api-access-vtcb6\") pod \"dnsmasq-dns-8775748c9-fwtgk\" (UID: \"75073c0f-2879-417d-a1a0-9721b37111cb\") " pod="openstack/dnsmasq-dns-8775748c9-fwtgk" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.338042 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.354897 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7bd6cd4c89-x6dht"] Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.357846 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7bd6cd4c89-x6dht" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.368570 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.369023 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-5ng4f" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.370083 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.376772 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlzqm\" (UniqueName: \"kubernetes.io/projected/528d738e-43f9-4b32-be5a-b557c9d94d63-kube-api-access-qlzqm\") pod \"cinder-api-0\" (UID: \"528d738e-43f9-4b32-be5a-b557c9d94d63\") " pod="openstack/cinder-api-0" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.376863 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/528d738e-43f9-4b32-be5a-b557c9d94d63-scripts\") pod \"cinder-api-0\" (UID: \"528d738e-43f9-4b32-be5a-b557c9d94d63\") " pod="openstack/cinder-api-0" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.376909 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/528d738e-43f9-4b32-be5a-b557c9d94d63-config-data-custom\") pod \"cinder-api-0\" (UID: \"528d738e-43f9-4b32-be5a-b557c9d94d63\") " pod="openstack/cinder-api-0" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.376932 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/528d738e-43f9-4b32-be5a-b557c9d94d63-logs\") pod \"cinder-api-0\" (UID: \"528d738e-43f9-4b32-be5a-b557c9d94d63\") " pod="openstack/cinder-api-0" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.376956 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/528d738e-43f9-4b32-be5a-b557c9d94d63-config-data\") pod \"cinder-api-0\" (UID: \"528d738e-43f9-4b32-be5a-b557c9d94d63\") " pod="openstack/cinder-api-0" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.376978 4954 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/528d738e-43f9-4b32-be5a-b557c9d94d63-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"528d738e-43f9-4b32-be5a-b557c9d94d63\") " pod="openstack/cinder-api-0" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.377005 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/528d738e-43f9-4b32-be5a-b557c9d94d63-etc-machine-id\") pod \"cinder-api-0\" (UID: \"528d738e-43f9-4b32-be5a-b557c9d94d63\") " pod="openstack/cinder-api-0" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.377097 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/528d738e-43f9-4b32-be5a-b557c9d94d63-etc-machine-id\") pod \"cinder-api-0\" (UID: \"528d738e-43f9-4b32-be5a-b557c9d94d63\") " pod="openstack/cinder-api-0" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.378343 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8775748c9-fwtgk" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.379383 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/528d738e-43f9-4b32-be5a-b557c9d94d63-logs\") pod \"cinder-api-0\" (UID: \"528d738e-43f9-4b32-be5a-b557c9d94d63\") " pod="openstack/cinder-api-0" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.415877 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-798f5f6896-mswxw"] Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.417657 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-798f5f6896-mswxw" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.423915 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.428280 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/528d738e-43f9-4b32-be5a-b557c9d94d63-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"528d738e-43f9-4b32-be5a-b557c9d94d63\") " pod="openstack/cinder-api-0" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.440427 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7bd6cd4c89-x6dht"] Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.452334 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/528d738e-43f9-4b32-be5a-b557c9d94d63-config-data-custom\") pod \"cinder-api-0\" (UID: \"528d738e-43f9-4b32-be5a-b557c9d94d63\") " pod="openstack/cinder-api-0" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.452609 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/528d738e-43f9-4b32-be5a-b557c9d94d63-config-data\") pod \"cinder-api-0\" (UID: \"528d738e-43f9-4b32-be5a-b557c9d94d63\") " pod="openstack/cinder-api-0" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.452705 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/528d738e-43f9-4b32-be5a-b557c9d94d63-scripts\") pod \"cinder-api-0\" (UID: \"528d738e-43f9-4b32-be5a-b557c9d94d63\") " pod="openstack/cinder-api-0" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.457271 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlzqm\" (UniqueName: \"kubernetes.io/projected/528d738e-43f9-4b32-be5a-b557c9d94d63-kube-api-access-qlzqm\") pod \"cinder-api-0\" (UID: \"528d738e-43f9-4b32-be5a-b557c9d94d63\") " pod="openstack/cinder-api-0" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.472367 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-798f5f6896-mswxw"] Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.481710 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztxds\" (UniqueName: \"kubernetes.io/projected/dc83f9b6-fbea-4463-8127-08590404f021-kube-api-access-ztxds\") pod \"barbican-worker-7bd6cd4c89-x6dht\" (UID: \"dc83f9b6-fbea-4463-8127-08590404f021\") " pod="openstack/barbican-worker-7bd6cd4c89-x6dht" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.481786 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc83f9b6-fbea-4463-8127-08590404f021-combined-ca-bundle\") pod \"barbican-worker-7bd6cd4c89-x6dht\" (UID: \"dc83f9b6-fbea-4463-8127-08590404f021\") " pod="openstack/barbican-worker-7bd6cd4c89-x6dht" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.481812 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc83f9b6-fbea-4463-8127-08590404f021-config-data-custom\") pod \"barbican-worker-7bd6cd4c89-x6dht\" (UID: 
\"dc83f9b6-fbea-4463-8127-08590404f021\") " pod="openstack/barbican-worker-7bd6cd4c89-x6dht" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.481881 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc83f9b6-fbea-4463-8127-08590404f021-config-data\") pod \"barbican-worker-7bd6cd4c89-x6dht\" (UID: \"dc83f9b6-fbea-4463-8127-08590404f021\") " pod="openstack/barbican-worker-7bd6cd4c89-x6dht" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.481949 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc83f9b6-fbea-4463-8127-08590404f021-logs\") pod \"barbican-worker-7bd6cd4c89-x6dht\" (UID: \"dc83f9b6-fbea-4463-8127-08590404f021\") " pod="openstack/barbican-worker-7bd6cd4c89-x6dht" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.507059 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.521174 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8775748c9-fwtgk"] Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.586702 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xj8j\" (UniqueName: \"kubernetes.io/projected/e09487f3-5539-4df4-8b9b-6da0b0b741de-kube-api-access-8xj8j\") pod \"barbican-keystone-listener-798f5f6896-mswxw\" (UID: \"e09487f3-5539-4df4-8b9b-6da0b0b741de\") " pod="openstack/barbican-keystone-listener-798f5f6896-mswxw" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.586790 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztxds\" (UniqueName: \"kubernetes.io/projected/dc83f9b6-fbea-4463-8127-08590404f021-kube-api-access-ztxds\") pod \"barbican-worker-7bd6cd4c89-x6dht\" (UID: \"dc83f9b6-fbea-4463-8127-08590404f021\") " pod="openstack/barbican-worker-7bd6cd4c89-x6dht" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.586816 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e09487f3-5539-4df4-8b9b-6da0b0b741de-logs\") pod \"barbican-keystone-listener-798f5f6896-mswxw\" (UID: \"e09487f3-5539-4df4-8b9b-6da0b0b741de\") " pod="openstack/barbican-keystone-listener-798f5f6896-mswxw" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.586860 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc83f9b6-fbea-4463-8127-08590404f021-combined-ca-bundle\") pod \"barbican-worker-7bd6cd4c89-x6dht\" (UID: \"dc83f9b6-fbea-4463-8127-08590404f021\") " pod="openstack/barbican-worker-7bd6cd4c89-x6dht" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.586887 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc83f9b6-fbea-4463-8127-08590404f021-config-data-custom\") pod \"barbican-worker-7bd6cd4c89-x6dht\" (UID: \"dc83f9b6-fbea-4463-8127-08590404f021\") " pod="openstack/barbican-worker-7bd6cd4c89-x6dht" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.586906 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e09487f3-5539-4df4-8b9b-6da0b0b741de-combined-ca-bundle\") pod \"barbican-keystone-listener-798f5f6896-mswxw\" (UID: \"e09487f3-5539-4df4-8b9b-6da0b0b741de\") " pod="openstack/barbican-keystone-listener-798f5f6896-mswxw" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.586945 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e09487f3-5539-4df4-8b9b-6da0b0b741de-config-data-custom\") pod \"barbican-keystone-listener-798f5f6896-mswxw\" (UID: \"e09487f3-5539-4df4-8b9b-6da0b0b741de\") " pod="openstack/barbican-keystone-listener-798f5f6896-mswxw" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.587010 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc83f9b6-fbea-4463-8127-08590404f021-config-data\") pod \"barbican-worker-7bd6cd4c89-x6dht\" (UID: \"dc83f9b6-fbea-4463-8127-08590404f021\") " pod="openstack/barbican-worker-7bd6cd4c89-x6dht" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.587078 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc83f9b6-fbea-4463-8127-08590404f021-logs\") pod \"barbican-worker-7bd6cd4c89-x6dht\" (UID: \"dc83f9b6-fbea-4463-8127-08590404f021\") " pod="openstack/barbican-worker-7bd6cd4c89-x6dht" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.587107 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e09487f3-5539-4df4-8b9b-6da0b0b741de-config-data\") pod \"barbican-keystone-listener-798f5f6896-mswxw\" (UID: \"e09487f3-5539-4df4-8b9b-6da0b0b741de\") " pod="openstack/barbican-keystone-listener-798f5f6896-mswxw" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.593185 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc83f9b6-fbea-4463-8127-08590404f021-combined-ca-bundle\") pod \"barbican-worker-7bd6cd4c89-x6dht\" (UID: \"dc83f9b6-fbea-4463-8127-08590404f021\") " pod="openstack/barbican-worker-7bd6cd4c89-x6dht" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.593364 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc83f9b6-fbea-4463-8127-08590404f021-logs\") pod \"barbican-worker-7bd6cd4c89-x6dht\" (UID: \"dc83f9b6-fbea-4463-8127-08590404f021\") " pod="openstack/barbican-worker-7bd6cd4c89-x6dht" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.604438 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc83f9b6-fbea-4463-8127-08590404f021-config-data-custom\") pod \"barbican-worker-7bd6cd4c89-x6dht\" (UID: \"dc83f9b6-fbea-4463-8127-08590404f021\") " pod="openstack/barbican-worker-7bd6cd4c89-x6dht" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.624233 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc83f9b6-fbea-4463-8127-08590404f021-config-data\") pod \"barbican-worker-7bd6cd4c89-x6dht\" (UID: \"dc83f9b6-fbea-4463-8127-08590404f021\") " pod="openstack/barbican-worker-7bd6cd4c89-x6dht" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.688597 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e09487f3-5539-4df4-8b9b-6da0b0b741de-combined-ca-bundle\") pod \"barbican-keystone-listener-798f5f6896-mswxw\" (UID: \"e09487f3-5539-4df4-8b9b-6da0b0b741de\") " pod="openstack/barbican-keystone-listener-798f5f6896-mswxw" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.688657 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e09487f3-5539-4df4-8b9b-6da0b0b741de-config-data-custom\") pod \"barbican-keystone-listener-798f5f6896-mswxw\" (UID: \"e09487f3-5539-4df4-8b9b-6da0b0b741de\") " pod="openstack/barbican-keystone-listener-798f5f6896-mswxw" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.688747 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e09487f3-5539-4df4-8b9b-6da0b0b741de-config-data\") pod \"barbican-keystone-listener-798f5f6896-mswxw\" (UID: \"e09487f3-5539-4df4-8b9b-6da0b0b741de\") " pod="openstack/barbican-keystone-listener-798f5f6896-mswxw" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.688767 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xj8j\" (UniqueName: \"kubernetes.io/projected/e09487f3-5539-4df4-8b9b-6da0b0b741de-kube-api-access-8xj8j\") pod \"barbican-keystone-listener-798f5f6896-mswxw\" (UID: \"e09487f3-5539-4df4-8b9b-6da0b0b741de\") " pod="openstack/barbican-keystone-listener-798f5f6896-mswxw" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.688803 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e09487f3-5539-4df4-8b9b-6da0b0b741de-logs\") pod \"barbican-keystone-listener-798f5f6896-mswxw\" (UID: \"e09487f3-5539-4df4-8b9b-6da0b0b741de\") " pod="openstack/barbican-keystone-listener-798f5f6896-mswxw" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.689286 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e09487f3-5539-4df4-8b9b-6da0b0b741de-logs\") pod \"barbican-keystone-listener-798f5f6896-mswxw\" (UID: \"e09487f3-5539-4df4-8b9b-6da0b0b741de\") " pod="openstack/barbican-keystone-listener-798f5f6896-mswxw" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.705365 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e09487f3-5539-4df4-8b9b-6da0b0b741de-config-data\") pod \"barbican-keystone-listener-798f5f6896-mswxw\" (UID: \"e09487f3-5539-4df4-8b9b-6da0b0b741de\") " pod="openstack/barbican-keystone-listener-798f5f6896-mswxw" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.723287 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e09487f3-5539-4df4-8b9b-6da0b0b741de-config-data-custom\") pod \"barbican-keystone-listener-798f5f6896-mswxw\" (UID: \"e09487f3-5539-4df4-8b9b-6da0b0b741de\") " pod="openstack/barbican-keystone-listener-798f5f6896-mswxw" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.730160 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztxds\" (UniqueName: \"kubernetes.io/projected/dc83f9b6-fbea-4463-8127-08590404f021-kube-api-access-ztxds\") pod \"barbican-worker-7bd6cd4c89-x6dht\" (UID: \"dc83f9b6-fbea-4463-8127-08590404f021\") " 
pod="openstack/barbican-worker-7bd6cd4c89-x6dht" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.734378 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e09487f3-5539-4df4-8b9b-6da0b0b741de-combined-ca-bundle\") pod \"barbican-keystone-listener-798f5f6896-mswxw\" (UID: \"e09487f3-5539-4df4-8b9b-6da0b0b741de\") " pod="openstack/barbican-keystone-listener-798f5f6896-mswxw" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.750239 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xj8j\" (UniqueName: \"kubernetes.io/projected/e09487f3-5539-4df4-8b9b-6da0b0b741de-kube-api-access-8xj8j\") pod \"barbican-keystone-listener-798f5f6896-mswxw\" (UID: \"e09487f3-5539-4df4-8b9b-6da0b0b741de\") " pod="openstack/barbican-keystone-listener-798f5f6896-mswxw" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.761211 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7bd6cd4c89-x6dht" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.777305 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49b7b3ea-3919-4d95-9fc8-138aef12ee08" path="/var/lib/kubelet/pods/49b7b3ea-3919-4d95-9fc8-138aef12ee08/volumes" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.778048 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404380-tn87q"] Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.778136 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-bp5cb"] Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.779542 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-bp5cb"] Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.779821 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-bp5cb" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.846484 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7dd865f898-v599c"] Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.848445 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7dd865f898-v599c" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.855384 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.883845 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-798f5f6896-mswxw" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.883951 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7dd865f898-v599c"] Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.894713 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622-config\") pod \"dnsmasq-dns-6bb4fc677f-bp5cb\" (UID: \"6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622\") " pod="openstack/dnsmasq-dns-6bb4fc677f-bp5cb" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.894771 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-bp5cb\" (UID: \"6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622\") " pod="openstack/dnsmasq-dns-6bb4fc677f-bp5cb" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.894804 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-bp5cb\" (UID: \"6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622\") " pod="openstack/dnsmasq-dns-6bb4fc677f-bp5cb" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.894864 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-bp5cb\" (UID: \"6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622\") " pod="openstack/dnsmasq-dns-6bb4fc677f-bp5cb" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.894906 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z58cj\" (UniqueName: \"kubernetes.io/projected/6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622-kube-api-access-z58cj\") pod \"dnsmasq-dns-6bb4fc677f-bp5cb\" (UID: \"6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622\") " pod="openstack/dnsmasq-dns-6bb4fc677f-bp5cb" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.894923 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-bp5cb\" (UID: \"6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622\") " pod="openstack/dnsmasq-dns-6bb4fc677f-bp5cb" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.907975 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d8d4694bd-z9zk4"] Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.998050 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/733ded25-88fc-4c78-9939-d983d7c473cf-config-data-custom\") pod \"barbican-api-7dd865f898-v599c\" (UID: \"733ded25-88fc-4c78-9939-d983d7c473cf\") " pod="openstack/barbican-api-7dd865f898-v599c" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.998155 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622-dns-swift-storage-0\") pod 
\"dnsmasq-dns-6bb4fc677f-bp5cb\" (UID: \"6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622\") " pod="openstack/dnsmasq-dns-6bb4fc677f-bp5cb" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.998214 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z58cj\" (UniqueName: \"kubernetes.io/projected/6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622-kube-api-access-z58cj\") pod \"dnsmasq-dns-6bb4fc677f-bp5cb\" (UID: \"6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622\") " pod="openstack/dnsmasq-dns-6bb4fc677f-bp5cb" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.998237 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/733ded25-88fc-4c78-9939-d983d7c473cf-logs\") pod \"barbican-api-7dd865f898-v599c\" (UID: \"733ded25-88fc-4c78-9939-d983d7c473cf\") " pod="openstack/barbican-api-7dd865f898-v599c" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.998254 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-bp5cb\" (UID: \"6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622\") " pod="openstack/dnsmasq-dns-6bb4fc677f-bp5cb" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.998289 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622-config\") pod \"dnsmasq-dns-6bb4fc677f-bp5cb\" (UID: \"6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622\") " pod="openstack/dnsmasq-dns-6bb4fc677f-bp5cb" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.998308 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/733ded25-88fc-4c78-9939-d983d7c473cf-combined-ca-bundle\") pod \"barbican-api-7dd865f898-v599c\" (UID: \"733ded25-88fc-4c78-9939-d983d7c473cf\") " pod="openstack/barbican-api-7dd865f898-v599c" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.998386 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-bp5cb\" (UID: \"6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622\") " pod="openstack/dnsmasq-dns-6bb4fc677f-bp5cb" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.998420 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/733ded25-88fc-4c78-9939-d983d7c473cf-config-data\") pod \"barbican-api-7dd865f898-v599c\" (UID: \"733ded25-88fc-4c78-9939-d983d7c473cf\") " pod="openstack/barbican-api-7dd865f898-v599c" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.998448 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-bp5cb\" (UID: \"6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622\") " pod="openstack/dnsmasq-dns-6bb4fc677f-bp5cb" Nov 27 17:00:06 crc kubenswrapper[4954]: I1127 17:00:06.998471 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkfqc\" (UniqueName: \"kubernetes.io/projected/733ded25-88fc-4c78-9939-d983d7c473cf-kube-api-access-pkfqc\") 
pod \"barbican-api-7dd865f898-v599c\" (UID: \"733ded25-88fc-4c78-9939-d983d7c473cf\") " pod="openstack/barbican-api-7dd865f898-v599c" Nov 27 17:00:07 crc kubenswrapper[4954]: I1127 17:00:07.008564 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-bp5cb\" (UID: \"6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622\") " pod="openstack/dnsmasq-dns-6bb4fc677f-bp5cb" Nov 27 17:00:07 crc kubenswrapper[4954]: I1127 17:00:07.016309 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-bp5cb\" (UID: \"6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622\") " pod="openstack/dnsmasq-dns-6bb4fc677f-bp5cb" Nov 27 17:00:07 crc kubenswrapper[4954]: I1127 17:00:07.016890 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622-config\") pod \"dnsmasq-dns-6bb4fc677f-bp5cb\" (UID: \"6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622\") " pod="openstack/dnsmasq-dns-6bb4fc677f-bp5cb" Nov 27 17:00:07 crc kubenswrapper[4954]: I1127 17:00:07.017653 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-bp5cb\" (UID: \"6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622\") " pod="openstack/dnsmasq-dns-6bb4fc677f-bp5cb" Nov 27 17:00:07 crc kubenswrapper[4954]: I1127 17:00:07.018313 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-bp5cb\" (UID: \"6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622\") " pod="openstack/dnsmasq-dns-6bb4fc677f-bp5cb" Nov 27 17:00:07 crc kubenswrapper[4954]: I1127 17:00:07.064491 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z58cj\" (UniqueName: \"kubernetes.io/projected/6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622-kube-api-access-z58cj\") pod \"dnsmasq-dns-6bb4fc677f-bp5cb\" (UID: \"6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622\") " pod="openstack/dnsmasq-dns-6bb4fc677f-bp5cb" Nov 27 17:00:07 crc kubenswrapper[4954]: I1127 17:00:07.104091 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkfqc\" (UniqueName: \"kubernetes.io/projected/733ded25-88fc-4c78-9939-d983d7c473cf-kube-api-access-pkfqc\") pod \"barbican-api-7dd865f898-v599c\" (UID: \"733ded25-88fc-4c78-9939-d983d7c473cf\") " pod="openstack/barbican-api-7dd865f898-v599c" Nov 27 17:00:07 crc kubenswrapper[4954]: I1127 17:00:07.104508 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/733ded25-88fc-4c78-9939-d983d7c473cf-config-data-custom\") pod \"barbican-api-7dd865f898-v599c\" (UID: \"733ded25-88fc-4c78-9939-d983d7c473cf\") " pod="openstack/barbican-api-7dd865f898-v599c" Nov 27 17:00:07 crc kubenswrapper[4954]: I1127 17:00:07.104658 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/733ded25-88fc-4c78-9939-d983d7c473cf-logs\") pod \"barbican-api-7dd865f898-v599c\" (UID: \"733ded25-88fc-4c78-9939-d983d7c473cf\") " 
pod="openstack/barbican-api-7dd865f898-v599c" Nov 27 17:00:07 crc kubenswrapper[4954]: I1127 17:00:07.104707 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/733ded25-88fc-4c78-9939-d983d7c473cf-combined-ca-bundle\") pod \"barbican-api-7dd865f898-v599c\" (UID: \"733ded25-88fc-4c78-9939-d983d7c473cf\") " pod="openstack/barbican-api-7dd865f898-v599c" Nov 27 17:00:07 crc kubenswrapper[4954]: I1127 17:00:07.104808 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/733ded25-88fc-4c78-9939-d983d7c473cf-config-data\") pod \"barbican-api-7dd865f898-v599c\" (UID: \"733ded25-88fc-4c78-9939-d983d7c473cf\") " pod="openstack/barbican-api-7dd865f898-v599c" Nov 27 17:00:07 crc kubenswrapper[4954]: I1127 17:00:07.113855 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/733ded25-88fc-4c78-9939-d983d7c473cf-config-data\") pod \"barbican-api-7dd865f898-v599c\" (UID: \"733ded25-88fc-4c78-9939-d983d7c473cf\") " pod="openstack/barbican-api-7dd865f898-v599c" Nov 27 17:00:07 crc kubenswrapper[4954]: I1127 17:00:07.114173 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/733ded25-88fc-4c78-9939-d983d7c473cf-logs\") pod \"barbican-api-7dd865f898-v599c\" (UID: \"733ded25-88fc-4c78-9939-d983d7c473cf\") " pod="openstack/barbican-api-7dd865f898-v599c" Nov 27 17:00:07 crc kubenswrapper[4954]: I1127 17:00:07.117954 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/733ded25-88fc-4c78-9939-d983d7c473cf-config-data-custom\") pod \"barbican-api-7dd865f898-v599c\" (UID: \"733ded25-88fc-4c78-9939-d983d7c473cf\") " pod="openstack/barbican-api-7dd865f898-v599c" Nov 27 17:00:07 crc kubenswrapper[4954]: E1127 17:00:07.118884 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="70a1a927-b24a-4da3-93f1-9dc67f75c4ba" Nov 27 17:00:07 crc kubenswrapper[4954]: I1127 17:00:07.134918 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/733ded25-88fc-4c78-9939-d983d7c473cf-combined-ca-bundle\") pod \"barbican-api-7dd865f898-v599c\" (UID: \"733ded25-88fc-4c78-9939-d983d7c473cf\") " pod="openstack/barbican-api-7dd865f898-v599c" Nov 27 17:00:07 crc kubenswrapper[4954]: I1127 17:00:07.136700 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkfqc\" (UniqueName: \"kubernetes.io/projected/733ded25-88fc-4c78-9939-d983d7c473cf-kube-api-access-pkfqc\") pod \"barbican-api-7dd865f898-v599c\" (UID: \"733ded25-88fc-4c78-9939-d983d7c473cf\") " pod="openstack/barbican-api-7dd865f898-v599c" Nov 27 17:00:07 crc kubenswrapper[4954]: I1127 17:00:07.236531 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-bp5cb" Nov 27 17:00:07 crc kubenswrapper[4954]: I1127 17:00:07.288092 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7dd865f898-v599c" Nov 27 17:00:07 crc kubenswrapper[4954]: I1127 17:00:07.516490 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8775748c9-fwtgk"] Nov 27 17:00:07 crc kubenswrapper[4954]: I1127 17:00:07.545225 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 27 17:00:07 crc kubenswrapper[4954]: I1127 17:00:07.690065 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 27 17:00:07 crc kubenswrapper[4954]: W1127 17:00:07.726364 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1263a5b8_5b99_4f6b_9a43_72532ef791da.slice/crio-83dacaa0c4c6dda897e4b2f0674cae5576279869032c9e992fc5f19d0ff1bfc3 WatchSource:0}: Error finding container 83dacaa0c4c6dda897e4b2f0674cae5576279869032c9e992fc5f19d0ff1bfc3: Status 404 returned error can't find the container with id 83dacaa0c4c6dda897e4b2f0674cae5576279869032c9e992fc5f19d0ff1bfc3 Nov 27 17:00:07 crc kubenswrapper[4954]: I1127 17:00:07.854446 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70a1a927-b24a-4da3-93f1-9dc67f75c4ba","Type":"ContainerStarted","Data":"224ba60a7f32d5d7be45bce3c0f5bb8c45baa15970608878e71b56db7020c35f"} Nov 27 17:00:07 crc kubenswrapper[4954]: I1127 17:00:07.854846 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="70a1a927-b24a-4da3-93f1-9dc67f75c4ba" containerName="ceilometer-notification-agent" containerID="cri-o://633d8889f9cefc99558e758667563a1550d81c6fe739e5cc9b4b8be68d4f31c9" gracePeriod=30 Nov 27 17:00:07 crc kubenswrapper[4954]: I1127 17:00:07.855113 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 27 17:00:07 crc kubenswrapper[4954]: I1127 17:00:07.855396 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="70a1a927-b24a-4da3-93f1-9dc67f75c4ba" containerName="proxy-httpd" containerID="cri-o://224ba60a7f32d5d7be45bce3c0f5bb8c45baa15970608878e71b56db7020c35f" gracePeriod=30 Nov 27 17:00:07 crc kubenswrapper[4954]: I1127 17:00:07.855437 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="70a1a927-b24a-4da3-93f1-9dc67f75c4ba" containerName="sg-core" containerID="cri-o://2f08b7ce7474b7e23ee72ce58fe803c1638962b91e1c59b50e005beb8b358208" gracePeriod=30 Nov 27 17:00:07 crc kubenswrapper[4954]: I1127 17:00:07.863013 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7bd6cd4c89-x6dht"] Nov 27 17:00:07 crc kubenswrapper[4954]: I1127 17:00:07.871251 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"528d738e-43f9-4b32-be5a-b557c9d94d63","Type":"ContainerStarted","Data":"78ff8f7feaf561f28485b56b07b8c5921c2a63310d7a9eb735c08179717c5139"} Nov 27 17:00:07 crc kubenswrapper[4954]: I1127 17:00:07.878902 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1263a5b8-5b99-4f6b-9a43-72532ef791da","Type":"ContainerStarted","Data":"83dacaa0c4c6dda897e4b2f0674cae5576279869032c9e992fc5f19d0ff1bfc3"} Nov 27 17:00:07 crc kubenswrapper[4954]: I1127 17:00:07.891226 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404380-tn87q" 
event={"ID":"35048694-881a-428c-b2c8-27e53edd4e5b","Type":"ContainerStarted","Data":"83afc27e906573031be7f63761aad1869f0937bdd327f7633071b47865e2aab7"} Nov 27 17:00:07 crc kubenswrapper[4954]: I1127 17:00:07.891273 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404380-tn87q" event={"ID":"35048694-881a-428c-b2c8-27e53edd4e5b","Type":"ContainerStarted","Data":"7426ea31792fdeb853e297d004b7a0b7ecd7275fe31b866742bec977dd35bc28"} Nov 27 17:00:07 crc kubenswrapper[4954]: W1127 17:00:07.906170 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode09487f3_5539_4df4_8b9b_6da0b0b741de.slice/crio-e3235f382dbb71e9442cf44b7e2f7bbfb7918e28b1502989f2867b2ca11eb535 WatchSource:0}: Error finding container e3235f382dbb71e9442cf44b7e2f7bbfb7918e28b1502989f2867b2ca11eb535: Status 404 returned error can't find the container with id e3235f382dbb71e9442cf44b7e2f7bbfb7918e28b1502989f2867b2ca11eb535 Nov 27 17:00:07 crc kubenswrapper[4954]: I1127 17:00:07.906445 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8775748c9-fwtgk" event={"ID":"75073c0f-2879-417d-a1a0-9721b37111cb","Type":"ContainerStarted","Data":"acee3f80b1a095df64696d57a8fda59bc03bbcfe4362bb52a5a5146885923b01"} Nov 27 17:00:07 crc kubenswrapper[4954]: I1127 17:00:07.934240 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d8d4694bd-z9zk4" event={"ID":"a4617263-6b9f-4f0c-af69-9d589143eb12","Type":"ContainerStarted","Data":"bee7c2f63d336730917f9cf04fca7ab5d4968fdf625a514d1aff38f1c71240fe"} Nov 27 17:00:07 crc kubenswrapper[4954]: I1127 17:00:07.934284 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d8d4694bd-z9zk4" event={"ID":"a4617263-6b9f-4f0c-af69-9d589143eb12","Type":"ContainerStarted","Data":"f4e1bbfada94d37fb34b5d68e8e846d8239b398eca6cd526630a089cb0877b27"} Nov 27 17:00:07 crc kubenswrapper[4954]: I1127 17:00:07.948964 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-798f5f6896-mswxw"] Nov 27 17:00:08 crc kubenswrapper[4954]: W1127 17:00:08.183044 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6cbaf412_0cf5_4f12_9c3e_ec1a6fe20622.slice/crio-d9c8a63769220c1cfb06f6220d5194b5be77176c9ad09c8a9daad9f2aa3c7455 WatchSource:0}: Error finding container d9c8a63769220c1cfb06f6220d5194b5be77176c9ad09c8a9daad9f2aa3c7455: Status 404 returned error can't find the container with id d9c8a63769220c1cfb06f6220d5194b5be77176c9ad09c8a9daad9f2aa3c7455 Nov 27 17:00:08 crc kubenswrapper[4954]: I1127 17:00:08.185587 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-bp5cb"] Nov 27 17:00:08 crc kubenswrapper[4954]: I1127 17:00:08.208110 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7dd865f898-v599c"] Nov 27 17:00:08 crc kubenswrapper[4954]: I1127 17:00:08.585800 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57c957c4ff-5zrkl" podUID="49b7b3ea-3919-4d95-9fc8-138aef12ee08" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.138:5353: i/o timeout" Nov 27 17:00:08 crc kubenswrapper[4954]: I1127 17:00:08.954942 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7bd6cd4c89-x6dht" 
event={"ID":"dc83f9b6-fbea-4463-8127-08590404f021","Type":"ContainerStarted","Data":"4bfd42c99d35717e2350635bc8011e555c5013bf03cd5697b6f87313d5d090e1"} Nov 27 17:00:08 crc kubenswrapper[4954]: I1127 17:00:08.968128 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d8d4694bd-z9zk4" event={"ID":"a4617263-6b9f-4f0c-af69-9d589143eb12","Type":"ContainerStarted","Data":"57c75ad86a747210b26ef5b3392307f3253264e2d742dcf51287a058aedb2ec7"} Nov 27 17:00:08 crc kubenswrapper[4954]: I1127 17:00:08.969160 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-d8d4694bd-z9zk4" Nov 27 17:00:08 crc kubenswrapper[4954]: I1127 17:00:08.969194 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-d8d4694bd-z9zk4" Nov 27 17:00:08 crc kubenswrapper[4954]: I1127 17:00:08.980321 4954 generic.go:334] "Generic (PLEG): container finished" podID="70a1a927-b24a-4da3-93f1-9dc67f75c4ba" containerID="224ba60a7f32d5d7be45bce3c0f5bb8c45baa15970608878e71b56db7020c35f" exitCode=0 Nov 27 17:00:08 crc kubenswrapper[4954]: I1127 17:00:08.980364 4954 generic.go:334] "Generic (PLEG): container finished" podID="70a1a927-b24a-4da3-93f1-9dc67f75c4ba" containerID="2f08b7ce7474b7e23ee72ce58fe803c1638962b91e1c59b50e005beb8b358208" exitCode=2 Nov 27 17:00:08 crc kubenswrapper[4954]: I1127 17:00:08.980418 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70a1a927-b24a-4da3-93f1-9dc67f75c4ba","Type":"ContainerDied","Data":"224ba60a7f32d5d7be45bce3c0f5bb8c45baa15970608878e71b56db7020c35f"} Nov 27 17:00:08 crc kubenswrapper[4954]: I1127 17:00:08.980450 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70a1a927-b24a-4da3-93f1-9dc67f75c4ba","Type":"ContainerDied","Data":"2f08b7ce7474b7e23ee72ce58fe803c1638962b91e1c59b50e005beb8b358208"} Nov 27 17:00:09 crc kubenswrapper[4954]: I1127 17:00:09.018040 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-d8d4694bd-z9zk4" podStartSLOduration=5.018009875 podStartE2EDuration="5.018009875s" podCreationTimestamp="2025-11-27 17:00:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 17:00:09.005104031 +0000 UTC m=+1321.022544331" watchObservedRunningTime="2025-11-27 17:00:09.018009875 +0000 UTC m=+1321.035450175" Nov 27 17:00:09 crc kubenswrapper[4954]: I1127 17:00:09.018468 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"528d738e-43f9-4b32-be5a-b557c9d94d63","Type":"ContainerStarted","Data":"6702c382089b2a2cf18100017564f33166df6fdf6628b9efdd555c3c01b55214"} Nov 27 17:00:09 crc kubenswrapper[4954]: I1127 17:00:09.021564 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-798f5f6896-mswxw" event={"ID":"e09487f3-5539-4df4-8b9b-6da0b0b741de","Type":"ContainerStarted","Data":"e3235f382dbb71e9442cf44b7e2f7bbfb7918e28b1502989f2867b2ca11eb535"} Nov 27 17:00:09 crc kubenswrapper[4954]: I1127 17:00:09.023533 4954 generic.go:334] "Generic (PLEG): container finished" podID="35048694-881a-428c-b2c8-27e53edd4e5b" containerID="83afc27e906573031be7f63761aad1869f0937bdd327f7633071b47865e2aab7" exitCode=0 Nov 27 17:00:09 crc kubenswrapper[4954]: I1127 17:00:09.023734 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29404380-tn87q" event={"ID":"35048694-881a-428c-b2c8-27e53edd4e5b","Type":"ContainerDied","Data":"83afc27e906573031be7f63761aad1869f0937bdd327f7633071b47865e2aab7"} Nov 27 17:00:09 crc kubenswrapper[4954]: I1127 17:00:09.035376 4954 generic.go:334] "Generic (PLEG): container finished" podID="75073c0f-2879-417d-a1a0-9721b37111cb" containerID="2af38ba85189c1fc90987a2e280583686a0c7d3d391b7dc2d66189d93f055823" exitCode=0 Nov 27 17:00:09 crc kubenswrapper[4954]: I1127 17:00:09.035541 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8775748c9-fwtgk" event={"ID":"75073c0f-2879-417d-a1a0-9721b37111cb","Type":"ContainerDied","Data":"2af38ba85189c1fc90987a2e280583686a0c7d3d391b7dc2d66189d93f055823"} Nov 27 17:00:09 crc kubenswrapper[4954]: I1127 17:00:09.041729 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7dd865f898-v599c" event={"ID":"733ded25-88fc-4c78-9939-d983d7c473cf","Type":"ContainerStarted","Data":"4a941ed82c13a5e0d3b29fad3e924aa553ec9ca74b9c15978a19138ee79bb1e0"} Nov 27 17:00:09 crc kubenswrapper[4954]: I1127 17:00:09.041786 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7dd865f898-v599c" event={"ID":"733ded25-88fc-4c78-9939-d983d7c473cf","Type":"ContainerStarted","Data":"702f22cf5179357fb10d6593cbc6ea581e9d6040f1f22b3712673bbf8e1863f4"} Nov 27 17:00:09 crc kubenswrapper[4954]: I1127 17:00:09.064491 4954 generic.go:334] "Generic (PLEG): container finished" podID="6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622" containerID="9c743ef97060da9fb87c2fb358eb8560978969d54e6dedf0927a940fd489e3d9" exitCode=0 Nov 27 17:00:09 crc kubenswrapper[4954]: I1127 17:00:09.064564 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-bp5cb" event={"ID":"6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622","Type":"ContainerDied","Data":"9c743ef97060da9fb87c2fb358eb8560978969d54e6dedf0927a940fd489e3d9"} Nov 27 17:00:09 crc kubenswrapper[4954]: I1127 17:00:09.064899 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-bp5cb" event={"ID":"6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622","Type":"ContainerStarted","Data":"d9c8a63769220c1cfb06f6220d5194b5be77176c9ad09c8a9daad9f2aa3c7455"} Nov 27 17:00:09 crc kubenswrapper[4954]: I1127 17:00:09.103567 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 27 17:00:10 crc kubenswrapper[4954]: I1127 17:00:10.087553 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"528d738e-43f9-4b32-be5a-b557c9d94d63","Type":"ContainerStarted","Data":"9deadd7a98d574b0e019378666104646d593d798502265f0e2fadceff0865304"} Nov 27 17:00:10 crc kubenswrapper[4954]: I1127 17:00:10.088056 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="528d738e-43f9-4b32-be5a-b557c9d94d63" containerName="cinder-api-log" containerID="cri-o://6702c382089b2a2cf18100017564f33166df6fdf6628b9efdd555c3c01b55214" gracePeriod=30 Nov 27 17:00:10 crc kubenswrapper[4954]: I1127 17:00:10.088284 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="528d738e-43f9-4b32-be5a-b557c9d94d63" containerName="cinder-api" containerID="cri-o://9deadd7a98d574b0e019378666104646d593d798502265f0e2fadceff0865304" gracePeriod=30 Nov 27 17:00:10 crc kubenswrapper[4954]: I1127 17:00:10.088351 4954 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/cinder-api-0" Nov 27 17:00:10 crc kubenswrapper[4954]: I1127 17:00:10.094638 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7dd865f898-v599c" event={"ID":"733ded25-88fc-4c78-9939-d983d7c473cf","Type":"ContainerStarted","Data":"139fcdccb0d864121342ec3d927e0adec84b53a8268a7c4f9b27f29d95c721d6"} Nov 27 17:00:10 crc kubenswrapper[4954]: I1127 17:00:10.095792 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7dd865f898-v599c" Nov 27 17:00:10 crc kubenswrapper[4954]: I1127 17:00:10.095838 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7dd865f898-v599c" Nov 27 17:00:10 crc kubenswrapper[4954]: I1127 17:00:10.109048 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404380-tn87q" event={"ID":"35048694-881a-428c-b2c8-27e53edd4e5b","Type":"ContainerDied","Data":"7426ea31792fdeb853e297d004b7a0b7ecd7275fe31b866742bec977dd35bc28"} Nov 27 17:00:10 crc kubenswrapper[4954]: I1127 17:00:10.109091 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7426ea31792fdeb853e297d004b7a0b7ecd7275fe31b866742bec977dd35bc28" Nov 27 17:00:10 crc kubenswrapper[4954]: I1127 17:00:10.117187 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.117168972 podStartE2EDuration="4.117168972s" podCreationTimestamp="2025-11-27 17:00:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 17:00:10.10677911 +0000 UTC m=+1322.124219420" watchObservedRunningTime="2025-11-27 17:00:10.117168972 +0000 UTC m=+1322.134609262" Nov 27 17:00:10 crc kubenswrapper[4954]: I1127 17:00:10.120200 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8775748c9-fwtgk" event={"ID":"75073c0f-2879-417d-a1a0-9721b37111cb","Type":"ContainerDied","Data":"acee3f80b1a095df64696d57a8fda59bc03bbcfe4362bb52a5a5146885923b01"} Nov 27 17:00:10 crc kubenswrapper[4954]: I1127 17:00:10.120237 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acee3f80b1a095df64696d57a8fda59bc03bbcfe4362bb52a5a5146885923b01" Nov 27 17:00:10 crc kubenswrapper[4954]: I1127 17:00:10.129401 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7dd865f898-v599c" podStartSLOduration=4.129382019 podStartE2EDuration="4.129382019s" podCreationTimestamp="2025-11-27 17:00:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 17:00:10.127703967 +0000 UTC m=+1322.145144277" watchObservedRunningTime="2025-11-27 17:00:10.129382019 +0000 UTC m=+1322.146822319" Nov 27 17:00:10 crc kubenswrapper[4954]: I1127 17:00:10.200733 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8775748c9-fwtgk" Nov 27 17:00:10 crc kubenswrapper[4954]: I1127 17:00:10.205366 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404380-tn87q" Nov 27 17:00:10 crc kubenswrapper[4954]: I1127 17:00:10.293038 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75073c0f-2879-417d-a1a0-9721b37111cb-config\") pod \"75073c0f-2879-417d-a1a0-9721b37111cb\" (UID: \"75073c0f-2879-417d-a1a0-9721b37111cb\") " Nov 27 17:00:10 crc kubenswrapper[4954]: I1127 17:00:10.293122 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/35048694-881a-428c-b2c8-27e53edd4e5b-config-volume\") pod \"35048694-881a-428c-b2c8-27e53edd4e5b\" (UID: \"35048694-881a-428c-b2c8-27e53edd4e5b\") " Nov 27 17:00:10 crc kubenswrapper[4954]: I1127 17:00:10.293143 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75073c0f-2879-417d-a1a0-9721b37111cb-dns-svc\") pod \"75073c0f-2879-417d-a1a0-9721b37111cb\" (UID: \"75073c0f-2879-417d-a1a0-9721b37111cb\") " Nov 27 17:00:10 crc kubenswrapper[4954]: I1127 17:00:10.293167 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtcb6\" (UniqueName: \"kubernetes.io/projected/75073c0f-2879-417d-a1a0-9721b37111cb-kube-api-access-vtcb6\") pod \"75073c0f-2879-417d-a1a0-9721b37111cb\" (UID: \"75073c0f-2879-417d-a1a0-9721b37111cb\") " Nov 27 17:00:10 crc kubenswrapper[4954]: I1127 17:00:10.293207 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/75073c0f-2879-417d-a1a0-9721b37111cb-dns-swift-storage-0\") pod \"75073c0f-2879-417d-a1a0-9721b37111cb\" (UID: \"75073c0f-2879-417d-a1a0-9721b37111cb\") " Nov 27 17:00:10 crc kubenswrapper[4954]: I1127 17:00:10.293227 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k78lp\" (UniqueName: \"kubernetes.io/projected/35048694-881a-428c-b2c8-27e53edd4e5b-kube-api-access-k78lp\") pod \"35048694-881a-428c-b2c8-27e53edd4e5b\" (UID: \"35048694-881a-428c-b2c8-27e53edd4e5b\") " Nov 27 17:00:10 crc kubenswrapper[4954]: I1127 17:00:10.293309 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/35048694-881a-428c-b2c8-27e53edd4e5b-secret-volume\") pod \"35048694-881a-428c-b2c8-27e53edd4e5b\" (UID: \"35048694-881a-428c-b2c8-27e53edd4e5b\") " Nov 27 17:00:10 crc kubenswrapper[4954]: I1127 17:00:10.293334 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75073c0f-2879-417d-a1a0-9721b37111cb-ovsdbserver-sb\") pod \"75073c0f-2879-417d-a1a0-9721b37111cb\" (UID: \"75073c0f-2879-417d-a1a0-9721b37111cb\") " Nov 27 17:00:10 crc kubenswrapper[4954]: I1127 17:00:10.293353 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75073c0f-2879-417d-a1a0-9721b37111cb-ovsdbserver-nb\") pod \"75073c0f-2879-417d-a1a0-9721b37111cb\" (UID: \"75073c0f-2879-417d-a1a0-9721b37111cb\") " Nov 27 17:00:10 crc kubenswrapper[4954]: I1127 17:00:10.294431 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35048694-881a-428c-b2c8-27e53edd4e5b-config-volume" (OuterVolumeSpecName: "config-volume") pod 
"35048694-881a-428c-b2c8-27e53edd4e5b" (UID: "35048694-881a-428c-b2c8-27e53edd4e5b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:00:10 crc kubenswrapper[4954]: I1127 17:00:10.296635 4954 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/35048694-881a-428c-b2c8-27e53edd4e5b-config-volume\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:10 crc kubenswrapper[4954]: I1127 17:00:10.298166 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35048694-881a-428c-b2c8-27e53edd4e5b-kube-api-access-k78lp" (OuterVolumeSpecName: "kube-api-access-k78lp") pod "35048694-881a-428c-b2c8-27e53edd4e5b" (UID: "35048694-881a-428c-b2c8-27e53edd4e5b"). InnerVolumeSpecName "kube-api-access-k78lp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:00:10 crc kubenswrapper[4954]: I1127 17:00:10.298720 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35048694-881a-428c-b2c8-27e53edd4e5b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "35048694-881a-428c-b2c8-27e53edd4e5b" (UID: "35048694-881a-428c-b2c8-27e53edd4e5b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:00:10 crc kubenswrapper[4954]: I1127 17:00:10.302835 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75073c0f-2879-417d-a1a0-9721b37111cb-kube-api-access-vtcb6" (OuterVolumeSpecName: "kube-api-access-vtcb6") pod "75073c0f-2879-417d-a1a0-9721b37111cb" (UID: "75073c0f-2879-417d-a1a0-9721b37111cb"). InnerVolumeSpecName "kube-api-access-vtcb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:00:10 crc kubenswrapper[4954]: I1127 17:00:10.331216 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75073c0f-2879-417d-a1a0-9721b37111cb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "75073c0f-2879-417d-a1a0-9721b37111cb" (UID: "75073c0f-2879-417d-a1a0-9721b37111cb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:00:10 crc kubenswrapper[4954]: I1127 17:00:10.331252 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75073c0f-2879-417d-a1a0-9721b37111cb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "75073c0f-2879-417d-a1a0-9721b37111cb" (UID: "75073c0f-2879-417d-a1a0-9721b37111cb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:00:10 crc kubenswrapper[4954]: I1127 17:00:10.331527 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75073c0f-2879-417d-a1a0-9721b37111cb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "75073c0f-2879-417d-a1a0-9721b37111cb" (UID: "75073c0f-2879-417d-a1a0-9721b37111cb"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:00:10 crc kubenswrapper[4954]: I1127 17:00:10.331556 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75073c0f-2879-417d-a1a0-9721b37111cb-config" (OuterVolumeSpecName: "config") pod "75073c0f-2879-417d-a1a0-9721b37111cb" (UID: "75073c0f-2879-417d-a1a0-9721b37111cb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:00:10 crc kubenswrapper[4954]: I1127 17:00:10.341352 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75073c0f-2879-417d-a1a0-9721b37111cb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "75073c0f-2879-417d-a1a0-9721b37111cb" (UID: "75073c0f-2879-417d-a1a0-9721b37111cb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:00:10 crc kubenswrapper[4954]: I1127 17:00:10.397994 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75073c0f-2879-417d-a1a0-9721b37111cb-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:10 crc kubenswrapper[4954]: I1127 17:00:10.398351 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtcb6\" (UniqueName: \"kubernetes.io/projected/75073c0f-2879-417d-a1a0-9721b37111cb-kube-api-access-vtcb6\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:10 crc kubenswrapper[4954]: I1127 17:00:10.398368 4954 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/75073c0f-2879-417d-a1a0-9721b37111cb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:10 crc kubenswrapper[4954]: I1127 17:00:10.398380 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k78lp\" (UniqueName: \"kubernetes.io/projected/35048694-881a-428c-b2c8-27e53edd4e5b-kube-api-access-k78lp\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:10 crc kubenswrapper[4954]: I1127 17:00:10.398392 4954 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/35048694-881a-428c-b2c8-27e53edd4e5b-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:10 crc kubenswrapper[4954]: I1127 17:00:10.398404 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75073c0f-2879-417d-a1a0-9721b37111cb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:10 crc kubenswrapper[4954]: I1127 17:00:10.398417 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75073c0f-2879-417d-a1a0-9721b37111cb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:10 crc kubenswrapper[4954]: I1127 17:00:10.398429 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75073c0f-2879-417d-a1a0-9721b37111cb-config\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:11 crc kubenswrapper[4954]: I1127 17:00:11.136644 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7bd6cd4c89-x6dht" event={"ID":"dc83f9b6-fbea-4463-8127-08590404f021","Type":"ContainerStarted","Data":"dc562b9cb7105820103a6d60a1f0d260af3151f0c670340e441efb7d4a94a29e"} Nov 27 17:00:11 crc kubenswrapper[4954]: I1127 17:00:11.140388 4954 generic.go:334] "Generic (PLEG): container finished" podID="528d738e-43f9-4b32-be5a-b557c9d94d63" containerID="6702c382089b2a2cf18100017564f33166df6fdf6628b9efdd555c3c01b55214" exitCode=143 Nov 27 17:00:11 crc kubenswrapper[4954]: I1127 17:00:11.140425 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"528d738e-43f9-4b32-be5a-b557c9d94d63","Type":"ContainerDied","Data":"6702c382089b2a2cf18100017564f33166df6fdf6628b9efdd555c3c01b55214"} Nov 27 17:00:11 crc 
kubenswrapper[4954]: I1127 17:00:11.144113 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-bp5cb" event={"ID":"6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622","Type":"ContainerStarted","Data":"32788bed775a0ee391ceae4acaebc9d6f12eee80046f5dd9d126e2fbdb50616c"} Nov 27 17:00:11 crc kubenswrapper[4954]: I1127 17:00:11.144520 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bb4fc677f-bp5cb" Nov 27 17:00:11 crc kubenswrapper[4954]: I1127 17:00:11.146324 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-798f5f6896-mswxw" event={"ID":"e09487f3-5539-4df4-8b9b-6da0b0b741de","Type":"ContainerStarted","Data":"dd54a5c2db259e47f69eb4b026e289f6280e3f038995e60df22c56e334e91041"} Nov 27 17:00:11 crc kubenswrapper[4954]: I1127 17:00:11.146462 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8775748c9-fwtgk" Nov 27 17:00:11 crc kubenswrapper[4954]: I1127 17:00:11.146516 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404380-tn87q" Nov 27 17:00:11 crc kubenswrapper[4954]: I1127 17:00:11.183403 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bb4fc677f-bp5cb" podStartSLOduration=5.18337289 podStartE2EDuration="5.18337289s" podCreationTimestamp="2025-11-27 17:00:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 17:00:11.167178907 +0000 UTC m=+1323.184619227" watchObservedRunningTime="2025-11-27 17:00:11.18337289 +0000 UTC m=+1323.200813190" Nov 27 17:00:11 crc kubenswrapper[4954]: I1127 17:00:11.237627 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8775748c9-fwtgk"] Nov 27 17:00:11 crc kubenswrapper[4954]: I1127 17:00:11.250592 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8775748c9-fwtgk"] Nov 27 17:00:12 crc kubenswrapper[4954]: I1127 17:00:12.203219 4954 generic.go:334] "Generic (PLEG): container finished" podID="70a1a927-b24a-4da3-93f1-9dc67f75c4ba" containerID="633d8889f9cefc99558e758667563a1550d81c6fe739e5cc9b4b8be68d4f31c9" exitCode=0 Nov 27 17:00:12 crc kubenswrapper[4954]: I1127 17:00:12.203298 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70a1a927-b24a-4da3-93f1-9dc67f75c4ba","Type":"ContainerDied","Data":"633d8889f9cefc99558e758667563a1550d81c6fe739e5cc9b4b8be68d4f31c9"} Nov 27 17:00:12 crc kubenswrapper[4954]: I1127 17:00:12.265869 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1263a5b8-5b99-4f6b-9a43-72532ef791da","Type":"ContainerStarted","Data":"18b284cdef19cf4f8a5e54d8c66817930c009024cb16eaa404be97257b2636bf"} Nov 27 17:00:12 crc kubenswrapper[4954]: I1127 17:00:12.266206 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1263a5b8-5b99-4f6b-9a43-72532ef791da","Type":"ContainerStarted","Data":"0a75479b703b924d41d80d3450f177c0402e7d5514657819e3827ea9858e489d"} Nov 27 17:00:12 crc kubenswrapper[4954]: I1127 17:00:12.304852 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-798f5f6896-mswxw" 
event={"ID":"e09487f3-5539-4df4-8b9b-6da0b0b741de","Type":"ContainerStarted","Data":"8aa9745d95713fc6c59d0174104536331ea0420d406bc39f4c56c1e30ab56d0d"} Nov 27 17:00:12 crc kubenswrapper[4954]: I1127 17:00:12.318873 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.996443002 podStartE2EDuration="7.31885019s" podCreationTimestamp="2025-11-27 17:00:05 +0000 UTC" firstStartedPulling="2025-11-27 17:00:07.757662873 +0000 UTC m=+1319.775103173" lastFinishedPulling="2025-11-27 17:00:10.080070061 +0000 UTC m=+1322.097510361" observedRunningTime="2025-11-27 17:00:12.301020857 +0000 UTC m=+1324.318461157" watchObservedRunningTime="2025-11-27 17:00:12.31885019 +0000 UTC m=+1324.336290490" Nov 27 17:00:12 crc kubenswrapper[4954]: I1127 17:00:12.346023 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7bd6cd4c89-x6dht" event={"ID":"dc83f9b6-fbea-4463-8127-08590404f021","Type":"ContainerStarted","Data":"fb1978058d97395b97da66888dc5f9a4b9077601a051b4bf2cb07810be9ef26a"} Nov 27 17:00:12 crc kubenswrapper[4954]: I1127 17:00:12.381413 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-798f5f6896-mswxw" podStartSLOduration=3.604423773 podStartE2EDuration="6.381390859s" podCreationTimestamp="2025-11-27 17:00:06 +0000 UTC" firstStartedPulling="2025-11-27 17:00:07.921947351 +0000 UTC m=+1319.939387651" lastFinishedPulling="2025-11-27 17:00:10.698914437 +0000 UTC m=+1322.716354737" observedRunningTime="2025-11-27 17:00:12.366800315 +0000 UTC m=+1324.384240615" watchObservedRunningTime="2025-11-27 17:00:12.381390859 +0000 UTC m=+1324.398831159" Nov 27 17:00:12 crc kubenswrapper[4954]: I1127 17:00:12.419066 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7bd6cd4c89-x6dht" podStartSLOduration=3.4784459229999998 podStartE2EDuration="6.419044453s" podCreationTimestamp="2025-11-27 17:00:06 +0000 UTC" firstStartedPulling="2025-11-27 17:00:07.916812186 +0000 UTC m=+1319.934252486" lastFinishedPulling="2025-11-27 17:00:10.857410716 +0000 UTC m=+1322.874851016" observedRunningTime="2025-11-27 17:00:12.398455733 +0000 UTC m=+1324.415896033" watchObservedRunningTime="2025-11-27 17:00:12.419044453 +0000 UTC m=+1324.436484753" Nov 27 17:00:12 crc kubenswrapper[4954]: I1127 17:00:12.692173 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75073c0f-2879-417d-a1a0-9721b37111cb" path="/var/lib/kubelet/pods/75073c0f-2879-417d-a1a0-9721b37111cb/volumes" Nov 27 17:00:12 crc kubenswrapper[4954]: I1127 17:00:12.721393 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 27 17:00:12 crc kubenswrapper[4954]: I1127 17:00:12.870892 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70a1a927-b24a-4da3-93f1-9dc67f75c4ba-run-httpd\") pod \"70a1a927-b24a-4da3-93f1-9dc67f75c4ba\" (UID: \"70a1a927-b24a-4da3-93f1-9dc67f75c4ba\") " Nov 27 17:00:12 crc kubenswrapper[4954]: I1127 17:00:12.870952 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70a1a927-b24a-4da3-93f1-9dc67f75c4ba-sg-core-conf-yaml\") pod \"70a1a927-b24a-4da3-93f1-9dc67f75c4ba\" (UID: \"70a1a927-b24a-4da3-93f1-9dc67f75c4ba\") " Nov 27 17:00:12 crc kubenswrapper[4954]: I1127 17:00:12.870981 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70a1a927-b24a-4da3-93f1-9dc67f75c4ba-scripts\") pod \"70a1a927-b24a-4da3-93f1-9dc67f75c4ba\" (UID: \"70a1a927-b24a-4da3-93f1-9dc67f75c4ba\") " Nov 27 17:00:12 crc kubenswrapper[4954]: I1127 17:00:12.871059 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70a1a927-b24a-4da3-93f1-9dc67f75c4ba-log-httpd\") pod \"70a1a927-b24a-4da3-93f1-9dc67f75c4ba\" (UID: \"70a1a927-b24a-4da3-93f1-9dc67f75c4ba\") " Nov 27 17:00:12 crc kubenswrapper[4954]: I1127 17:00:12.871185 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70a1a927-b24a-4da3-93f1-9dc67f75c4ba-combined-ca-bundle\") pod \"70a1a927-b24a-4da3-93f1-9dc67f75c4ba\" (UID: \"70a1a927-b24a-4da3-93f1-9dc67f75c4ba\") " Nov 27 17:00:12 crc kubenswrapper[4954]: I1127 17:00:12.871276 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwvpn\" (UniqueName: \"kubernetes.io/projected/70a1a927-b24a-4da3-93f1-9dc67f75c4ba-kube-api-access-gwvpn\") pod \"70a1a927-b24a-4da3-93f1-9dc67f75c4ba\" (UID: \"70a1a927-b24a-4da3-93f1-9dc67f75c4ba\") " Nov 27 17:00:12 crc kubenswrapper[4954]: I1127 17:00:12.871315 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70a1a927-b24a-4da3-93f1-9dc67f75c4ba-config-data\") pod \"70a1a927-b24a-4da3-93f1-9dc67f75c4ba\" (UID: \"70a1a927-b24a-4da3-93f1-9dc67f75c4ba\") " Nov 27 17:00:12 crc kubenswrapper[4954]: I1127 17:00:12.871754 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70a1a927-b24a-4da3-93f1-9dc67f75c4ba-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "70a1a927-b24a-4da3-93f1-9dc67f75c4ba" (UID: "70a1a927-b24a-4da3-93f1-9dc67f75c4ba"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:00:12 crc kubenswrapper[4954]: I1127 17:00:12.872056 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70a1a927-b24a-4da3-93f1-9dc67f75c4ba-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "70a1a927-b24a-4da3-93f1-9dc67f75c4ba" (UID: "70a1a927-b24a-4da3-93f1-9dc67f75c4ba"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:00:12 crc kubenswrapper[4954]: I1127 17:00:12.885170 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70a1a927-b24a-4da3-93f1-9dc67f75c4ba-scripts" (OuterVolumeSpecName: "scripts") pod "70a1a927-b24a-4da3-93f1-9dc67f75c4ba" (UID: "70a1a927-b24a-4da3-93f1-9dc67f75c4ba"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:00:12 crc kubenswrapper[4954]: I1127 17:00:12.910827 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70a1a927-b24a-4da3-93f1-9dc67f75c4ba-kube-api-access-gwvpn" (OuterVolumeSpecName: "kube-api-access-gwvpn") pod "70a1a927-b24a-4da3-93f1-9dc67f75c4ba" (UID: "70a1a927-b24a-4da3-93f1-9dc67f75c4ba"). InnerVolumeSpecName "kube-api-access-gwvpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:00:12 crc kubenswrapper[4954]: I1127 17:00:12.946673 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70a1a927-b24a-4da3-93f1-9dc67f75c4ba-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "70a1a927-b24a-4da3-93f1-9dc67f75c4ba" (UID: "70a1a927-b24a-4da3-93f1-9dc67f75c4ba"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:00:12 crc kubenswrapper[4954]: I1127 17:00:12.973768 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwvpn\" (UniqueName: \"kubernetes.io/projected/70a1a927-b24a-4da3-93f1-9dc67f75c4ba-kube-api-access-gwvpn\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:12 crc kubenswrapper[4954]: I1127 17:00:12.973798 4954 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70a1a927-b24a-4da3-93f1-9dc67f75c4ba-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:12 crc kubenswrapper[4954]: I1127 17:00:12.973808 4954 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70a1a927-b24a-4da3-93f1-9dc67f75c4ba-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:12 crc kubenswrapper[4954]: I1127 17:00:12.973818 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70a1a927-b24a-4da3-93f1-9dc67f75c4ba-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:12 crc kubenswrapper[4954]: I1127 17:00:12.973828 4954 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70a1a927-b24a-4da3-93f1-9dc67f75c4ba-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:12 crc kubenswrapper[4954]: I1127 17:00:12.973911 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70a1a927-b24a-4da3-93f1-9dc67f75c4ba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "70a1a927-b24a-4da3-93f1-9dc67f75c4ba" (UID: "70a1a927-b24a-4da3-93f1-9dc67f75c4ba"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:00:12 crc kubenswrapper[4954]: I1127 17:00:12.987893 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7c4fd9778-zrzw7"] Nov 27 17:00:12 crc kubenswrapper[4954]: E1127 17:00:12.989029 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70a1a927-b24a-4da3-93f1-9dc67f75c4ba" containerName="ceilometer-notification-agent" Nov 27 17:00:12 crc kubenswrapper[4954]: I1127 17:00:12.989144 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="70a1a927-b24a-4da3-93f1-9dc67f75c4ba" containerName="ceilometer-notification-agent" Nov 27 17:00:12 crc kubenswrapper[4954]: E1127 17:00:12.989223 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70a1a927-b24a-4da3-93f1-9dc67f75c4ba" containerName="proxy-httpd" Nov 27 17:00:12 crc kubenswrapper[4954]: I1127 17:00:12.989288 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="70a1a927-b24a-4da3-93f1-9dc67f75c4ba" containerName="proxy-httpd" Nov 27 17:00:12 crc kubenswrapper[4954]: E1127 17:00:12.989368 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75073c0f-2879-417d-a1a0-9721b37111cb" containerName="init" Nov 27 17:00:12 crc kubenswrapper[4954]: I1127 17:00:12.989524 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="75073c0f-2879-417d-a1a0-9721b37111cb" containerName="init" Nov 27 17:00:12 crc kubenswrapper[4954]: E1127 17:00:12.989736 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70a1a927-b24a-4da3-93f1-9dc67f75c4ba" containerName="sg-core" Nov 27 17:00:12 crc kubenswrapper[4954]: I1127 17:00:12.989818 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="70a1a927-b24a-4da3-93f1-9dc67f75c4ba" containerName="sg-core" Nov 27 17:00:12 crc kubenswrapper[4954]: E1127 17:00:12.989898 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35048694-881a-428c-b2c8-27e53edd4e5b" containerName="collect-profiles" Nov 27 17:00:12 crc kubenswrapper[4954]: I1127 17:00:12.989988 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="35048694-881a-428c-b2c8-27e53edd4e5b" containerName="collect-profiles" Nov 27 17:00:12 crc kubenswrapper[4954]: I1127 17:00:12.990362 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="70a1a927-b24a-4da3-93f1-9dc67f75c4ba" containerName="ceilometer-notification-agent" Nov 27 17:00:12 crc kubenswrapper[4954]: I1127 17:00:12.990455 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="70a1a927-b24a-4da3-93f1-9dc67f75c4ba" containerName="proxy-httpd" Nov 27 17:00:12 crc kubenswrapper[4954]: I1127 17:00:12.990570 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="75073c0f-2879-417d-a1a0-9721b37111cb" containerName="init" Nov 27 17:00:12 crc kubenswrapper[4954]: I1127 17:00:12.990686 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="35048694-881a-428c-b2c8-27e53edd4e5b" containerName="collect-profiles" Nov 27 17:00:12 crc kubenswrapper[4954]: I1127 17:00:12.990769 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="70a1a927-b24a-4da3-93f1-9dc67f75c4ba" containerName="sg-core" Nov 27 17:00:12 crc kubenswrapper[4954]: I1127 17:00:12.992884 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7c4fd9778-zrzw7" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:12.995462 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7c4fd9778-zrzw7"] Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.000420 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.000606 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.005266 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70a1a927-b24a-4da3-93f1-9dc67f75c4ba-config-data" (OuterVolumeSpecName: "config-data") pod "70a1a927-b24a-4da3-93f1-9dc67f75c4ba" (UID: "70a1a927-b24a-4da3-93f1-9dc67f75c4ba"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.010643 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6d65d5b797-gbgfp" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.075493 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c11e3407-a026-4236-97b2-e2afbcd50035-scripts\") pod \"c11e3407-a026-4236-97b2-e2afbcd50035\" (UID: \"c11e3407-a026-4236-97b2-e2afbcd50035\") " Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.075820 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c11e3407-a026-4236-97b2-e2afbcd50035-logs\") pod \"c11e3407-a026-4236-97b2-e2afbcd50035\" (UID: \"c11e3407-a026-4236-97b2-e2afbcd50035\") " Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.075943 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74lwt\" (UniqueName: \"kubernetes.io/projected/c11e3407-a026-4236-97b2-e2afbcd50035-kube-api-access-74lwt\") pod \"c11e3407-a026-4236-97b2-e2afbcd50035\" (UID: \"c11e3407-a026-4236-97b2-e2afbcd50035\") " Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.076053 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c11e3407-a026-4236-97b2-e2afbcd50035-config-data\") pod \"c11e3407-a026-4236-97b2-e2afbcd50035\" (UID: \"c11e3407-a026-4236-97b2-e2afbcd50035\") " Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.076152 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c11e3407-a026-4236-97b2-e2afbcd50035-horizon-secret-key\") pod \"c11e3407-a026-4236-97b2-e2afbcd50035\" (UID: \"c11e3407-a026-4236-97b2-e2afbcd50035\") " Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.076529 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mzrd\" (UniqueName: \"kubernetes.io/projected/3e0b062d-ff7b-4acc-8857-f463ec1bc195-kube-api-access-6mzrd\") pod \"barbican-api-7c4fd9778-zrzw7\" (UID: \"3e0b062d-ff7b-4acc-8857-f463ec1bc195\") " pod="openstack/barbican-api-7c4fd9778-zrzw7" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.076688 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3e0b062d-ff7b-4acc-8857-f463ec1bc195-logs\") pod \"barbican-api-7c4fd9778-zrzw7\" (UID: \"3e0b062d-ff7b-4acc-8857-f463ec1bc195\") " pod="openstack/barbican-api-7c4fd9778-zrzw7" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.076805 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e0b062d-ff7b-4acc-8857-f463ec1bc195-internal-tls-certs\") pod \"barbican-api-7c4fd9778-zrzw7\" (UID: \"3e0b062d-ff7b-4acc-8857-f463ec1bc195\") " pod="openstack/barbican-api-7c4fd9778-zrzw7" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.076915 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e0b062d-ff7b-4acc-8857-f463ec1bc195-combined-ca-bundle\") pod \"barbican-api-7c4fd9778-zrzw7\" (UID: \"3e0b062d-ff7b-4acc-8857-f463ec1bc195\") " pod="openstack/barbican-api-7c4fd9778-zrzw7" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.077007 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e0b062d-ff7b-4acc-8857-f463ec1bc195-config-data-custom\") pod \"barbican-api-7c4fd9778-zrzw7\" (UID: \"3e0b062d-ff7b-4acc-8857-f463ec1bc195\") " pod="openstack/barbican-api-7c4fd9778-zrzw7" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.077079 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e0b062d-ff7b-4acc-8857-f463ec1bc195-public-tls-certs\") pod \"barbican-api-7c4fd9778-zrzw7\" (UID: \"3e0b062d-ff7b-4acc-8857-f463ec1bc195\") " pod="openstack/barbican-api-7c4fd9778-zrzw7" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.077199 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e0b062d-ff7b-4acc-8857-f463ec1bc195-config-data\") pod \"barbican-api-7c4fd9778-zrzw7\" (UID: \"3e0b062d-ff7b-4acc-8857-f463ec1bc195\") " pod="openstack/barbican-api-7c4fd9778-zrzw7" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.077310 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70a1a927-b24a-4da3-93f1-9dc67f75c4ba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.077371 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70a1a927-b24a-4da3-93f1-9dc67f75c4ba-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.077569 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c11e3407-a026-4236-97b2-e2afbcd50035-logs" (OuterVolumeSpecName: "logs") pod "c11e3407-a026-4236-97b2-e2afbcd50035" (UID: "c11e3407-a026-4236-97b2-e2afbcd50035"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.083362 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c11e3407-a026-4236-97b2-e2afbcd50035-kube-api-access-74lwt" (OuterVolumeSpecName: "kube-api-access-74lwt") pod "c11e3407-a026-4236-97b2-e2afbcd50035" (UID: "c11e3407-a026-4236-97b2-e2afbcd50035"). InnerVolumeSpecName "kube-api-access-74lwt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.103459 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c11e3407-a026-4236-97b2-e2afbcd50035-scripts" (OuterVolumeSpecName: "scripts") pod "c11e3407-a026-4236-97b2-e2afbcd50035" (UID: "c11e3407-a026-4236-97b2-e2afbcd50035"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.116061 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c11e3407-a026-4236-97b2-e2afbcd50035-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "c11e3407-a026-4236-97b2-e2afbcd50035" (UID: "c11e3407-a026-4236-97b2-e2afbcd50035"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.121647 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c11e3407-a026-4236-97b2-e2afbcd50035-config-data" (OuterVolumeSpecName: "config-data") pod "c11e3407-a026-4236-97b2-e2afbcd50035" (UID: "c11e3407-a026-4236-97b2-e2afbcd50035"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.178670 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e0b062d-ff7b-4acc-8857-f463ec1bc195-logs\") pod \"barbican-api-7c4fd9778-zrzw7\" (UID: \"3e0b062d-ff7b-4acc-8857-f463ec1bc195\") " pod="openstack/barbican-api-7c4fd9778-zrzw7" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.178783 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e0b062d-ff7b-4acc-8857-f463ec1bc195-internal-tls-certs\") pod \"barbican-api-7c4fd9778-zrzw7\" (UID: \"3e0b062d-ff7b-4acc-8857-f463ec1bc195\") " pod="openstack/barbican-api-7c4fd9778-zrzw7" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.178817 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e0b062d-ff7b-4acc-8857-f463ec1bc195-combined-ca-bundle\") pod \"barbican-api-7c4fd9778-zrzw7\" (UID: \"3e0b062d-ff7b-4acc-8857-f463ec1bc195\") " pod="openstack/barbican-api-7c4fd9778-zrzw7" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.178839 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e0b062d-ff7b-4acc-8857-f463ec1bc195-config-data-custom\") pod \"barbican-api-7c4fd9778-zrzw7\" (UID: \"3e0b062d-ff7b-4acc-8857-f463ec1bc195\") " pod="openstack/barbican-api-7c4fd9778-zrzw7" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.178862 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3e0b062d-ff7b-4acc-8857-f463ec1bc195-public-tls-certs\") pod \"barbican-api-7c4fd9778-zrzw7\" (UID: \"3e0b062d-ff7b-4acc-8857-f463ec1bc195\") " pod="openstack/barbican-api-7c4fd9778-zrzw7" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.178929 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e0b062d-ff7b-4acc-8857-f463ec1bc195-config-data\") pod \"barbican-api-7c4fd9778-zrzw7\" (UID: \"3e0b062d-ff7b-4acc-8857-f463ec1bc195\") " pod="openstack/barbican-api-7c4fd9778-zrzw7" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.178976 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mzrd\" (UniqueName: \"kubernetes.io/projected/3e0b062d-ff7b-4acc-8857-f463ec1bc195-kube-api-access-6mzrd\") pod \"barbican-api-7c4fd9778-zrzw7\" (UID: \"3e0b062d-ff7b-4acc-8857-f463ec1bc195\") " pod="openstack/barbican-api-7c4fd9778-zrzw7" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.179085 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74lwt\" (UniqueName: \"kubernetes.io/projected/c11e3407-a026-4236-97b2-e2afbcd50035-kube-api-access-74lwt\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.179099 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c11e3407-a026-4236-97b2-e2afbcd50035-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.179109 4954 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c11e3407-a026-4236-97b2-e2afbcd50035-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.179122 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c11e3407-a026-4236-97b2-e2afbcd50035-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.179131 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c11e3407-a026-4236-97b2-e2afbcd50035-logs\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.179099 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e0b062d-ff7b-4acc-8857-f463ec1bc195-logs\") pod \"barbican-api-7c4fd9778-zrzw7\" (UID: \"3e0b062d-ff7b-4acc-8857-f463ec1bc195\") " pod="openstack/barbican-api-7c4fd9778-zrzw7" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.184253 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e0b062d-ff7b-4acc-8857-f463ec1bc195-internal-tls-certs\") pod \"barbican-api-7c4fd9778-zrzw7\" (UID: \"3e0b062d-ff7b-4acc-8857-f463ec1bc195\") " pod="openstack/barbican-api-7c4fd9778-zrzw7" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.186759 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e0b062d-ff7b-4acc-8857-f463ec1bc195-combined-ca-bundle\") pod \"barbican-api-7c4fd9778-zrzw7\" (UID: \"3e0b062d-ff7b-4acc-8857-f463ec1bc195\") " pod="openstack/barbican-api-7c4fd9778-zrzw7" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.186996 4954 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e0b062d-ff7b-4acc-8857-f463ec1bc195-config-data-custom\") pod \"barbican-api-7c4fd9778-zrzw7\" (UID: \"3e0b062d-ff7b-4acc-8857-f463ec1bc195\") " pod="openstack/barbican-api-7c4fd9778-zrzw7" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.191186 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e0b062d-ff7b-4acc-8857-f463ec1bc195-public-tls-certs\") pod \"barbican-api-7c4fd9778-zrzw7\" (UID: \"3e0b062d-ff7b-4acc-8857-f463ec1bc195\") " pod="openstack/barbican-api-7c4fd9778-zrzw7" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.193080 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e0b062d-ff7b-4acc-8857-f463ec1bc195-config-data\") pod \"barbican-api-7c4fd9778-zrzw7\" (UID: \"3e0b062d-ff7b-4acc-8857-f463ec1bc195\") " pod="openstack/barbican-api-7c4fd9778-zrzw7" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.199999 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-855db5c9c7-gpqq9" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.201016 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mzrd\" (UniqueName: \"kubernetes.io/projected/3e0b062d-ff7b-4acc-8857-f463ec1bc195-kube-api-access-6mzrd\") pod \"barbican-api-7c4fd9778-zrzw7\" (UID: \"3e0b062d-ff7b-4acc-8857-f463ec1bc195\") " pod="openstack/barbican-api-7c4fd9778-zrzw7" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.265042 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7755474f4f-2m4z8" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.281110 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dt87d\" (UniqueName: \"kubernetes.io/projected/685e0c55-4605-4b5b-9d32-89d0e92fe52a-kube-api-access-dt87d\") pod \"685e0c55-4605-4b5b-9d32-89d0e92fe52a\" (UID: \"685e0c55-4605-4b5b-9d32-89d0e92fe52a\") " Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.281204 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/685e0c55-4605-4b5b-9d32-89d0e92fe52a-horizon-secret-key\") pod \"685e0c55-4605-4b5b-9d32-89d0e92fe52a\" (UID: \"685e0c55-4605-4b5b-9d32-89d0e92fe52a\") " Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.281233 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/685e0c55-4605-4b5b-9d32-89d0e92fe52a-logs\") pod \"685e0c55-4605-4b5b-9d32-89d0e92fe52a\" (UID: \"685e0c55-4605-4b5b-9d32-89d0e92fe52a\") " Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.281264 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/685e0c55-4605-4b5b-9d32-89d0e92fe52a-config-data\") pod \"685e0c55-4605-4b5b-9d32-89d0e92fe52a\" (UID: \"685e0c55-4605-4b5b-9d32-89d0e92fe52a\") " Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.281322 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/685e0c55-4605-4b5b-9d32-89d0e92fe52a-scripts\") pod \"685e0c55-4605-4b5b-9d32-89d0e92fe52a\" (UID: 
\"685e0c55-4605-4b5b-9d32-89d0e92fe52a\") " Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.290887 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/685e0c55-4605-4b5b-9d32-89d0e92fe52a-logs" (OuterVolumeSpecName: "logs") pod "685e0c55-4605-4b5b-9d32-89d0e92fe52a" (UID: "685e0c55-4605-4b5b-9d32-89d0e92fe52a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.296065 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/685e0c55-4605-4b5b-9d32-89d0e92fe52a-kube-api-access-dt87d" (OuterVolumeSpecName: "kube-api-access-dt87d") pod "685e0c55-4605-4b5b-9d32-89d0e92fe52a" (UID: "685e0c55-4605-4b5b-9d32-89d0e92fe52a"). InnerVolumeSpecName "kube-api-access-dt87d". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.300849 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/685e0c55-4605-4b5b-9d32-89d0e92fe52a-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "685e0c55-4605-4b5b-9d32-89d0e92fe52a" (UID: "685e0c55-4605-4b5b-9d32-89d0e92fe52a"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.314485 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/685e0c55-4605-4b5b-9d32-89d0e92fe52a-scripts" (OuterVolumeSpecName: "scripts") pod "685e0c55-4605-4b5b-9d32-89d0e92fe52a" (UID: "685e0c55-4605-4b5b-9d32-89d0e92fe52a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.319286 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/685e0c55-4605-4b5b-9d32-89d0e92fe52a-config-data" (OuterVolumeSpecName: "config-data") pod "685e0c55-4605-4b5b-9d32-89d0e92fe52a" (UID: "685e0c55-4605-4b5b-9d32-89d0e92fe52a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.339677 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7c4fd9778-zrzw7" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.360852 4954 generic.go:334] "Generic (PLEG): container finished" podID="685e0c55-4605-4b5b-9d32-89d0e92fe52a" containerID="42b4af116bb7855f59b6f7cd5f94af60a6c4e0e4c80d89405354db59051977a9" exitCode=137 Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.360897 4954 generic.go:334] "Generic (PLEG): container finished" podID="685e0c55-4605-4b5b-9d32-89d0e92fe52a" containerID="4e00fbbe3e64fddcf3145b8df712fac0de2cfddc23e1b49acdc7eeb62b736806" exitCode=137 Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.360920 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-855db5c9c7-gpqq9" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.360977 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-855db5c9c7-gpqq9" event={"ID":"685e0c55-4605-4b5b-9d32-89d0e92fe52a","Type":"ContainerDied","Data":"42b4af116bb7855f59b6f7cd5f94af60a6c4e0e4c80d89405354db59051977a9"} Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.361006 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-855db5c9c7-gpqq9" event={"ID":"685e0c55-4605-4b5b-9d32-89d0e92fe52a","Type":"ContainerDied","Data":"4e00fbbe3e64fddcf3145b8df712fac0de2cfddc23e1b49acdc7eeb62b736806"} Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.361018 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-855db5c9c7-gpqq9" event={"ID":"685e0c55-4605-4b5b-9d32-89d0e92fe52a","Type":"ContainerDied","Data":"11ad4323a5e416ca719b04d6c01019c65e9ffa62165f9484cd91acb38d2f9534"} Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.361035 4954 scope.go:117] "RemoveContainer" containerID="42b4af116bb7855f59b6f7cd5f94af60a6c4e0e4c80d89405354db59051977a9" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.365121 4954 generic.go:334] "Generic (PLEG): container finished" podID="774fb5a2-9809-4297-9ad1-f68e130747bd" containerID="f3a8129ba1e86d9bfbcf2021bbb6169428adbcb40d0ca9cf6b1424bf9e7b4590" exitCode=137 Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.365148 4954 generic.go:334] "Generic (PLEG): container finished" podID="774fb5a2-9809-4297-9ad1-f68e130747bd" containerID="712680d835819aee27a83517fb33a16baafb7282ebdfef9ccc4c446e6be07c5b" exitCode=137 Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.365188 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7755474f4f-2m4z8" event={"ID":"774fb5a2-9809-4297-9ad1-f68e130747bd","Type":"ContainerDied","Data":"f3a8129ba1e86d9bfbcf2021bbb6169428adbcb40d0ca9cf6b1424bf9e7b4590"} Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.365214 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7755474f4f-2m4z8" event={"ID":"774fb5a2-9809-4297-9ad1-f68e130747bd","Type":"ContainerDied","Data":"712680d835819aee27a83517fb33a16baafb7282ebdfef9ccc4c446e6be07c5b"} Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.365239 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7755474f4f-2m4z8" event={"ID":"774fb5a2-9809-4297-9ad1-f68e130747bd","Type":"ContainerDied","Data":"893e4b6b9f0db46d5f301dd77b2142ff779c7899525a862710260fac030a505b"} Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.365296 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7755474f4f-2m4z8" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.376876 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70a1a927-b24a-4da3-93f1-9dc67f75c4ba","Type":"ContainerDied","Data":"aa71c878602ead6424f853ebde04dab25d210b4621eaaccc7043e43ef149d74e"} Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.376967 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.383321 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/774fb5a2-9809-4297-9ad1-f68e130747bd-logs\") pod \"774fb5a2-9809-4297-9ad1-f68e130747bd\" (UID: \"774fb5a2-9809-4297-9ad1-f68e130747bd\") " Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.383535 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/774fb5a2-9809-4297-9ad1-f68e130747bd-horizon-secret-key\") pod \"774fb5a2-9809-4297-9ad1-f68e130747bd\" (UID: \"774fb5a2-9809-4297-9ad1-f68e130747bd\") " Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.383626 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pwdc\" (UniqueName: \"kubernetes.io/projected/774fb5a2-9809-4297-9ad1-f68e130747bd-kube-api-access-8pwdc\") pod \"774fb5a2-9809-4297-9ad1-f68e130747bd\" (UID: \"774fb5a2-9809-4297-9ad1-f68e130747bd\") " Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.383870 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/774fb5a2-9809-4297-9ad1-f68e130747bd-config-data\") pod \"774fb5a2-9809-4297-9ad1-f68e130747bd\" (UID: \"774fb5a2-9809-4297-9ad1-f68e130747bd\") " Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.383935 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/774fb5a2-9809-4297-9ad1-f68e130747bd-scripts\") pod \"774fb5a2-9809-4297-9ad1-f68e130747bd\" (UID: \"774fb5a2-9809-4297-9ad1-f68e130747bd\") " Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.384286 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/685e0c55-4605-4b5b-9d32-89d0e92fe52a-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.384297 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dt87d\" (UniqueName: \"kubernetes.io/projected/685e0c55-4605-4b5b-9d32-89d0e92fe52a-kube-api-access-dt87d\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.384307 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/685e0c55-4605-4b5b-9d32-89d0e92fe52a-logs\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.384316 4954 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/685e0c55-4605-4b5b-9d32-89d0e92fe52a-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.384326 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/685e0c55-4605-4b5b-9d32-89d0e92fe52a-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.386572 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/774fb5a2-9809-4297-9ad1-f68e130747bd-logs" (OuterVolumeSpecName: "logs") pod "774fb5a2-9809-4297-9ad1-f68e130747bd" (UID: "774fb5a2-9809-4297-9ad1-f68e130747bd"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.390169 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/774fb5a2-9809-4297-9ad1-f68e130747bd-kube-api-access-8pwdc" (OuterVolumeSpecName: "kube-api-access-8pwdc") pod "774fb5a2-9809-4297-9ad1-f68e130747bd" (UID: "774fb5a2-9809-4297-9ad1-f68e130747bd"). InnerVolumeSpecName "kube-api-access-8pwdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.411163 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/774fb5a2-9809-4297-9ad1-f68e130747bd-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "774fb5a2-9809-4297-9ad1-f68e130747bd" (UID: "774fb5a2-9809-4297-9ad1-f68e130747bd"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.414450 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/774fb5a2-9809-4297-9ad1-f68e130747bd-scripts" (OuterVolumeSpecName: "scripts") pod "774fb5a2-9809-4297-9ad1-f68e130747bd" (UID: "774fb5a2-9809-4297-9ad1-f68e130747bd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.429474 4954 generic.go:334] "Generic (PLEG): container finished" podID="c11e3407-a026-4236-97b2-e2afbcd50035" containerID="eab4355348599e86835e5067f14f768d3df623dd5f12f4c2410564ae15ea6da0" exitCode=137 Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.429516 4954 generic.go:334] "Generic (PLEG): container finished" podID="c11e3407-a026-4236-97b2-e2afbcd50035" containerID="5df76f4334bd357eb6cebc103462a801082c650d9b107a62dfe11a9c72d9b5f1" exitCode=137 Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.429903 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d65d5b797-gbgfp" event={"ID":"c11e3407-a026-4236-97b2-e2afbcd50035","Type":"ContainerDied","Data":"eab4355348599e86835e5067f14f768d3df623dd5f12f4c2410564ae15ea6da0"} Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.430149 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d65d5b797-gbgfp" event={"ID":"c11e3407-a026-4236-97b2-e2afbcd50035","Type":"ContainerDied","Data":"5df76f4334bd357eb6cebc103462a801082c650d9b107a62dfe11a9c72d9b5f1"} Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.430266 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d65d5b797-gbgfp" event={"ID":"c11e3407-a026-4236-97b2-e2afbcd50035","Type":"ContainerDied","Data":"176e892ba92bc2b4935c93bac62e5c457994b02c4c2c742e0a0a4e60e3da3400"} Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.430322 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6d65d5b797-gbgfp" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.431594 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/774fb5a2-9809-4297-9ad1-f68e130747bd-config-data" (OuterVolumeSpecName: "config-data") pod "774fb5a2-9809-4297-9ad1-f68e130747bd" (UID: "774fb5a2-9809-4297-9ad1-f68e130747bd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.493141 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-855db5c9c7-gpqq9"] Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.494658 4954 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/774fb5a2-9809-4297-9ad1-f68e130747bd-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.494714 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pwdc\" (UniqueName: \"kubernetes.io/projected/774fb5a2-9809-4297-9ad1-f68e130747bd-kube-api-access-8pwdc\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.494733 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/774fb5a2-9809-4297-9ad1-f68e130747bd-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.494745 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/774fb5a2-9809-4297-9ad1-f68e130747bd-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.494757 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/774fb5a2-9809-4297-9ad1-f68e130747bd-logs\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.543633 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-855db5c9c7-gpqq9"] Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.628753 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.639226 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.647887 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6d65d5b797-gbgfp"] Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.656316 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6d65d5b797-gbgfp"] Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.665501 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 27 17:00:13 crc kubenswrapper[4954]: E1127 17:00:13.666328 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c11e3407-a026-4236-97b2-e2afbcd50035" containerName="horizon" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.666350 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c11e3407-a026-4236-97b2-e2afbcd50035" containerName="horizon" Nov 27 17:00:13 crc kubenswrapper[4954]: E1127 17:00:13.666369 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="774fb5a2-9809-4297-9ad1-f68e130747bd" containerName="horizon" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.666379 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="774fb5a2-9809-4297-9ad1-f68e130747bd" containerName="horizon" Nov 27 17:00:13 crc kubenswrapper[4954]: E1127 17:00:13.666394 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="685e0c55-4605-4b5b-9d32-89d0e92fe52a" containerName="horizon" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.666401 4954 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="685e0c55-4605-4b5b-9d32-89d0e92fe52a" containerName="horizon" Nov 27 17:00:13 crc kubenswrapper[4954]: E1127 17:00:13.666416 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="685e0c55-4605-4b5b-9d32-89d0e92fe52a" containerName="horizon-log" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.666424 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="685e0c55-4605-4b5b-9d32-89d0e92fe52a" containerName="horizon-log" Nov 27 17:00:13 crc kubenswrapper[4954]: E1127 17:00:13.666452 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c11e3407-a026-4236-97b2-e2afbcd50035" containerName="horizon-log" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.666459 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c11e3407-a026-4236-97b2-e2afbcd50035" containerName="horizon-log" Nov 27 17:00:13 crc kubenswrapper[4954]: E1127 17:00:13.666480 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="774fb5a2-9809-4297-9ad1-f68e130747bd" containerName="horizon-log" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.666489 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="774fb5a2-9809-4297-9ad1-f68e130747bd" containerName="horizon-log" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.666729 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="774fb5a2-9809-4297-9ad1-f68e130747bd" containerName="horizon" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.666749 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="685e0c55-4605-4b5b-9d32-89d0e92fe52a" containerName="horizon-log" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.666767 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="685e0c55-4605-4b5b-9d32-89d0e92fe52a" containerName="horizon" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.666794 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="c11e3407-a026-4236-97b2-e2afbcd50035" containerName="horizon-log" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.666808 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="c11e3407-a026-4236-97b2-e2afbcd50035" containerName="horizon" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.666817 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="774fb5a2-9809-4297-9ad1-f68e130747bd" containerName="horizon-log" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.669227 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.672883 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.673998 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.675987 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.693268 4954 scope.go:117] "RemoveContainer" containerID="4e00fbbe3e64fddcf3145b8df712fac0de2cfddc23e1b49acdc7eeb62b736806" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.788530 4954 scope.go:117] "RemoveContainer" containerID="42b4af116bb7855f59b6f7cd5f94af60a6c4e0e4c80d89405354db59051977a9" Nov 27 17:00:13 crc kubenswrapper[4954]: E1127 17:00:13.791939 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42b4af116bb7855f59b6f7cd5f94af60a6c4e0e4c80d89405354db59051977a9\": container with ID starting with 42b4af116bb7855f59b6f7cd5f94af60a6c4e0e4c80d89405354db59051977a9 not found: ID does not exist" containerID="42b4af116bb7855f59b6f7cd5f94af60a6c4e0e4c80d89405354db59051977a9" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.791965 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42b4af116bb7855f59b6f7cd5f94af60a6c4e0e4c80d89405354db59051977a9"} err="failed to get container status \"42b4af116bb7855f59b6f7cd5f94af60a6c4e0e4c80d89405354db59051977a9\": rpc error: code = NotFound desc = could not find container \"42b4af116bb7855f59b6f7cd5f94af60a6c4e0e4c80d89405354db59051977a9\": container with ID starting with 42b4af116bb7855f59b6f7cd5f94af60a6c4e0e4c80d89405354db59051977a9 not found: ID does not exist" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.791988 4954 scope.go:117] "RemoveContainer" containerID="4e00fbbe3e64fddcf3145b8df712fac0de2cfddc23e1b49acdc7eeb62b736806" Nov 27 17:00:13 crc kubenswrapper[4954]: E1127 17:00:13.792423 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e00fbbe3e64fddcf3145b8df712fac0de2cfddc23e1b49acdc7eeb62b736806\": container with ID starting with 4e00fbbe3e64fddcf3145b8df712fac0de2cfddc23e1b49acdc7eeb62b736806 not found: ID does not exist" containerID="4e00fbbe3e64fddcf3145b8df712fac0de2cfddc23e1b49acdc7eeb62b736806" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.792445 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e00fbbe3e64fddcf3145b8df712fac0de2cfddc23e1b49acdc7eeb62b736806"} err="failed to get container status \"4e00fbbe3e64fddcf3145b8df712fac0de2cfddc23e1b49acdc7eeb62b736806\": rpc error: code = NotFound desc = could not find container \"4e00fbbe3e64fddcf3145b8df712fac0de2cfddc23e1b49acdc7eeb62b736806\": container with ID starting with 4e00fbbe3e64fddcf3145b8df712fac0de2cfddc23e1b49acdc7eeb62b736806 not found: ID does not exist" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.792458 4954 scope.go:117] "RemoveContainer" containerID="42b4af116bb7855f59b6f7cd5f94af60a6c4e0e4c80d89405354db59051977a9" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.801741 4954 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"42b4af116bb7855f59b6f7cd5f94af60a6c4e0e4c80d89405354db59051977a9"} err="failed to get container status \"42b4af116bb7855f59b6f7cd5f94af60a6c4e0e4c80d89405354db59051977a9\": rpc error: code = NotFound desc = could not find container \"42b4af116bb7855f59b6f7cd5f94af60a6c4e0e4c80d89405354db59051977a9\": container with ID starting with 42b4af116bb7855f59b6f7cd5f94af60a6c4e0e4c80d89405354db59051977a9 not found: ID does not exist" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.801813 4954 scope.go:117] "RemoveContainer" containerID="4e00fbbe3e64fddcf3145b8df712fac0de2cfddc23e1b49acdc7eeb62b736806" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.802982 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17240c1b-4f70-4182-9b68-dac293e719ef-scripts\") pod \"ceilometer-0\" (UID: \"17240c1b-4f70-4182-9b68-dac293e719ef\") " pod="openstack/ceilometer-0" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.803020 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkhhh\" (UniqueName: \"kubernetes.io/projected/17240c1b-4f70-4182-9b68-dac293e719ef-kube-api-access-kkhhh\") pod \"ceilometer-0\" (UID: \"17240c1b-4f70-4182-9b68-dac293e719ef\") " pod="openstack/ceilometer-0" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.803051 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17240c1b-4f70-4182-9b68-dac293e719ef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"17240c1b-4f70-4182-9b68-dac293e719ef\") " pod="openstack/ceilometer-0" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.803112 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/17240c1b-4f70-4182-9b68-dac293e719ef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"17240c1b-4f70-4182-9b68-dac293e719ef\") " pod="openstack/ceilometer-0" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.803182 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17240c1b-4f70-4182-9b68-dac293e719ef-log-httpd\") pod \"ceilometer-0\" (UID: \"17240c1b-4f70-4182-9b68-dac293e719ef\") " pod="openstack/ceilometer-0" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.803218 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17240c1b-4f70-4182-9b68-dac293e719ef-config-data\") pod \"ceilometer-0\" (UID: \"17240c1b-4f70-4182-9b68-dac293e719ef\") " pod="openstack/ceilometer-0" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.803236 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17240c1b-4f70-4182-9b68-dac293e719ef-run-httpd\") pod \"ceilometer-0\" (UID: \"17240c1b-4f70-4182-9b68-dac293e719ef\") " pod="openstack/ceilometer-0" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.807260 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e00fbbe3e64fddcf3145b8df712fac0de2cfddc23e1b49acdc7eeb62b736806"} err="failed to get container status 
\"4e00fbbe3e64fddcf3145b8df712fac0de2cfddc23e1b49acdc7eeb62b736806\": rpc error: code = NotFound desc = could not find container \"4e00fbbe3e64fddcf3145b8df712fac0de2cfddc23e1b49acdc7eeb62b736806\": container with ID starting with 4e00fbbe3e64fddcf3145b8df712fac0de2cfddc23e1b49acdc7eeb62b736806 not found: ID does not exist" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.807311 4954 scope.go:117] "RemoveContainer" containerID="f3a8129ba1e86d9bfbcf2021bbb6169428adbcb40d0ca9cf6b1424bf9e7b4590" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.833633 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7755474f4f-2m4z8"] Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.843879 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7755474f4f-2m4z8"] Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.905290 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17240c1b-4f70-4182-9b68-dac293e719ef-log-httpd\") pod \"ceilometer-0\" (UID: \"17240c1b-4f70-4182-9b68-dac293e719ef\") " pod="openstack/ceilometer-0" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.905353 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17240c1b-4f70-4182-9b68-dac293e719ef-config-data\") pod \"ceilometer-0\" (UID: \"17240c1b-4f70-4182-9b68-dac293e719ef\") " pod="openstack/ceilometer-0" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.905371 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17240c1b-4f70-4182-9b68-dac293e719ef-run-httpd\") pod \"ceilometer-0\" (UID: \"17240c1b-4f70-4182-9b68-dac293e719ef\") " pod="openstack/ceilometer-0" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.905411 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17240c1b-4f70-4182-9b68-dac293e719ef-scripts\") pod \"ceilometer-0\" (UID: \"17240c1b-4f70-4182-9b68-dac293e719ef\") " pod="openstack/ceilometer-0" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.905435 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkhhh\" (UniqueName: \"kubernetes.io/projected/17240c1b-4f70-4182-9b68-dac293e719ef-kube-api-access-kkhhh\") pod \"ceilometer-0\" (UID: \"17240c1b-4f70-4182-9b68-dac293e719ef\") " pod="openstack/ceilometer-0" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.905461 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17240c1b-4f70-4182-9b68-dac293e719ef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"17240c1b-4f70-4182-9b68-dac293e719ef\") " pod="openstack/ceilometer-0" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.905504 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/17240c1b-4f70-4182-9b68-dac293e719ef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"17240c1b-4f70-4182-9b68-dac293e719ef\") " pod="openstack/ceilometer-0" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.905944 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17240c1b-4f70-4182-9b68-dac293e719ef-run-httpd\") pod 
\"ceilometer-0\" (UID: \"17240c1b-4f70-4182-9b68-dac293e719ef\") " pod="openstack/ceilometer-0" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.906281 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17240c1b-4f70-4182-9b68-dac293e719ef-log-httpd\") pod \"ceilometer-0\" (UID: \"17240c1b-4f70-4182-9b68-dac293e719ef\") " pod="openstack/ceilometer-0" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.912281 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17240c1b-4f70-4182-9b68-dac293e719ef-config-data\") pod \"ceilometer-0\" (UID: \"17240c1b-4f70-4182-9b68-dac293e719ef\") " pod="openstack/ceilometer-0" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.914040 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/17240c1b-4f70-4182-9b68-dac293e719ef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"17240c1b-4f70-4182-9b68-dac293e719ef\") " pod="openstack/ceilometer-0" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.917164 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17240c1b-4f70-4182-9b68-dac293e719ef-scripts\") pod \"ceilometer-0\" (UID: \"17240c1b-4f70-4182-9b68-dac293e719ef\") " pod="openstack/ceilometer-0" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.917391 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17240c1b-4f70-4182-9b68-dac293e719ef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"17240c1b-4f70-4182-9b68-dac293e719ef\") " pod="openstack/ceilometer-0" Nov 27 17:00:13 crc kubenswrapper[4954]: I1127 17:00:13.934289 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkhhh\" (UniqueName: \"kubernetes.io/projected/17240c1b-4f70-4182-9b68-dac293e719ef-kube-api-access-kkhhh\") pod \"ceilometer-0\" (UID: \"17240c1b-4f70-4182-9b68-dac293e719ef\") " pod="openstack/ceilometer-0" Nov 27 17:00:14 crc kubenswrapper[4954]: I1127 17:00:14.007597 4954 scope.go:117] "RemoveContainer" containerID="712680d835819aee27a83517fb33a16baafb7282ebdfef9ccc4c446e6be07c5b" Nov 27 17:00:14 crc kubenswrapper[4954]: I1127 17:00:14.029876 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7c4fd9778-zrzw7"] Nov 27 17:00:14 crc kubenswrapper[4954]: I1127 17:00:14.041806 4954 scope.go:117] "RemoveContainer" containerID="f3a8129ba1e86d9bfbcf2021bbb6169428adbcb40d0ca9cf6b1424bf9e7b4590" Nov 27 17:00:14 crc kubenswrapper[4954]: E1127 17:00:14.049776 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3a8129ba1e86d9bfbcf2021bbb6169428adbcb40d0ca9cf6b1424bf9e7b4590\": container with ID starting with f3a8129ba1e86d9bfbcf2021bbb6169428adbcb40d0ca9cf6b1424bf9e7b4590 not found: ID does not exist" containerID="f3a8129ba1e86d9bfbcf2021bbb6169428adbcb40d0ca9cf6b1424bf9e7b4590" Nov 27 17:00:14 crc kubenswrapper[4954]: I1127 17:00:14.049819 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3a8129ba1e86d9bfbcf2021bbb6169428adbcb40d0ca9cf6b1424bf9e7b4590"} err="failed to get container status \"f3a8129ba1e86d9bfbcf2021bbb6169428adbcb40d0ca9cf6b1424bf9e7b4590\": rpc error: code = NotFound desc = could not find 
container \"f3a8129ba1e86d9bfbcf2021bbb6169428adbcb40d0ca9cf6b1424bf9e7b4590\": container with ID starting with f3a8129ba1e86d9bfbcf2021bbb6169428adbcb40d0ca9cf6b1424bf9e7b4590 not found: ID does not exist" Nov 27 17:00:14 crc kubenswrapper[4954]: I1127 17:00:14.049842 4954 scope.go:117] "RemoveContainer" containerID="712680d835819aee27a83517fb33a16baafb7282ebdfef9ccc4c446e6be07c5b" Nov 27 17:00:14 crc kubenswrapper[4954]: E1127 17:00:14.050163 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"712680d835819aee27a83517fb33a16baafb7282ebdfef9ccc4c446e6be07c5b\": container with ID starting with 712680d835819aee27a83517fb33a16baafb7282ebdfef9ccc4c446e6be07c5b not found: ID does not exist" containerID="712680d835819aee27a83517fb33a16baafb7282ebdfef9ccc4c446e6be07c5b" Nov 27 17:00:14 crc kubenswrapper[4954]: I1127 17:00:14.050205 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"712680d835819aee27a83517fb33a16baafb7282ebdfef9ccc4c446e6be07c5b"} err="failed to get container status \"712680d835819aee27a83517fb33a16baafb7282ebdfef9ccc4c446e6be07c5b\": rpc error: code = NotFound desc = could not find container \"712680d835819aee27a83517fb33a16baafb7282ebdfef9ccc4c446e6be07c5b\": container with ID starting with 712680d835819aee27a83517fb33a16baafb7282ebdfef9ccc4c446e6be07c5b not found: ID does not exist" Nov 27 17:00:14 crc kubenswrapper[4954]: I1127 17:00:14.050235 4954 scope.go:117] "RemoveContainer" containerID="f3a8129ba1e86d9bfbcf2021bbb6169428adbcb40d0ca9cf6b1424bf9e7b4590" Nov 27 17:00:14 crc kubenswrapper[4954]: I1127 17:00:14.050828 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3a8129ba1e86d9bfbcf2021bbb6169428adbcb40d0ca9cf6b1424bf9e7b4590"} err="failed to get container status \"f3a8129ba1e86d9bfbcf2021bbb6169428adbcb40d0ca9cf6b1424bf9e7b4590\": rpc error: code = NotFound desc = could not find container \"f3a8129ba1e86d9bfbcf2021bbb6169428adbcb40d0ca9cf6b1424bf9e7b4590\": container with ID starting with f3a8129ba1e86d9bfbcf2021bbb6169428adbcb40d0ca9cf6b1424bf9e7b4590 not found: ID does not exist" Nov 27 17:00:14 crc kubenswrapper[4954]: I1127 17:00:14.050871 4954 scope.go:117] "RemoveContainer" containerID="712680d835819aee27a83517fb33a16baafb7282ebdfef9ccc4c446e6be07c5b" Nov 27 17:00:14 crc kubenswrapper[4954]: I1127 17:00:14.051708 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"712680d835819aee27a83517fb33a16baafb7282ebdfef9ccc4c446e6be07c5b"} err="failed to get container status \"712680d835819aee27a83517fb33a16baafb7282ebdfef9ccc4c446e6be07c5b\": rpc error: code = NotFound desc = could not find container \"712680d835819aee27a83517fb33a16baafb7282ebdfef9ccc4c446e6be07c5b\": container with ID starting with 712680d835819aee27a83517fb33a16baafb7282ebdfef9ccc4c446e6be07c5b not found: ID does not exist" Nov 27 17:00:14 crc kubenswrapper[4954]: I1127 17:00:14.051733 4954 scope.go:117] "RemoveContainer" containerID="224ba60a7f32d5d7be45bce3c0f5bb8c45baa15970608878e71b56db7020c35f" Nov 27 17:00:14 crc kubenswrapper[4954]: I1127 17:00:14.102532 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 27 17:00:14 crc kubenswrapper[4954]: I1127 17:00:14.111886 4954 scope.go:117] "RemoveContainer" containerID="2f08b7ce7474b7e23ee72ce58fe803c1638962b91e1c59b50e005beb8b358208" Nov 27 17:00:14 crc kubenswrapper[4954]: I1127 17:00:14.201076 4954 scope.go:117] "RemoveContainer" containerID="633d8889f9cefc99558e758667563a1550d81c6fe739e5cc9b4b8be68d4f31c9" Nov 27 17:00:14 crc kubenswrapper[4954]: I1127 17:00:14.293034 4954 scope.go:117] "RemoveContainer" containerID="eab4355348599e86835e5067f14f768d3df623dd5f12f4c2410564ae15ea6da0" Nov 27 17:00:14 crc kubenswrapper[4954]: I1127 17:00:14.441844 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c4fd9778-zrzw7" event={"ID":"3e0b062d-ff7b-4acc-8857-f463ec1bc195","Type":"ContainerStarted","Data":"33c6453d5b41407ae73775ad1d5ddae64ba8717abefd985839aabac9138feb7a"} Nov 27 17:00:14 crc kubenswrapper[4954]: I1127 17:00:14.442203 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c4fd9778-zrzw7" event={"ID":"3e0b062d-ff7b-4acc-8857-f463ec1bc195","Type":"ContainerStarted","Data":"090c819657a09492f75927947375e215af3b43ac2efbb01f955c151a54bfb67b"} Nov 27 17:00:14 crc kubenswrapper[4954]: I1127 17:00:14.595384 4954 scope.go:117] "RemoveContainer" containerID="5df76f4334bd357eb6cebc103462a801082c650d9b107a62dfe11a9c72d9b5f1" Nov 27 17:00:14 crc kubenswrapper[4954]: I1127 17:00:14.621024 4954 scope.go:117] "RemoveContainer" containerID="eab4355348599e86835e5067f14f768d3df623dd5f12f4c2410564ae15ea6da0" Nov 27 17:00:14 crc kubenswrapper[4954]: E1127 17:00:14.622023 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eab4355348599e86835e5067f14f768d3df623dd5f12f4c2410564ae15ea6da0\": container with ID starting with eab4355348599e86835e5067f14f768d3df623dd5f12f4c2410564ae15ea6da0 not found: ID does not exist" containerID="eab4355348599e86835e5067f14f768d3df623dd5f12f4c2410564ae15ea6da0" Nov 27 17:00:14 crc kubenswrapper[4954]: I1127 17:00:14.622077 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eab4355348599e86835e5067f14f768d3df623dd5f12f4c2410564ae15ea6da0"} err="failed to get container status \"eab4355348599e86835e5067f14f768d3df623dd5f12f4c2410564ae15ea6da0\": rpc error: code = NotFound desc = could not find container \"eab4355348599e86835e5067f14f768d3df623dd5f12f4c2410564ae15ea6da0\": container with ID starting with eab4355348599e86835e5067f14f768d3df623dd5f12f4c2410564ae15ea6da0 not found: ID does not exist" Nov 27 17:00:14 crc kubenswrapper[4954]: I1127 17:00:14.622111 4954 scope.go:117] "RemoveContainer" containerID="5df76f4334bd357eb6cebc103462a801082c650d9b107a62dfe11a9c72d9b5f1" Nov 27 17:00:14 crc kubenswrapper[4954]: E1127 17:00:14.625267 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5df76f4334bd357eb6cebc103462a801082c650d9b107a62dfe11a9c72d9b5f1\": container with ID starting with 5df76f4334bd357eb6cebc103462a801082c650d9b107a62dfe11a9c72d9b5f1 not found: ID does not exist" containerID="5df76f4334bd357eb6cebc103462a801082c650d9b107a62dfe11a9c72d9b5f1" Nov 27 17:00:14 crc kubenswrapper[4954]: I1127 17:00:14.625294 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5df76f4334bd357eb6cebc103462a801082c650d9b107a62dfe11a9c72d9b5f1"} err="failed to get container status 
\"5df76f4334bd357eb6cebc103462a801082c650d9b107a62dfe11a9c72d9b5f1\": rpc error: code = NotFound desc = could not find container \"5df76f4334bd357eb6cebc103462a801082c650d9b107a62dfe11a9c72d9b5f1\": container with ID starting with 5df76f4334bd357eb6cebc103462a801082c650d9b107a62dfe11a9c72d9b5f1 not found: ID does not exist" Nov 27 17:00:14 crc kubenswrapper[4954]: I1127 17:00:14.625308 4954 scope.go:117] "RemoveContainer" containerID="eab4355348599e86835e5067f14f768d3df623dd5f12f4c2410564ae15ea6da0" Nov 27 17:00:14 crc kubenswrapper[4954]: I1127 17:00:14.625776 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eab4355348599e86835e5067f14f768d3df623dd5f12f4c2410564ae15ea6da0"} err="failed to get container status \"eab4355348599e86835e5067f14f768d3df623dd5f12f4c2410564ae15ea6da0\": rpc error: code = NotFound desc = could not find container \"eab4355348599e86835e5067f14f768d3df623dd5f12f4c2410564ae15ea6da0\": container with ID starting with eab4355348599e86835e5067f14f768d3df623dd5f12f4c2410564ae15ea6da0 not found: ID does not exist" Nov 27 17:00:14 crc kubenswrapper[4954]: I1127 17:00:14.625827 4954 scope.go:117] "RemoveContainer" containerID="5df76f4334bd357eb6cebc103462a801082c650d9b107a62dfe11a9c72d9b5f1" Nov 27 17:00:14 crc kubenswrapper[4954]: I1127 17:00:14.626436 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5df76f4334bd357eb6cebc103462a801082c650d9b107a62dfe11a9c72d9b5f1"} err="failed to get container status \"5df76f4334bd357eb6cebc103462a801082c650d9b107a62dfe11a9c72d9b5f1\": rpc error: code = NotFound desc = could not find container \"5df76f4334bd357eb6cebc103462a801082c650d9b107a62dfe11a9c72d9b5f1\": container with ID starting with 5df76f4334bd357eb6cebc103462a801082c650d9b107a62dfe11a9c72d9b5f1 not found: ID does not exist" Nov 27 17:00:14 crc kubenswrapper[4954]: I1127 17:00:14.645282 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 27 17:00:14 crc kubenswrapper[4954]: W1127 17:00:14.649154 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17240c1b_4f70_4182_9b68_dac293e719ef.slice/crio-13dfcd5d65844e7994519832b449846122a5dd2b06008a3efc635205dc2e0612 WatchSource:0}: Error finding container 13dfcd5d65844e7994519832b449846122a5dd2b06008a3efc635205dc2e0612: Status 404 returned error can't find the container with id 13dfcd5d65844e7994519832b449846122a5dd2b06008a3efc635205dc2e0612 Nov 27 17:00:14 crc kubenswrapper[4954]: I1127 17:00:14.692206 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="685e0c55-4605-4b5b-9d32-89d0e92fe52a" path="/var/lib/kubelet/pods/685e0c55-4605-4b5b-9d32-89d0e92fe52a/volumes" Nov 27 17:00:14 crc kubenswrapper[4954]: I1127 17:00:14.693352 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70a1a927-b24a-4da3-93f1-9dc67f75c4ba" path="/var/lib/kubelet/pods/70a1a927-b24a-4da3-93f1-9dc67f75c4ba/volumes" Nov 27 17:00:14 crc kubenswrapper[4954]: I1127 17:00:14.697393 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="774fb5a2-9809-4297-9ad1-f68e130747bd" path="/var/lib/kubelet/pods/774fb5a2-9809-4297-9ad1-f68e130747bd/volumes" Nov 27 17:00:14 crc kubenswrapper[4954]: I1127 17:00:14.698372 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c11e3407-a026-4236-97b2-e2afbcd50035" 
path="/var/lib/kubelet/pods/c11e3407-a026-4236-97b2-e2afbcd50035/volumes" Nov 27 17:00:14 crc kubenswrapper[4954]: I1127 17:00:14.849340 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5657c85556-sq27w" Nov 27 17:00:15 crc kubenswrapper[4954]: I1127 17:00:15.278748 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6549c6cdd4-szxmh" Nov 27 17:00:15 crc kubenswrapper[4954]: I1127 17:00:15.358261 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-b5c6d8894-l7bzv" Nov 27 17:00:15 crc kubenswrapper[4954]: I1127 17:00:15.457030 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17240c1b-4f70-4182-9b68-dac293e719ef","Type":"ContainerStarted","Data":"13dfcd5d65844e7994519832b449846122a5dd2b06008a3efc635205dc2e0612"} Nov 27 17:00:15 crc kubenswrapper[4954]: I1127 17:00:15.459521 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c4fd9778-zrzw7" event={"ID":"3e0b062d-ff7b-4acc-8857-f463ec1bc195","Type":"ContainerStarted","Data":"03d99f486a3edd779bfae2d6b691e6a6ae4499836cdc6b4d888a31f33d8aa1cc"} Nov 27 17:00:15 crc kubenswrapper[4954]: I1127 17:00:15.459727 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7c4fd9778-zrzw7" Nov 27 17:00:16 crc kubenswrapper[4954]: I1127 17:00:16.339430 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Nov 27 17:00:16 crc kubenswrapper[4954]: I1127 17:00:16.474715 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17240c1b-4f70-4182-9b68-dac293e719ef","Type":"ContainerStarted","Data":"2967004ba4d0c484dac64a4095d2441b118048a9ca2019e6d66ce83a98affb2c"} Nov 27 17:00:16 crc kubenswrapper[4954]: I1127 17:00:16.475910 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7c4fd9778-zrzw7" Nov 27 17:00:16 crc kubenswrapper[4954]: I1127 17:00:16.600516 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Nov 27 17:00:16 crc kubenswrapper[4954]: I1127 17:00:16.629500 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7c4fd9778-zrzw7" podStartSLOduration=4.629472585 podStartE2EDuration="4.629472585s" podCreationTimestamp="2025-11-27 17:00:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 17:00:15.485213922 +0000 UTC m=+1327.502654232" watchObservedRunningTime="2025-11-27 17:00:16.629472585 +0000 UTC m=+1328.646912905" Nov 27 17:00:16 crc kubenswrapper[4954]: I1127 17:00:16.673873 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 27 17:00:17 crc kubenswrapper[4954]: I1127 17:00:17.200924 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-b5c6d8894-l7bzv" Nov 27 17:00:17 crc kubenswrapper[4954]: I1127 17:00:17.237748 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bb4fc677f-bp5cb" Nov 27 17:00:17 crc kubenswrapper[4954]: I1127 17:00:17.310195 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6549c6cdd4-szxmh"] Nov 27 17:00:17 crc kubenswrapper[4954]: I1127 17:00:17.310506 4954 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6549c6cdd4-szxmh" podUID="8a9e455d-383c-460b-897e-2234c0611a83" containerName="horizon-log" containerID="cri-o://4b3a9c94ec8c6148f1f0656db217d02cc6a5f9806343ef93871772f9909f3226" gracePeriod=30 Nov 27 17:00:17 crc kubenswrapper[4954]: I1127 17:00:17.310678 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6549c6cdd4-szxmh" podUID="8a9e455d-383c-460b-897e-2234c0611a83" containerName="horizon" containerID="cri-o://51fc083b73e2dbbfc048368e65a84834c859cc6a3b10dd95d2a2cc01a0184dbe" gracePeriod=30 Nov 27 17:00:17 crc kubenswrapper[4954]: I1127 17:00:17.320283 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6549c6cdd4-szxmh" podUID="8a9e455d-383c-460b-897e-2234c0611a83" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.145:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Nov 27 17:00:17 crc kubenswrapper[4954]: I1127 17:00:17.366247 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-mnxqd"] Nov 27 17:00:17 crc kubenswrapper[4954]: I1127 17:00:17.366537 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc5c4795-mnxqd" podUID="a242acad-7998-4797-930e-9a119e9b0e64" containerName="dnsmasq-dns" containerID="cri-o://a334c38c0087fa3fc6cd017dd11ef8c06ad22c59644c5f3d92de9e0596138ac0" gracePeriod=10 Nov 27 17:00:17 crc kubenswrapper[4954]: I1127 17:00:17.498349 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17240c1b-4f70-4182-9b68-dac293e719ef","Type":"ContainerStarted","Data":"8e96b0218c2672d5d2090beae4454f978aebd79bd9a90af1b6fd7218d671a402"} Nov 27 17:00:17 crc kubenswrapper[4954]: I1127 17:00:17.498600 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="1263a5b8-5b99-4f6b-9a43-72532ef791da" containerName="cinder-scheduler" containerID="cri-o://0a75479b703b924d41d80d3450f177c0402e7d5514657819e3827ea9858e489d" gracePeriod=30 Nov 27 17:00:17 crc kubenswrapper[4954]: I1127 17:00:17.498885 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="1263a5b8-5b99-4f6b-9a43-72532ef791da" containerName="probe" containerID="cri-o://18b284cdef19cf4f8a5e54d8c66817930c009024cb16eaa404be97257b2636bf" gracePeriod=30 Nov 27 17:00:17 crc kubenswrapper[4954]: I1127 17:00:17.623908 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-f6cfb75df-7gbdb" Nov 27 17:00:17 crc kubenswrapper[4954]: I1127 17:00:17.702859 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5657c85556-sq27w"] Nov 27 17:00:17 crc kubenswrapper[4954]: I1127 17:00:17.703133 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5657c85556-sq27w" podUID="5b3b9061-c3c7-43bb-b5bd-cafef342fde0" containerName="neutron-api" containerID="cri-o://48a8afd527e4cec4880cfb003cfaec9aff3c9c8c58342474a1b77e5c21366e88" gracePeriod=30 Nov 27 17:00:17 crc kubenswrapper[4954]: I1127 17:00:17.703672 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5657c85556-sq27w" podUID="5b3b9061-c3c7-43bb-b5bd-cafef342fde0" containerName="neutron-httpd" containerID="cri-o://4393c3ec99187d36baafd3a746d542662ad0eb0e5ceb35b60f2d4a600e291fea" gracePeriod=30 Nov 27 
17:00:17 crc kubenswrapper[4954]: I1127 17:00:17.956959 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-mnxqd" Nov 27 17:00:18 crc kubenswrapper[4954]: I1127 17:00:18.100888 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ts49\" (UniqueName: \"kubernetes.io/projected/a242acad-7998-4797-930e-9a119e9b0e64-kube-api-access-9ts49\") pod \"a242acad-7998-4797-930e-9a119e9b0e64\" (UID: \"a242acad-7998-4797-930e-9a119e9b0e64\") " Nov 27 17:00:18 crc kubenswrapper[4954]: I1127 17:00:18.101014 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a242acad-7998-4797-930e-9a119e9b0e64-ovsdbserver-nb\") pod \"a242acad-7998-4797-930e-9a119e9b0e64\" (UID: \"a242acad-7998-4797-930e-9a119e9b0e64\") " Nov 27 17:00:18 crc kubenswrapper[4954]: I1127 17:00:18.101063 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a242acad-7998-4797-930e-9a119e9b0e64-config\") pod \"a242acad-7998-4797-930e-9a119e9b0e64\" (UID: \"a242acad-7998-4797-930e-9a119e9b0e64\") " Nov 27 17:00:18 crc kubenswrapper[4954]: I1127 17:00:18.101139 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a242acad-7998-4797-930e-9a119e9b0e64-dns-swift-storage-0\") pod \"a242acad-7998-4797-930e-9a119e9b0e64\" (UID: \"a242acad-7998-4797-930e-9a119e9b0e64\") " Nov 27 17:00:18 crc kubenswrapper[4954]: I1127 17:00:18.101212 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a242acad-7998-4797-930e-9a119e9b0e64-dns-svc\") pod \"a242acad-7998-4797-930e-9a119e9b0e64\" (UID: \"a242acad-7998-4797-930e-9a119e9b0e64\") " Nov 27 17:00:18 crc kubenswrapper[4954]: I1127 17:00:18.101269 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a242acad-7998-4797-930e-9a119e9b0e64-ovsdbserver-sb\") pod \"a242acad-7998-4797-930e-9a119e9b0e64\" (UID: \"a242acad-7998-4797-930e-9a119e9b0e64\") " Nov 27 17:00:18 crc kubenswrapper[4954]: I1127 17:00:18.114785 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a242acad-7998-4797-930e-9a119e9b0e64-kube-api-access-9ts49" (OuterVolumeSpecName: "kube-api-access-9ts49") pod "a242acad-7998-4797-930e-9a119e9b0e64" (UID: "a242acad-7998-4797-930e-9a119e9b0e64"). InnerVolumeSpecName "kube-api-access-9ts49". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:00:18 crc kubenswrapper[4954]: I1127 17:00:18.185977 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a242acad-7998-4797-930e-9a119e9b0e64-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a242acad-7998-4797-930e-9a119e9b0e64" (UID: "a242acad-7998-4797-930e-9a119e9b0e64"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:00:18 crc kubenswrapper[4954]: I1127 17:00:18.195133 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a242acad-7998-4797-930e-9a119e9b0e64-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a242acad-7998-4797-930e-9a119e9b0e64" (UID: "a242acad-7998-4797-930e-9a119e9b0e64"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:00:18 crc kubenswrapper[4954]: I1127 17:00:18.195938 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a242acad-7998-4797-930e-9a119e9b0e64-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a242acad-7998-4797-930e-9a119e9b0e64" (UID: "a242acad-7998-4797-930e-9a119e9b0e64"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:00:18 crc kubenswrapper[4954]: I1127 17:00:18.200943 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a242acad-7998-4797-930e-9a119e9b0e64-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a242acad-7998-4797-930e-9a119e9b0e64" (UID: "a242acad-7998-4797-930e-9a119e9b0e64"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:00:18 crc kubenswrapper[4954]: I1127 17:00:18.203852 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a242acad-7998-4797-930e-9a119e9b0e64-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:18 crc kubenswrapper[4954]: I1127 17:00:18.203887 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a242acad-7998-4797-930e-9a119e9b0e64-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:18 crc kubenswrapper[4954]: I1127 17:00:18.203900 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ts49\" (UniqueName: \"kubernetes.io/projected/a242acad-7998-4797-930e-9a119e9b0e64-kube-api-access-9ts49\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:18 crc kubenswrapper[4954]: I1127 17:00:18.203909 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a242acad-7998-4797-930e-9a119e9b0e64-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:18 crc kubenswrapper[4954]: I1127 17:00:18.203918 4954 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a242acad-7998-4797-930e-9a119e9b0e64-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:18 crc kubenswrapper[4954]: I1127 17:00:18.236078 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a242acad-7998-4797-930e-9a119e9b0e64-config" (OuterVolumeSpecName: "config") pod "a242acad-7998-4797-930e-9a119e9b0e64" (UID: "a242acad-7998-4797-930e-9a119e9b0e64"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:00:18 crc kubenswrapper[4954]: I1127 17:00:18.306826 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a242acad-7998-4797-930e-9a119e9b0e64-config\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:18 crc kubenswrapper[4954]: I1127 17:00:18.511848 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17240c1b-4f70-4182-9b68-dac293e719ef","Type":"ContainerStarted","Data":"12df96fa1529840c2542da2f12b4b5888999179ebfe4e624d0c2f43b34579a56"} Nov 27 17:00:18 crc kubenswrapper[4954]: I1127 17:00:18.514420 4954 generic.go:334] "Generic (PLEG): container finished" podID="1263a5b8-5b99-4f6b-9a43-72532ef791da" containerID="18b284cdef19cf4f8a5e54d8c66817930c009024cb16eaa404be97257b2636bf" exitCode=0 Nov 27 17:00:18 crc kubenswrapper[4954]: I1127 17:00:18.514487 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1263a5b8-5b99-4f6b-9a43-72532ef791da","Type":"ContainerDied","Data":"18b284cdef19cf4f8a5e54d8c66817930c009024cb16eaa404be97257b2636bf"} Nov 27 17:00:18 crc kubenswrapper[4954]: I1127 17:00:18.517076 4954 generic.go:334] "Generic (PLEG): container finished" podID="a242acad-7998-4797-930e-9a119e9b0e64" containerID="a334c38c0087fa3fc6cd017dd11ef8c06ad22c59644c5f3d92de9e0596138ac0" exitCode=0 Nov 27 17:00:18 crc kubenswrapper[4954]: I1127 17:00:18.517195 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-mnxqd" Nov 27 17:00:18 crc kubenswrapper[4954]: I1127 17:00:18.517966 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-mnxqd" event={"ID":"a242acad-7998-4797-930e-9a119e9b0e64","Type":"ContainerDied","Data":"a334c38c0087fa3fc6cd017dd11ef8c06ad22c59644c5f3d92de9e0596138ac0"} Nov 27 17:00:18 crc kubenswrapper[4954]: I1127 17:00:18.518012 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-mnxqd" event={"ID":"a242acad-7998-4797-930e-9a119e9b0e64","Type":"ContainerDied","Data":"f85787f4e18e6e7396428fe7d34c41066bba9fea06f15c773646e7e0912a7dff"} Nov 27 17:00:18 crc kubenswrapper[4954]: I1127 17:00:18.518030 4954 scope.go:117] "RemoveContainer" containerID="a334c38c0087fa3fc6cd017dd11ef8c06ad22c59644c5f3d92de9e0596138ac0" Nov 27 17:00:18 crc kubenswrapper[4954]: I1127 17:00:18.521476 4954 generic.go:334] "Generic (PLEG): container finished" podID="5b3b9061-c3c7-43bb-b5bd-cafef342fde0" containerID="4393c3ec99187d36baafd3a746d542662ad0eb0e5ceb35b60f2d4a600e291fea" exitCode=0 Nov 27 17:00:18 crc kubenswrapper[4954]: I1127 17:00:18.521527 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5657c85556-sq27w" event={"ID":"5b3b9061-c3c7-43bb-b5bd-cafef342fde0","Type":"ContainerDied","Data":"4393c3ec99187d36baafd3a746d542662ad0eb0e5ceb35b60f2d4a600e291fea"} Nov 27 17:00:18 crc kubenswrapper[4954]: I1127 17:00:18.547460 4954 scope.go:117] "RemoveContainer" containerID="bbbbd79913829e10336f0c4cded2ede7c078ba0b870ebc8251a7f3cc15844be6" Nov 27 17:00:18 crc kubenswrapper[4954]: I1127 17:00:18.570961 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-mnxqd"] Nov 27 17:00:18 crc kubenswrapper[4954]: I1127 17:00:18.577699 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-mnxqd"] Nov 27 17:00:18 crc kubenswrapper[4954]: I1127 17:00:18.581012 4954 
scope.go:117] "RemoveContainer" containerID="a334c38c0087fa3fc6cd017dd11ef8c06ad22c59644c5f3d92de9e0596138ac0" Nov 27 17:00:18 crc kubenswrapper[4954]: E1127 17:00:18.581559 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a334c38c0087fa3fc6cd017dd11ef8c06ad22c59644c5f3d92de9e0596138ac0\": container with ID starting with a334c38c0087fa3fc6cd017dd11ef8c06ad22c59644c5f3d92de9e0596138ac0 not found: ID does not exist" containerID="a334c38c0087fa3fc6cd017dd11ef8c06ad22c59644c5f3d92de9e0596138ac0" Nov 27 17:00:18 crc kubenswrapper[4954]: I1127 17:00:18.581609 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a334c38c0087fa3fc6cd017dd11ef8c06ad22c59644c5f3d92de9e0596138ac0"} err="failed to get container status \"a334c38c0087fa3fc6cd017dd11ef8c06ad22c59644c5f3d92de9e0596138ac0\": rpc error: code = NotFound desc = could not find container \"a334c38c0087fa3fc6cd017dd11ef8c06ad22c59644c5f3d92de9e0596138ac0\": container with ID starting with a334c38c0087fa3fc6cd017dd11ef8c06ad22c59644c5f3d92de9e0596138ac0 not found: ID does not exist" Nov 27 17:00:18 crc kubenswrapper[4954]: I1127 17:00:18.581634 4954 scope.go:117] "RemoveContainer" containerID="bbbbd79913829e10336f0c4cded2ede7c078ba0b870ebc8251a7f3cc15844be6" Nov 27 17:00:18 crc kubenswrapper[4954]: E1127 17:00:18.582151 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbbbd79913829e10336f0c4cded2ede7c078ba0b870ebc8251a7f3cc15844be6\": container with ID starting with bbbbd79913829e10336f0c4cded2ede7c078ba0b870ebc8251a7f3cc15844be6 not found: ID does not exist" containerID="bbbbd79913829e10336f0c4cded2ede7c078ba0b870ebc8251a7f3cc15844be6" Nov 27 17:00:18 crc kubenswrapper[4954]: I1127 17:00:18.582190 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbbbd79913829e10336f0c4cded2ede7c078ba0b870ebc8251a7f3cc15844be6"} err="failed to get container status \"bbbbd79913829e10336f0c4cded2ede7c078ba0b870ebc8251a7f3cc15844be6\": rpc error: code = NotFound desc = could not find container \"bbbbd79913829e10336f0c4cded2ede7c078ba0b870ebc8251a7f3cc15844be6\": container with ID starting with bbbbd79913829e10336f0c4cded2ede7c078ba0b870ebc8251a7f3cc15844be6 not found: ID does not exist" Nov 27 17:00:18 crc kubenswrapper[4954]: I1127 17:00:18.693412 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a242acad-7998-4797-930e-9a119e9b0e64" path="/var/lib/kubelet/pods/a242acad-7998-4797-930e-9a119e9b0e64/volumes" Nov 27 17:00:18 crc kubenswrapper[4954]: I1127 17:00:18.966570 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7dd865f898-v599c" Nov 27 17:00:19 crc kubenswrapper[4954]: I1127 17:00:19.158664 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Nov 27 17:00:19 crc kubenswrapper[4954]: I1127 17:00:19.420992 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7dd865f898-v599c" Nov 27 17:00:19 crc kubenswrapper[4954]: I1127 17:00:19.570698 4954 generic.go:334] "Generic (PLEG): container finished" podID="1263a5b8-5b99-4f6b-9a43-72532ef791da" containerID="0a75479b703b924d41d80d3450f177c0402e7d5514657819e3827ea9858e489d" exitCode=0 Nov 27 17:00:19 crc kubenswrapper[4954]: I1127 17:00:19.571329 4954 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1263a5b8-5b99-4f6b-9a43-72532ef791da","Type":"ContainerDied","Data":"0a75479b703b924d41d80d3450f177c0402e7d5514657819e3827ea9858e489d"} Nov 27 17:00:19 crc kubenswrapper[4954]: I1127 17:00:19.802254 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 27 17:00:19 crc kubenswrapper[4954]: I1127 17:00:19.949138 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1263a5b8-5b99-4f6b-9a43-72532ef791da-scripts\") pod \"1263a5b8-5b99-4f6b-9a43-72532ef791da\" (UID: \"1263a5b8-5b99-4f6b-9a43-72532ef791da\") " Nov 27 17:00:19 crc kubenswrapper[4954]: I1127 17:00:19.949232 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1263a5b8-5b99-4f6b-9a43-72532ef791da-etc-machine-id\") pod \"1263a5b8-5b99-4f6b-9a43-72532ef791da\" (UID: \"1263a5b8-5b99-4f6b-9a43-72532ef791da\") " Nov 27 17:00:19 crc kubenswrapper[4954]: I1127 17:00:19.949281 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1263a5b8-5b99-4f6b-9a43-72532ef791da-config-data\") pod \"1263a5b8-5b99-4f6b-9a43-72532ef791da\" (UID: \"1263a5b8-5b99-4f6b-9a43-72532ef791da\") " Nov 27 17:00:19 crc kubenswrapper[4954]: I1127 17:00:19.949375 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1263a5b8-5b99-4f6b-9a43-72532ef791da-combined-ca-bundle\") pod \"1263a5b8-5b99-4f6b-9a43-72532ef791da\" (UID: \"1263a5b8-5b99-4f6b-9a43-72532ef791da\") " Nov 27 17:00:19 crc kubenswrapper[4954]: I1127 17:00:19.949403 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkgfj\" (UniqueName: \"kubernetes.io/projected/1263a5b8-5b99-4f6b-9a43-72532ef791da-kube-api-access-rkgfj\") pod \"1263a5b8-5b99-4f6b-9a43-72532ef791da\" (UID: \"1263a5b8-5b99-4f6b-9a43-72532ef791da\") " Nov 27 17:00:19 crc kubenswrapper[4954]: I1127 17:00:19.949515 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1263a5b8-5b99-4f6b-9a43-72532ef791da-config-data-custom\") pod \"1263a5b8-5b99-4f6b-9a43-72532ef791da\" (UID: \"1263a5b8-5b99-4f6b-9a43-72532ef791da\") " Nov 27 17:00:19 crc kubenswrapper[4954]: I1127 17:00:19.955691 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1263a5b8-5b99-4f6b-9a43-72532ef791da-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1263a5b8-5b99-4f6b-9a43-72532ef791da" (UID: "1263a5b8-5b99-4f6b-9a43-72532ef791da"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:00:19 crc kubenswrapper[4954]: I1127 17:00:19.968785 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1263a5b8-5b99-4f6b-9a43-72532ef791da-kube-api-access-rkgfj" (OuterVolumeSpecName: "kube-api-access-rkgfj") pod "1263a5b8-5b99-4f6b-9a43-72532ef791da" (UID: "1263a5b8-5b99-4f6b-9a43-72532ef791da"). InnerVolumeSpecName "kube-api-access-rkgfj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:00:19 crc kubenswrapper[4954]: I1127 17:00:19.968848 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1263a5b8-5b99-4f6b-9a43-72532ef791da-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1263a5b8-5b99-4f6b-9a43-72532ef791da" (UID: "1263a5b8-5b99-4f6b-9a43-72532ef791da"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 17:00:19 crc kubenswrapper[4954]: I1127 17:00:19.973789 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1263a5b8-5b99-4f6b-9a43-72532ef791da-scripts" (OuterVolumeSpecName: "scripts") pod "1263a5b8-5b99-4f6b-9a43-72532ef791da" (UID: "1263a5b8-5b99-4f6b-9a43-72532ef791da"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.039717 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1263a5b8-5b99-4f6b-9a43-72532ef791da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1263a5b8-5b99-4f6b-9a43-72532ef791da" (UID: "1263a5b8-5b99-4f6b-9a43-72532ef791da"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.053807 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1263a5b8-5b99-4f6b-9a43-72532ef791da-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.053844 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkgfj\" (UniqueName: \"kubernetes.io/projected/1263a5b8-5b99-4f6b-9a43-72532ef791da-kube-api-access-rkgfj\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.053858 4954 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1263a5b8-5b99-4f6b-9a43-72532ef791da-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.053866 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1263a5b8-5b99-4f6b-9a43-72532ef791da-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.053875 4954 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1263a5b8-5b99-4f6b-9a43-72532ef791da-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.097692 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1263a5b8-5b99-4f6b-9a43-72532ef791da-config-data" (OuterVolumeSpecName: "config-data") pod "1263a5b8-5b99-4f6b-9a43-72532ef791da" (UID: "1263a5b8-5b99-4f6b-9a43-72532ef791da"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.157162 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1263a5b8-5b99-4f6b-9a43-72532ef791da-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.595075 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17240c1b-4f70-4182-9b68-dac293e719ef","Type":"ContainerStarted","Data":"2fee71cb3e89f8003fca463e9dcf7f53c3d706a8908b14bb805576f60944fe1c"} Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.595369 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.610062 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.610137 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1263a5b8-5b99-4f6b-9a43-72532ef791da","Type":"ContainerDied","Data":"83dacaa0c4c6dda897e4b2f0674cae5576279869032c9e992fc5f19d0ff1bfc3"} Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.610249 4954 scope.go:117] "RemoveContainer" containerID="18b284cdef19cf4f8a5e54d8c66817930c009024cb16eaa404be97257b2636bf" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.636436 4954 generic.go:334] "Generic (PLEG): container finished" podID="5b3b9061-c3c7-43bb-b5bd-cafef342fde0" containerID="48a8afd527e4cec4880cfb003cfaec9aff3c9c8c58342474a1b77e5c21366e88" exitCode=0 Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.636643 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5657c85556-sq27w" event={"ID":"5b3b9061-c3c7-43bb-b5bd-cafef342fde0","Type":"ContainerDied","Data":"48a8afd527e4cec4880cfb003cfaec9aff3c9c8c58342474a1b77e5c21366e88"} Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.636806 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.84179069 podStartE2EDuration="7.636792786s" podCreationTimestamp="2025-11-27 17:00:13 +0000 UTC" firstStartedPulling="2025-11-27 17:00:14.652757319 +0000 UTC m=+1326.670197619" lastFinishedPulling="2025-11-27 17:00:19.447759415 +0000 UTC m=+1331.465199715" observedRunningTime="2025-11-27 17:00:20.627239514 +0000 UTC m=+1332.644679814" watchObservedRunningTime="2025-11-27 17:00:20.636792786 +0000 UTC m=+1332.654233086" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.683632 4954 scope.go:117] "RemoveContainer" containerID="0a75479b703b924d41d80d3450f177c0402e7d5514657819e3827ea9858e489d" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.694842 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.694882 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.698193 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Nov 27 17:00:20 crc kubenswrapper[4954]: E1127 17:00:20.698655 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1263a5b8-5b99-4f6b-9a43-72532ef791da" containerName="cinder-scheduler" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.698670 4954 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="1263a5b8-5b99-4f6b-9a43-72532ef791da" containerName="cinder-scheduler" Nov 27 17:00:20 crc kubenswrapper[4954]: E1127 17:00:20.698974 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a242acad-7998-4797-930e-9a119e9b0e64" containerName="dnsmasq-dns" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.698990 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="a242acad-7998-4797-930e-9a119e9b0e64" containerName="dnsmasq-dns" Nov 27 17:00:20 crc kubenswrapper[4954]: E1127 17:00:20.699007 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a242acad-7998-4797-930e-9a119e9b0e64" containerName="init" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.699017 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="a242acad-7998-4797-930e-9a119e9b0e64" containerName="init" Nov 27 17:00:20 crc kubenswrapper[4954]: E1127 17:00:20.699029 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1263a5b8-5b99-4f6b-9a43-72532ef791da" containerName="probe" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.699036 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="1263a5b8-5b99-4f6b-9a43-72532ef791da" containerName="probe" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.699241 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="1263a5b8-5b99-4f6b-9a43-72532ef791da" containerName="probe" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.704045 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="a242acad-7998-4797-930e-9a119e9b0e64" containerName="dnsmasq-dns" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.704083 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="1263a5b8-5b99-4f6b-9a43-72532ef791da" containerName="cinder-scheduler" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.705517 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.708204 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.710031 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.786476 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6549c6cdd4-szxmh" podUID="8a9e455d-383c-460b-897e-2234c0611a83" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.145:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:42620->10.217.0.145:8443: read: connection reset by peer" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.787324 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5657c85556-sq27w" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.878247 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qblq4\" (UniqueName: \"kubernetes.io/projected/5b3b9061-c3c7-43bb-b5bd-cafef342fde0-kube-api-access-qblq4\") pod \"5b3b9061-c3c7-43bb-b5bd-cafef342fde0\" (UID: \"5b3b9061-c3c7-43bb-b5bd-cafef342fde0\") " Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.878365 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b3b9061-c3c7-43bb-b5bd-cafef342fde0-combined-ca-bundle\") pod \"5b3b9061-c3c7-43bb-b5bd-cafef342fde0\" (UID: \"5b3b9061-c3c7-43bb-b5bd-cafef342fde0\") " Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.879825 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b3b9061-c3c7-43bb-b5bd-cafef342fde0-ovndb-tls-certs\") pod \"5b3b9061-c3c7-43bb-b5bd-cafef342fde0\" (UID: \"5b3b9061-c3c7-43bb-b5bd-cafef342fde0\") " Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.880017 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5b3b9061-c3c7-43bb-b5bd-cafef342fde0-config\") pod \"5b3b9061-c3c7-43bb-b5bd-cafef342fde0\" (UID: \"5b3b9061-c3c7-43bb-b5bd-cafef342fde0\") " Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.880123 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5b3b9061-c3c7-43bb-b5bd-cafef342fde0-httpd-config\") pod \"5b3b9061-c3c7-43bb-b5bd-cafef342fde0\" (UID: \"5b3b9061-c3c7-43bb-b5bd-cafef342fde0\") " Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.880519 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fa6f325f-3f75-4d35-9ffa-3298dc1a936e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fa6f325f-3f75-4d35-9ffa-3298dc1a936e\") " pod="openstack/cinder-scheduler-0" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.880825 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa6f325f-3f75-4d35-9ffa-3298dc1a936e-config-data\") pod \"cinder-scheduler-0\" (UID: \"fa6f325f-3f75-4d35-9ffa-3298dc1a936e\") " pod="openstack/cinder-scheduler-0" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.880898 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fa6f325f-3f75-4d35-9ffa-3298dc1a936e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fa6f325f-3f75-4d35-9ffa-3298dc1a936e\") " pod="openstack/cinder-scheduler-0" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.880929 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skg55\" (UniqueName: \"kubernetes.io/projected/fa6f325f-3f75-4d35-9ffa-3298dc1a936e-kube-api-access-skg55\") pod \"cinder-scheduler-0\" (UID: \"fa6f325f-3f75-4d35-9ffa-3298dc1a936e\") " pod="openstack/cinder-scheduler-0" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.880966 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/fa6f325f-3f75-4d35-9ffa-3298dc1a936e-scripts\") pod \"cinder-scheduler-0\" (UID: \"fa6f325f-3f75-4d35-9ffa-3298dc1a936e\") " pod="openstack/cinder-scheduler-0" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.881033 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa6f325f-3f75-4d35-9ffa-3298dc1a936e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fa6f325f-3f75-4d35-9ffa-3298dc1a936e\") " pod="openstack/cinder-scheduler-0" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.901745 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b3b9061-c3c7-43bb-b5bd-cafef342fde0-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "5b3b9061-c3c7-43bb-b5bd-cafef342fde0" (UID: "5b3b9061-c3c7-43bb-b5bd-cafef342fde0"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.902240 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b3b9061-c3c7-43bb-b5bd-cafef342fde0-kube-api-access-qblq4" (OuterVolumeSpecName: "kube-api-access-qblq4") pod "5b3b9061-c3c7-43bb-b5bd-cafef342fde0" (UID: "5b3b9061-c3c7-43bb-b5bd-cafef342fde0"). InnerVolumeSpecName "kube-api-access-qblq4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.939895 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b3b9061-c3c7-43bb-b5bd-cafef342fde0-config" (OuterVolumeSpecName: "config") pod "5b3b9061-c3c7-43bb-b5bd-cafef342fde0" (UID: "5b3b9061-c3c7-43bb-b5bd-cafef342fde0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.952743 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b3b9061-c3c7-43bb-b5bd-cafef342fde0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b3b9061-c3c7-43bb-b5bd-cafef342fde0" (UID: "5b3b9061-c3c7-43bb-b5bd-cafef342fde0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.982187 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa6f325f-3f75-4d35-9ffa-3298dc1a936e-config-data\") pod \"cinder-scheduler-0\" (UID: \"fa6f325f-3f75-4d35-9ffa-3298dc1a936e\") " pod="openstack/cinder-scheduler-0" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.982246 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fa6f325f-3f75-4d35-9ffa-3298dc1a936e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fa6f325f-3f75-4d35-9ffa-3298dc1a936e\") " pod="openstack/cinder-scheduler-0" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.982271 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skg55\" (UniqueName: \"kubernetes.io/projected/fa6f325f-3f75-4d35-9ffa-3298dc1a936e-kube-api-access-skg55\") pod \"cinder-scheduler-0\" (UID: \"fa6f325f-3f75-4d35-9ffa-3298dc1a936e\") " pod="openstack/cinder-scheduler-0" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.982300 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa6f325f-3f75-4d35-9ffa-3298dc1a936e-scripts\") pod \"cinder-scheduler-0\" (UID: \"fa6f325f-3f75-4d35-9ffa-3298dc1a936e\") " pod="openstack/cinder-scheduler-0" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.982326 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa6f325f-3f75-4d35-9ffa-3298dc1a936e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fa6f325f-3f75-4d35-9ffa-3298dc1a936e\") " pod="openstack/cinder-scheduler-0" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.982371 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fa6f325f-3f75-4d35-9ffa-3298dc1a936e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fa6f325f-3f75-4d35-9ffa-3298dc1a936e\") " pod="openstack/cinder-scheduler-0" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.982538 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/5b3b9061-c3c7-43bb-b5bd-cafef342fde0-config\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.982557 4954 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5b3b9061-c3c7-43bb-b5bd-cafef342fde0-httpd-config\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.982589 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qblq4\" (UniqueName: \"kubernetes.io/projected/5b3b9061-c3c7-43bb-b5bd-cafef342fde0-kube-api-access-qblq4\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.982602 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b3b9061-c3c7-43bb-b5bd-cafef342fde0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.982668 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/fa6f325f-3f75-4d35-9ffa-3298dc1a936e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fa6f325f-3f75-4d35-9ffa-3298dc1a936e\") " pod="openstack/cinder-scheduler-0" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.986018 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa6f325f-3f75-4d35-9ffa-3298dc1a936e-config-data\") pod \"cinder-scheduler-0\" (UID: \"fa6f325f-3f75-4d35-9ffa-3298dc1a936e\") " pod="openstack/cinder-scheduler-0" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.987070 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa6f325f-3f75-4d35-9ffa-3298dc1a936e-scripts\") pod \"cinder-scheduler-0\" (UID: \"fa6f325f-3f75-4d35-9ffa-3298dc1a936e\") " pod="openstack/cinder-scheduler-0" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.990354 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b3b9061-c3c7-43bb-b5bd-cafef342fde0-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "5b3b9061-c3c7-43bb-b5bd-cafef342fde0" (UID: "5b3b9061-c3c7-43bb-b5bd-cafef342fde0"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.991019 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa6f325f-3f75-4d35-9ffa-3298dc1a936e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fa6f325f-3f75-4d35-9ffa-3298dc1a936e\") " pod="openstack/cinder-scheduler-0" Nov 27 17:00:20 crc kubenswrapper[4954]: I1127 17:00:20.995383 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fa6f325f-3f75-4d35-9ffa-3298dc1a936e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fa6f325f-3f75-4d35-9ffa-3298dc1a936e\") " pod="openstack/cinder-scheduler-0" Nov 27 17:00:21 crc kubenswrapper[4954]: I1127 17:00:21.002063 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skg55\" (UniqueName: \"kubernetes.io/projected/fa6f325f-3f75-4d35-9ffa-3298dc1a936e-kube-api-access-skg55\") pod \"cinder-scheduler-0\" (UID: \"fa6f325f-3f75-4d35-9ffa-3298dc1a936e\") " pod="openstack/cinder-scheduler-0" Nov 27 17:00:21 crc kubenswrapper[4954]: I1127 17:00:21.085386 4954 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b3b9061-c3c7-43bb-b5bd-cafef342fde0-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:21 crc kubenswrapper[4954]: I1127 17:00:21.121533 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 27 17:00:21 crc kubenswrapper[4954]: I1127 17:00:21.607099 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 27 17:00:21 crc kubenswrapper[4954]: I1127 17:00:21.655686 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5657c85556-sq27w" event={"ID":"5b3b9061-c3c7-43bb-b5bd-cafef342fde0","Type":"ContainerDied","Data":"38e064080c4eac7f0c9b0fda1b374b21fff521b22ee9dccf9164f69f74aa4514"} Nov 27 17:00:21 crc kubenswrapper[4954]: I1127 17:00:21.655739 4954 scope.go:117] "RemoveContainer" containerID="4393c3ec99187d36baafd3a746d542662ad0eb0e5ceb35b60f2d4a600e291fea" Nov 27 17:00:21 crc kubenswrapper[4954]: I1127 17:00:21.655833 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5657c85556-sq27w" Nov 27 17:00:21 crc kubenswrapper[4954]: I1127 17:00:21.674455 4954 generic.go:334] "Generic (PLEG): container finished" podID="8a9e455d-383c-460b-897e-2234c0611a83" containerID="51fc083b73e2dbbfc048368e65a84834c859cc6a3b10dd95d2a2cc01a0184dbe" exitCode=0 Nov 27 17:00:21 crc kubenswrapper[4954]: I1127 17:00:21.674549 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6549c6cdd4-szxmh" event={"ID":"8a9e455d-383c-460b-897e-2234c0611a83","Type":"ContainerDied","Data":"51fc083b73e2dbbfc048368e65a84834c859cc6a3b10dd95d2a2cc01a0184dbe"} Nov 27 17:00:21 crc kubenswrapper[4954]: I1127 17:00:21.681813 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fa6f325f-3f75-4d35-9ffa-3298dc1a936e","Type":"ContainerStarted","Data":"c3f8922344b53183580ecc41d3b026cd8ee272cead4891d65369b0a321b69b15"} Nov 27 17:00:21 crc kubenswrapper[4954]: I1127 17:00:21.830795 4954 scope.go:117] "RemoveContainer" containerID="48a8afd527e4cec4880cfb003cfaec9aff3c9c8c58342474a1b77e5c21366e88" Nov 27 17:00:21 crc kubenswrapper[4954]: I1127 17:00:21.836624 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5657c85556-sq27w"] Nov 27 17:00:21 crc kubenswrapper[4954]: I1127 17:00:21.851992 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5657c85556-sq27w"] Nov 27 17:00:22 crc kubenswrapper[4954]: I1127 17:00:22.544720 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6549c6cdd4-szxmh" podUID="8a9e455d-383c-460b-897e-2234c0611a83" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.145:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.145:8443: connect: connection refused" Nov 27 17:00:22 crc kubenswrapper[4954]: I1127 17:00:22.621986 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-68775c76df-2ppbs" Nov 27 17:00:22 crc kubenswrapper[4954]: I1127 17:00:22.687911 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1263a5b8-5b99-4f6b-9a43-72532ef791da" path="/var/lib/kubelet/pods/1263a5b8-5b99-4f6b-9a43-72532ef791da/volumes" Nov 27 17:00:22 crc kubenswrapper[4954]: I1127 17:00:22.688982 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b3b9061-c3c7-43bb-b5bd-cafef342fde0" path="/var/lib/kubelet/pods/5b3b9061-c3c7-43bb-b5bd-cafef342fde0/volumes" Nov 27 17:00:22 crc kubenswrapper[4954]: I1127 17:00:22.739382 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"fa6f325f-3f75-4d35-9ffa-3298dc1a936e","Type":"ContainerStarted","Data":"06ff9fd3080d8cc63ed4cb6fd965af2c3265dccc2f0f920d8e0a88e7762e7afe"} Nov 27 17:00:25 crc kubenswrapper[4954]: I1127 17:00:25.786881 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fa6f325f-3f75-4d35-9ffa-3298dc1a936e","Type":"ContainerStarted","Data":"592e004b570210b903e2472432767147e9686237a5b086516e97a8f132676e83"} Nov 27 17:00:25 crc kubenswrapper[4954]: I1127 17:00:25.813531 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.81351364 podStartE2EDuration="5.81351364s" podCreationTimestamp="2025-11-27 17:00:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 17:00:25.80609761 +0000 UTC m=+1337.823537910" watchObservedRunningTime="2025-11-27 17:00:25.81351364 +0000 UTC m=+1337.830953940" Nov 27 17:00:25 crc kubenswrapper[4954]: I1127 17:00:25.980668 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7c4fd9778-zrzw7" Nov 27 17:00:26 crc kubenswrapper[4954]: I1127 17:00:26.024829 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7c4fd9778-zrzw7" Nov 27 17:00:26 crc kubenswrapper[4954]: I1127 17:00:26.091716 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7dd865f898-v599c"] Nov 27 17:00:26 crc kubenswrapper[4954]: I1127 17:00:26.092033 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7dd865f898-v599c" podUID="733ded25-88fc-4c78-9939-d983d7c473cf" containerName="barbican-api-log" containerID="cri-o://4a941ed82c13a5e0d3b29fad3e924aa553ec9ca74b9c15978a19138ee79bb1e0" gracePeriod=30 Nov 27 17:00:26 crc kubenswrapper[4954]: I1127 17:00:26.092384 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7dd865f898-v599c" podUID="733ded25-88fc-4c78-9939-d983d7c473cf" containerName="barbican-api" containerID="cri-o://139fcdccb0d864121342ec3d927e0adec84b53a8268a7c4f9b27f29d95c721d6" gracePeriod=30 Nov 27 17:00:26 crc kubenswrapper[4954]: I1127 17:00:26.121682 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Nov 27 17:00:26 crc kubenswrapper[4954]: I1127 17:00:26.814348 4954 generic.go:334] "Generic (PLEG): container finished" podID="733ded25-88fc-4c78-9939-d983d7c473cf" containerID="4a941ed82c13a5e0d3b29fad3e924aa553ec9ca74b9c15978a19138ee79bb1e0" exitCode=143 Nov 27 17:00:26 crc kubenswrapper[4954]: I1127 17:00:26.815285 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7dd865f898-v599c" event={"ID":"733ded25-88fc-4c78-9939-d983d7c473cf","Type":"ContainerDied","Data":"4a941ed82c13a5e0d3b29fad3e924aa553ec9ca74b9c15978a19138ee79bb1e0"} Nov 27 17:00:26 crc kubenswrapper[4954]: I1127 17:00:26.987919 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Nov 27 17:00:26 crc kubenswrapper[4954]: E1127 17:00:26.992367 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b3b9061-c3c7-43bb-b5bd-cafef342fde0" containerName="neutron-httpd" Nov 27 17:00:26 crc kubenswrapper[4954]: I1127 17:00:26.992399 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b3b9061-c3c7-43bb-b5bd-cafef342fde0" containerName="neutron-httpd" 
Nov 27 17:00:26 crc kubenswrapper[4954]: E1127 17:00:26.992434 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b3b9061-c3c7-43bb-b5bd-cafef342fde0" containerName="neutron-api" Nov 27 17:00:26 crc kubenswrapper[4954]: I1127 17:00:26.992444 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b3b9061-c3c7-43bb-b5bd-cafef342fde0" containerName="neutron-api" Nov 27 17:00:26 crc kubenswrapper[4954]: I1127 17:00:26.992754 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b3b9061-c3c7-43bb-b5bd-cafef342fde0" containerName="neutron-httpd" Nov 27 17:00:26 crc kubenswrapper[4954]: I1127 17:00:26.992776 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b3b9061-c3c7-43bb-b5bd-cafef342fde0" containerName="neutron-api" Nov 27 17:00:26 crc kubenswrapper[4954]: I1127 17:00:26.993808 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 27 17:00:26 crc kubenswrapper[4954]: I1127 17:00:26.999159 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Nov 27 17:00:26 crc kubenswrapper[4954]: I1127 17:00:26.999427 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-ss5bc" Nov 27 17:00:26 crc kubenswrapper[4954]: I1127 17:00:26.999766 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Nov 27 17:00:27 crc kubenswrapper[4954]: I1127 17:00:27.041963 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 27 17:00:27 crc kubenswrapper[4954]: I1127 17:00:27.125572 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m86rj\" (UniqueName: \"kubernetes.io/projected/871d6a1f-a817-45c5-a3f5-3f0e47ef9bf3-kube-api-access-m86rj\") pod \"openstackclient\" (UID: \"871d6a1f-a817-45c5-a3f5-3f0e47ef9bf3\") " pod="openstack/openstackclient" Nov 27 17:00:27 crc kubenswrapper[4954]: I1127 17:00:27.125865 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/871d6a1f-a817-45c5-a3f5-3f0e47ef9bf3-combined-ca-bundle\") pod \"openstackclient\" (UID: \"871d6a1f-a817-45c5-a3f5-3f0e47ef9bf3\") " pod="openstack/openstackclient" Nov 27 17:00:27 crc kubenswrapper[4954]: I1127 17:00:27.126142 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/871d6a1f-a817-45c5-a3f5-3f0e47ef9bf3-openstack-config\") pod \"openstackclient\" (UID: \"871d6a1f-a817-45c5-a3f5-3f0e47ef9bf3\") " pod="openstack/openstackclient" Nov 27 17:00:27 crc kubenswrapper[4954]: I1127 17:00:27.126202 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/871d6a1f-a817-45c5-a3f5-3f0e47ef9bf3-openstack-config-secret\") pod \"openstackclient\" (UID: \"871d6a1f-a817-45c5-a3f5-3f0e47ef9bf3\") " pod="openstack/openstackclient" Nov 27 17:00:27 crc kubenswrapper[4954]: I1127 17:00:27.227407 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/871d6a1f-a817-45c5-a3f5-3f0e47ef9bf3-openstack-config\") pod \"openstackclient\" (UID: \"871d6a1f-a817-45c5-a3f5-3f0e47ef9bf3\") " 
pod="openstack/openstackclient" Nov 27 17:00:27 crc kubenswrapper[4954]: I1127 17:00:27.227456 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/871d6a1f-a817-45c5-a3f5-3f0e47ef9bf3-openstack-config-secret\") pod \"openstackclient\" (UID: \"871d6a1f-a817-45c5-a3f5-3f0e47ef9bf3\") " pod="openstack/openstackclient" Nov 27 17:00:27 crc kubenswrapper[4954]: I1127 17:00:27.227540 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m86rj\" (UniqueName: \"kubernetes.io/projected/871d6a1f-a817-45c5-a3f5-3f0e47ef9bf3-kube-api-access-m86rj\") pod \"openstackclient\" (UID: \"871d6a1f-a817-45c5-a3f5-3f0e47ef9bf3\") " pod="openstack/openstackclient" Nov 27 17:00:27 crc kubenswrapper[4954]: I1127 17:00:27.227579 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/871d6a1f-a817-45c5-a3f5-3f0e47ef9bf3-combined-ca-bundle\") pod \"openstackclient\" (UID: \"871d6a1f-a817-45c5-a3f5-3f0e47ef9bf3\") " pod="openstack/openstackclient" Nov 27 17:00:27 crc kubenswrapper[4954]: I1127 17:00:27.228468 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/871d6a1f-a817-45c5-a3f5-3f0e47ef9bf3-openstack-config\") pod \"openstackclient\" (UID: \"871d6a1f-a817-45c5-a3f5-3f0e47ef9bf3\") " pod="openstack/openstackclient" Nov 27 17:00:27 crc kubenswrapper[4954]: I1127 17:00:27.234760 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/871d6a1f-a817-45c5-a3f5-3f0e47ef9bf3-combined-ca-bundle\") pod \"openstackclient\" (UID: \"871d6a1f-a817-45c5-a3f5-3f0e47ef9bf3\") " pod="openstack/openstackclient" Nov 27 17:00:27 crc kubenswrapper[4954]: I1127 17:00:27.246308 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/871d6a1f-a817-45c5-a3f5-3f0e47ef9bf3-openstack-config-secret\") pod \"openstackclient\" (UID: \"871d6a1f-a817-45c5-a3f5-3f0e47ef9bf3\") " pod="openstack/openstackclient" Nov 27 17:00:27 crc kubenswrapper[4954]: I1127 17:00:27.247566 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m86rj\" (UniqueName: \"kubernetes.io/projected/871d6a1f-a817-45c5-a3f5-3f0e47ef9bf3-kube-api-access-m86rj\") pod \"openstackclient\" (UID: \"871d6a1f-a817-45c5-a3f5-3f0e47ef9bf3\") " pod="openstack/openstackclient" Nov 27 17:00:27 crc kubenswrapper[4954]: I1127 17:00:27.336175 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Nov 27 17:00:27 crc kubenswrapper[4954]: I1127 17:00:27.925653 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 27 17:00:27 crc kubenswrapper[4954]: W1127 17:00:27.931746 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod871d6a1f_a817_45c5_a3f5_3f0e47ef9bf3.slice/crio-33497e9b2521dcd96cdca975000f35acf9aa9c19bbd85881fa6ca571f6b52fbb WatchSource:0}: Error finding container 33497e9b2521dcd96cdca975000f35acf9aa9c19bbd85881fa6ca571f6b52fbb: Status 404 returned error can't find the container with id 33497e9b2521dcd96cdca975000f35acf9aa9c19bbd85881fa6ca571f6b52fbb Nov 27 17:00:28 crc kubenswrapper[4954]: I1127 17:00:28.832014 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"871d6a1f-a817-45c5-a3f5-3f0e47ef9bf3","Type":"ContainerStarted","Data":"33497e9b2521dcd96cdca975000f35acf9aa9c19bbd85881fa6ca571f6b52fbb"} Nov 27 17:00:29 crc kubenswrapper[4954]: I1127 17:00:29.640389 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7dd865f898-v599c" podUID="733ded25-88fc-4c78-9939-d983d7c473cf" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.162:9311/healthcheck\": read tcp 10.217.0.2:40374->10.217.0.162:9311: read: connection reset by peer" Nov 27 17:00:29 crc kubenswrapper[4954]: I1127 17:00:29.640389 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7dd865f898-v599c" podUID="733ded25-88fc-4c78-9939-d983d7c473cf" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.162:9311/healthcheck\": read tcp 10.217.0.2:40372->10.217.0.162:9311: read: connection reset by peer" Nov 27 17:00:29 crc kubenswrapper[4954]: I1127 17:00:29.845679 4954 generic.go:334] "Generic (PLEG): container finished" podID="733ded25-88fc-4c78-9939-d983d7c473cf" containerID="139fcdccb0d864121342ec3d927e0adec84b53a8268a7c4f9b27f29d95c721d6" exitCode=0 Nov 27 17:00:29 crc kubenswrapper[4954]: I1127 17:00:29.845769 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7dd865f898-v599c" event={"ID":"733ded25-88fc-4c78-9939-d983d7c473cf","Type":"ContainerDied","Data":"139fcdccb0d864121342ec3d927e0adec84b53a8268a7c4f9b27f29d95c721d6"} Nov 27 17:00:30 crc kubenswrapper[4954]: I1127 17:00:30.279350 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7dd865f898-v599c" Nov 27 17:00:30 crc kubenswrapper[4954]: I1127 17:00:30.387450 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/733ded25-88fc-4c78-9939-d983d7c473cf-combined-ca-bundle\") pod \"733ded25-88fc-4c78-9939-d983d7c473cf\" (UID: \"733ded25-88fc-4c78-9939-d983d7c473cf\") " Nov 27 17:00:30 crc kubenswrapper[4954]: I1127 17:00:30.388099 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/733ded25-88fc-4c78-9939-d983d7c473cf-config-data\") pod \"733ded25-88fc-4c78-9939-d983d7c473cf\" (UID: \"733ded25-88fc-4c78-9939-d983d7c473cf\") " Nov 27 17:00:30 crc kubenswrapper[4954]: I1127 17:00:30.388124 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkfqc\" (UniqueName: \"kubernetes.io/projected/733ded25-88fc-4c78-9939-d983d7c473cf-kube-api-access-pkfqc\") pod \"733ded25-88fc-4c78-9939-d983d7c473cf\" (UID: \"733ded25-88fc-4c78-9939-d983d7c473cf\") " Nov 27 17:00:30 crc kubenswrapper[4954]: I1127 17:00:30.388201 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/733ded25-88fc-4c78-9939-d983d7c473cf-logs\") pod \"733ded25-88fc-4c78-9939-d983d7c473cf\" (UID: \"733ded25-88fc-4c78-9939-d983d7c473cf\") " Nov 27 17:00:30 crc kubenswrapper[4954]: I1127 17:00:30.388262 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/733ded25-88fc-4c78-9939-d983d7c473cf-config-data-custom\") pod \"733ded25-88fc-4c78-9939-d983d7c473cf\" (UID: \"733ded25-88fc-4c78-9939-d983d7c473cf\") " Nov 27 17:00:30 crc kubenswrapper[4954]: I1127 17:00:30.389283 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/733ded25-88fc-4c78-9939-d983d7c473cf-logs" (OuterVolumeSpecName: "logs") pod "733ded25-88fc-4c78-9939-d983d7c473cf" (UID: "733ded25-88fc-4c78-9939-d983d7c473cf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:00:30 crc kubenswrapper[4954]: I1127 17:00:30.396038 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/733ded25-88fc-4c78-9939-d983d7c473cf-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "733ded25-88fc-4c78-9939-d983d7c473cf" (UID: "733ded25-88fc-4c78-9939-d983d7c473cf"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:00:30 crc kubenswrapper[4954]: I1127 17:00:30.401357 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/733ded25-88fc-4c78-9939-d983d7c473cf-kube-api-access-pkfqc" (OuterVolumeSpecName: "kube-api-access-pkfqc") pod "733ded25-88fc-4c78-9939-d983d7c473cf" (UID: "733ded25-88fc-4c78-9939-d983d7c473cf"). InnerVolumeSpecName "kube-api-access-pkfqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:00:30 crc kubenswrapper[4954]: I1127 17:00:30.427734 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/733ded25-88fc-4c78-9939-d983d7c473cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "733ded25-88fc-4c78-9939-d983d7c473cf" (UID: "733ded25-88fc-4c78-9939-d983d7c473cf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:00:30 crc kubenswrapper[4954]: I1127 17:00:30.460113 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/733ded25-88fc-4c78-9939-d983d7c473cf-config-data" (OuterVolumeSpecName: "config-data") pod "733ded25-88fc-4c78-9939-d983d7c473cf" (UID: "733ded25-88fc-4c78-9939-d983d7c473cf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:00:30 crc kubenswrapper[4954]: I1127 17:00:30.490740 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/733ded25-88fc-4c78-9939-d983d7c473cf-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:30 crc kubenswrapper[4954]: I1127 17:00:30.490781 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkfqc\" (UniqueName: \"kubernetes.io/projected/733ded25-88fc-4c78-9939-d983d7c473cf-kube-api-access-pkfqc\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:30 crc kubenswrapper[4954]: I1127 17:00:30.490795 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/733ded25-88fc-4c78-9939-d983d7c473cf-logs\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:30 crc kubenswrapper[4954]: I1127 17:00:30.490807 4954 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/733ded25-88fc-4c78-9939-d983d7c473cf-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:30 crc kubenswrapper[4954]: I1127 17:00:30.490817 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/733ded25-88fc-4c78-9939-d983d7c473cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:30 crc kubenswrapper[4954]: I1127 17:00:30.862293 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7dd865f898-v599c" event={"ID":"733ded25-88fc-4c78-9939-d983d7c473cf","Type":"ContainerDied","Data":"702f22cf5179357fb10d6593cbc6ea581e9d6040f1f22b3712673bbf8e1863f4"} Nov 27 17:00:30 crc kubenswrapper[4954]: I1127 17:00:30.862359 4954 scope.go:117] "RemoveContainer" containerID="139fcdccb0d864121342ec3d927e0adec84b53a8268a7c4f9b27f29d95c721d6" Nov 27 17:00:30 crc kubenswrapper[4954]: I1127 17:00:30.862557 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7dd865f898-v599c" Nov 27 17:00:30 crc kubenswrapper[4954]: I1127 17:00:30.902647 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7dd865f898-v599c"] Nov 27 17:00:30 crc kubenswrapper[4954]: I1127 17:00:30.914544 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7dd865f898-v599c"] Nov 27 17:00:30 crc kubenswrapper[4954]: I1127 17:00:30.915711 4954 scope.go:117] "RemoveContainer" containerID="4a941ed82c13a5e0d3b29fad3e924aa553ec9ca74b9c15978a19138ee79bb1e0" Nov 27 17:00:31 crc kubenswrapper[4954]: I1127 17:00:31.362504 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Nov 27 17:00:31 crc kubenswrapper[4954]: I1127 17:00:31.533698 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 27 17:00:31 crc kubenswrapper[4954]: I1127 17:00:31.535855 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="17240c1b-4f70-4182-9b68-dac293e719ef" containerName="sg-core" containerID="cri-o://12df96fa1529840c2542da2f12b4b5888999179ebfe4e624d0c2f43b34579a56" gracePeriod=30 Nov 27 17:00:31 crc kubenswrapper[4954]: I1127 17:00:31.535979 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="17240c1b-4f70-4182-9b68-dac293e719ef" containerName="proxy-httpd" containerID="cri-o://2fee71cb3e89f8003fca463e9dcf7f53c3d706a8908b14bb805576f60944fe1c" gracePeriod=30 Nov 27 17:00:31 crc kubenswrapper[4954]: I1127 17:00:31.536126 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="17240c1b-4f70-4182-9b68-dac293e719ef" containerName="ceilometer-notification-agent" containerID="cri-o://8e96b0218c2672d5d2090beae4454f978aebd79bd9a90af1b6fd7218d671a402" gracePeriod=30 Nov 27 17:00:31 crc kubenswrapper[4954]: I1127 17:00:31.536199 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="17240c1b-4f70-4182-9b68-dac293e719ef" containerName="ceilometer-central-agent" containerID="cri-o://2967004ba4d0c484dac64a4095d2441b118048a9ca2019e6d66ce83a98affb2c" gracePeriod=30 Nov 27 17:00:31 crc kubenswrapper[4954]: I1127 17:00:31.547506 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="17240c1b-4f70-4182-9b68-dac293e719ef" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.164:3000/\": EOF" Nov 27 17:00:31 crc kubenswrapper[4954]: I1127 17:00:31.633541 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-85cf58799f-l72lc"] Nov 27 17:00:31 crc kubenswrapper[4954]: E1127 17:00:31.633952 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="733ded25-88fc-4c78-9939-d983d7c473cf" containerName="barbican-api" Nov 27 17:00:31 crc kubenswrapper[4954]: I1127 17:00:31.633975 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="733ded25-88fc-4c78-9939-d983d7c473cf" containerName="barbican-api" Nov 27 17:00:31 crc kubenswrapper[4954]: E1127 17:00:31.634000 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="733ded25-88fc-4c78-9939-d983d7c473cf" containerName="barbican-api-log" Nov 27 17:00:31 crc kubenswrapper[4954]: I1127 17:00:31.634008 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="733ded25-88fc-4c78-9939-d983d7c473cf" containerName="barbican-api-log" Nov 27 
17:00:31 crc kubenswrapper[4954]: I1127 17:00:31.634228 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="733ded25-88fc-4c78-9939-d983d7c473cf" containerName="barbican-api" Nov 27 17:00:31 crc kubenswrapper[4954]: I1127 17:00:31.634249 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="733ded25-88fc-4c78-9939-d983d7c473cf" containerName="barbican-api-log" Nov 27 17:00:31 crc kubenswrapper[4954]: I1127 17:00:31.635301 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-85cf58799f-l72lc" Nov 27 17:00:31 crc kubenswrapper[4954]: I1127 17:00:31.636867 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Nov 27 17:00:31 crc kubenswrapper[4954]: I1127 17:00:31.647852 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-85cf58799f-l72lc"] Nov 27 17:00:31 crc kubenswrapper[4954]: I1127 17:00:31.649944 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Nov 27 17:00:31 crc kubenswrapper[4954]: I1127 17:00:31.650147 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Nov 27 17:00:31 crc kubenswrapper[4954]: I1127 17:00:31.715744 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9de053dc-d10c-4999-9019-f7221fb9e237-run-httpd\") pod \"swift-proxy-85cf58799f-l72lc\" (UID: \"9de053dc-d10c-4999-9019-f7221fb9e237\") " pod="openstack/swift-proxy-85cf58799f-l72lc" Nov 27 17:00:31 crc kubenswrapper[4954]: I1127 17:00:31.715787 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9de053dc-d10c-4999-9019-f7221fb9e237-combined-ca-bundle\") pod \"swift-proxy-85cf58799f-l72lc\" (UID: \"9de053dc-d10c-4999-9019-f7221fb9e237\") " pod="openstack/swift-proxy-85cf58799f-l72lc" Nov 27 17:00:31 crc kubenswrapper[4954]: I1127 17:00:31.715875 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9de053dc-d10c-4999-9019-f7221fb9e237-config-data\") pod \"swift-proxy-85cf58799f-l72lc\" (UID: \"9de053dc-d10c-4999-9019-f7221fb9e237\") " pod="openstack/swift-proxy-85cf58799f-l72lc" Nov 27 17:00:31 crc kubenswrapper[4954]: I1127 17:00:31.715906 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9de053dc-d10c-4999-9019-f7221fb9e237-log-httpd\") pod \"swift-proxy-85cf58799f-l72lc\" (UID: \"9de053dc-d10c-4999-9019-f7221fb9e237\") " pod="openstack/swift-proxy-85cf58799f-l72lc" Nov 27 17:00:31 crc kubenswrapper[4954]: I1127 17:00:31.716029 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zg2b\" (UniqueName: \"kubernetes.io/projected/9de053dc-d10c-4999-9019-f7221fb9e237-kube-api-access-7zg2b\") pod \"swift-proxy-85cf58799f-l72lc\" (UID: \"9de053dc-d10c-4999-9019-f7221fb9e237\") " pod="openstack/swift-proxy-85cf58799f-l72lc" Nov 27 17:00:31 crc kubenswrapper[4954]: I1127 17:00:31.716051 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9de053dc-d10c-4999-9019-f7221fb9e237-internal-tls-certs\") pod \"swift-proxy-85cf58799f-l72lc\" (UID: \"9de053dc-d10c-4999-9019-f7221fb9e237\") " pod="openstack/swift-proxy-85cf58799f-l72lc" Nov 27 17:00:31 crc kubenswrapper[4954]: I1127 17:00:31.716067 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9de053dc-d10c-4999-9019-f7221fb9e237-etc-swift\") pod \"swift-proxy-85cf58799f-l72lc\" (UID: \"9de053dc-d10c-4999-9019-f7221fb9e237\") " pod="openstack/swift-proxy-85cf58799f-l72lc" Nov 27 17:00:31 crc kubenswrapper[4954]: I1127 17:00:31.716084 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9de053dc-d10c-4999-9019-f7221fb9e237-public-tls-certs\") pod \"swift-proxy-85cf58799f-l72lc\" (UID: \"9de053dc-d10c-4999-9019-f7221fb9e237\") " pod="openstack/swift-proxy-85cf58799f-l72lc" Nov 27 17:00:31 crc kubenswrapper[4954]: I1127 17:00:31.821484 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zg2b\" (UniqueName: \"kubernetes.io/projected/9de053dc-d10c-4999-9019-f7221fb9e237-kube-api-access-7zg2b\") pod \"swift-proxy-85cf58799f-l72lc\" (UID: \"9de053dc-d10c-4999-9019-f7221fb9e237\") " pod="openstack/swift-proxy-85cf58799f-l72lc" Nov 27 17:00:31 crc kubenswrapper[4954]: I1127 17:00:31.821542 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9de053dc-d10c-4999-9019-f7221fb9e237-internal-tls-certs\") pod \"swift-proxy-85cf58799f-l72lc\" (UID: \"9de053dc-d10c-4999-9019-f7221fb9e237\") " pod="openstack/swift-proxy-85cf58799f-l72lc" Nov 27 17:00:31 crc kubenswrapper[4954]: I1127 17:00:31.821565 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9de053dc-d10c-4999-9019-f7221fb9e237-etc-swift\") pod \"swift-proxy-85cf58799f-l72lc\" (UID: \"9de053dc-d10c-4999-9019-f7221fb9e237\") " pod="openstack/swift-proxy-85cf58799f-l72lc" Nov 27 17:00:31 crc kubenswrapper[4954]: I1127 17:00:31.821642 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9de053dc-d10c-4999-9019-f7221fb9e237-public-tls-certs\") pod \"swift-proxy-85cf58799f-l72lc\" (UID: \"9de053dc-d10c-4999-9019-f7221fb9e237\") " pod="openstack/swift-proxy-85cf58799f-l72lc" Nov 27 17:00:31 crc kubenswrapper[4954]: I1127 17:00:31.821685 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9de053dc-d10c-4999-9019-f7221fb9e237-run-httpd\") pod \"swift-proxy-85cf58799f-l72lc\" (UID: \"9de053dc-d10c-4999-9019-f7221fb9e237\") " pod="openstack/swift-proxy-85cf58799f-l72lc" Nov 27 17:00:31 crc kubenswrapper[4954]: I1127 17:00:31.821706 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9de053dc-d10c-4999-9019-f7221fb9e237-combined-ca-bundle\") pod \"swift-proxy-85cf58799f-l72lc\" (UID: \"9de053dc-d10c-4999-9019-f7221fb9e237\") " pod="openstack/swift-proxy-85cf58799f-l72lc" Nov 27 17:00:31 crc kubenswrapper[4954]: I1127 17:00:31.821749 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9de053dc-d10c-4999-9019-f7221fb9e237-config-data\") pod \"swift-proxy-85cf58799f-l72lc\" (UID: \"9de053dc-d10c-4999-9019-f7221fb9e237\") " pod="openstack/swift-proxy-85cf58799f-l72lc" Nov 27 17:00:31 crc kubenswrapper[4954]: I1127 17:00:31.821783 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9de053dc-d10c-4999-9019-f7221fb9e237-log-httpd\") pod \"swift-proxy-85cf58799f-l72lc\" (UID: \"9de053dc-d10c-4999-9019-f7221fb9e237\") " pod="openstack/swift-proxy-85cf58799f-l72lc" Nov 27 17:00:31 crc kubenswrapper[4954]: I1127 17:00:31.822270 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9de053dc-d10c-4999-9019-f7221fb9e237-log-httpd\") pod \"swift-proxy-85cf58799f-l72lc\" (UID: \"9de053dc-d10c-4999-9019-f7221fb9e237\") " pod="openstack/swift-proxy-85cf58799f-l72lc" Nov 27 17:00:31 crc kubenswrapper[4954]: I1127 17:00:31.822832 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9de053dc-d10c-4999-9019-f7221fb9e237-run-httpd\") pod \"swift-proxy-85cf58799f-l72lc\" (UID: \"9de053dc-d10c-4999-9019-f7221fb9e237\") " pod="openstack/swift-proxy-85cf58799f-l72lc" Nov 27 17:00:31 crc kubenswrapper[4954]: I1127 17:00:31.829788 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9de053dc-d10c-4999-9019-f7221fb9e237-public-tls-certs\") pod \"swift-proxy-85cf58799f-l72lc\" (UID: \"9de053dc-d10c-4999-9019-f7221fb9e237\") " pod="openstack/swift-proxy-85cf58799f-l72lc" Nov 27 17:00:31 crc kubenswrapper[4954]: I1127 17:00:31.831542 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9de053dc-d10c-4999-9019-f7221fb9e237-etc-swift\") pod \"swift-proxy-85cf58799f-l72lc\" (UID: \"9de053dc-d10c-4999-9019-f7221fb9e237\") " pod="openstack/swift-proxy-85cf58799f-l72lc" Nov 27 17:00:31 crc kubenswrapper[4954]: I1127 17:00:31.833199 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9de053dc-d10c-4999-9019-f7221fb9e237-internal-tls-certs\") pod \"swift-proxy-85cf58799f-l72lc\" (UID: \"9de053dc-d10c-4999-9019-f7221fb9e237\") " pod="openstack/swift-proxy-85cf58799f-l72lc" Nov 27 17:00:31 crc kubenswrapper[4954]: I1127 17:00:31.838245 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9de053dc-d10c-4999-9019-f7221fb9e237-combined-ca-bundle\") pod \"swift-proxy-85cf58799f-l72lc\" (UID: \"9de053dc-d10c-4999-9019-f7221fb9e237\") " pod="openstack/swift-proxy-85cf58799f-l72lc" Nov 27 17:00:31 crc kubenswrapper[4954]: I1127 17:00:31.839671 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9de053dc-d10c-4999-9019-f7221fb9e237-config-data\") pod \"swift-proxy-85cf58799f-l72lc\" (UID: \"9de053dc-d10c-4999-9019-f7221fb9e237\") " pod="openstack/swift-proxy-85cf58799f-l72lc" Nov 27 17:00:31 crc kubenswrapper[4954]: I1127 17:00:31.846432 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zg2b\" (UniqueName: \"kubernetes.io/projected/9de053dc-d10c-4999-9019-f7221fb9e237-kube-api-access-7zg2b\") pod \"swift-proxy-85cf58799f-l72lc\" (UID: 
\"9de053dc-d10c-4999-9019-f7221fb9e237\") " pod="openstack/swift-proxy-85cf58799f-l72lc" Nov 27 17:00:31 crc kubenswrapper[4954]: I1127 17:00:31.880213 4954 generic.go:334] "Generic (PLEG): container finished" podID="17240c1b-4f70-4182-9b68-dac293e719ef" containerID="2fee71cb3e89f8003fca463e9dcf7f53c3d706a8908b14bb805576f60944fe1c" exitCode=0 Nov 27 17:00:31 crc kubenswrapper[4954]: I1127 17:00:31.880248 4954 generic.go:334] "Generic (PLEG): container finished" podID="17240c1b-4f70-4182-9b68-dac293e719ef" containerID="12df96fa1529840c2542da2f12b4b5888999179ebfe4e624d0c2f43b34579a56" exitCode=2 Nov 27 17:00:31 crc kubenswrapper[4954]: I1127 17:00:31.880269 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17240c1b-4f70-4182-9b68-dac293e719ef","Type":"ContainerDied","Data":"2fee71cb3e89f8003fca463e9dcf7f53c3d706a8908b14bb805576f60944fe1c"} Nov 27 17:00:31 crc kubenswrapper[4954]: I1127 17:00:31.880297 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17240c1b-4f70-4182-9b68-dac293e719ef","Type":"ContainerDied","Data":"12df96fa1529840c2542da2f12b4b5888999179ebfe4e624d0c2f43b34579a56"} Nov 27 17:00:32 crc kubenswrapper[4954]: I1127 17:00:32.003315 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-85cf58799f-l72lc" Nov 27 17:00:32 crc kubenswrapper[4954]: I1127 17:00:32.544766 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6549c6cdd4-szxmh" podUID="8a9e455d-383c-460b-897e-2234c0611a83" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.145:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.145:8443: connect: connection refused" Nov 27 17:00:32 crc kubenswrapper[4954]: I1127 17:00:32.604482 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-85cf58799f-l72lc"] Nov 27 17:00:32 crc kubenswrapper[4954]: I1127 17:00:32.675483 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="733ded25-88fc-4c78-9939-d983d7c473cf" path="/var/lib/kubelet/pods/733ded25-88fc-4c78-9939-d983d7c473cf/volumes" Nov 27 17:00:32 crc kubenswrapper[4954]: I1127 17:00:32.891327 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-85cf58799f-l72lc" event={"ID":"9de053dc-d10c-4999-9019-f7221fb9e237","Type":"ContainerStarted","Data":"2469632fde06d6afb310b4f0e52a9c07e8d9799e222fda458d89a8b0fe1aa6a7"} Nov 27 17:00:32 crc kubenswrapper[4954]: I1127 17:00:32.894349 4954 generic.go:334] "Generic (PLEG): container finished" podID="17240c1b-4f70-4182-9b68-dac293e719ef" containerID="2967004ba4d0c484dac64a4095d2441b118048a9ca2019e6d66ce83a98affb2c" exitCode=0 Nov 27 17:00:32 crc kubenswrapper[4954]: I1127 17:00:32.894397 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17240c1b-4f70-4182-9b68-dac293e719ef","Type":"ContainerDied","Data":"2967004ba4d0c484dac64a4095d2441b118048a9ca2019e6d66ce83a98affb2c"} Nov 27 17:00:33 crc kubenswrapper[4954]: I1127 17:00:33.906089 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-85cf58799f-l72lc" event={"ID":"9de053dc-d10c-4999-9019-f7221fb9e237","Type":"ContainerStarted","Data":"08d6ee5f0c9d79e8abfbf2ff486ee7f0d339805821df9fb2c259da36b5b02642"} Nov 27 17:00:35 crc kubenswrapper[4954]: I1127 17:00:35.930814 4954 generic.go:334] "Generic (PLEG): container finished" podID="17240c1b-4f70-4182-9b68-dac293e719ef" 
containerID="8e96b0218c2672d5d2090beae4454f978aebd79bd9a90af1b6fd7218d671a402" exitCode=0 Nov 27 17:00:35 crc kubenswrapper[4954]: I1127 17:00:35.931258 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17240c1b-4f70-4182-9b68-dac293e719ef","Type":"ContainerDied","Data":"8e96b0218c2672d5d2090beae4454f978aebd79bd9a90af1b6fd7218d671a402"} Nov 27 17:00:41 crc kubenswrapper[4954]: I1127 17:00:41.508698 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="528d738e-43f9-4b32-be5a-b557c9d94d63" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.158:8776/healthcheck\": dial tcp 10.217.0.158:8776: connect: connection refused" Nov 27 17:00:42 crc kubenswrapper[4954]: I1127 17:00:42.544125 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6549c6cdd4-szxmh" podUID="8a9e455d-383c-460b-897e-2234c0611a83" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.145:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.145:8443: connect: connection refused" Nov 27 17:00:43 crc kubenswrapper[4954]: I1127 17:00:43.033038 4954 generic.go:334] "Generic (PLEG): container finished" podID="528d738e-43f9-4b32-be5a-b557c9d94d63" containerID="9deadd7a98d574b0e019378666104646d593d798502265f0e2fadceff0865304" exitCode=137 Nov 27 17:00:43 crc kubenswrapper[4954]: I1127 17:00:43.033157 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"528d738e-43f9-4b32-be5a-b557c9d94d63","Type":"ContainerDied","Data":"9deadd7a98d574b0e019378666104646d593d798502265f0e2fadceff0865304"} Nov 27 17:00:44 crc kubenswrapper[4954]: I1127 17:00:44.050005 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-85cf58799f-l72lc" event={"ID":"9de053dc-d10c-4999-9019-f7221fb9e237","Type":"ContainerStarted","Data":"1fccf84de804efe80fa71efd2ba798dd98e0ba3058d00ef429d701d09a19b087"} Nov 27 17:00:44 crc kubenswrapper[4954]: E1127 17:00:44.258642 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified" Nov 27 17:00:44 crc kubenswrapper[4954]: E1127 17:00:44.259278 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:openstackclient,Image:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,Command:[/bin/sleep],Args:[infinity],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n64bh65dh5c8hc5h75h5f6h5c8h546h5ddh686h5cch647h659h6fh5b5h557h694h5d6h566h86h55fh94h9h8hc8h5f9h568hc4h68bh566h549h679q,ValueFrom:nil,},EnvVar{Name:OS_CLOUD,Value:default,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_HOST,Value:metric-storage-prometheus.openstack.svc,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_PORT,Value:9090,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:openstack-config,ReadOnly:false,MountPath:/home/cloud-admin/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/home/cloud-admin/.config/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/home/cloud-admin/cloudrc,SubPath:cloudrc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m86rj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42401,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42401,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstackclient_openstack(871d6a1f-a817-45c5-a3f5-3f0e47ef9bf3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 27 17:00:44 crc kubenswrapper[4954]: E1127 17:00:44.260834 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstackclient\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstackclient" podUID="871d6a1f-a817-45c5-a3f5-3f0e47ef9bf3" Nov 27 17:00:44 crc kubenswrapper[4954]: I1127 17:00:44.289272 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 27 17:00:44 crc kubenswrapper[4954]: I1127 17:00:44.298866 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 27 17:00:44 crc kubenswrapper[4954]: I1127 17:00:44.373384 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17240c1b-4f70-4182-9b68-dac293e719ef-config-data\") pod \"17240c1b-4f70-4182-9b68-dac293e719ef\" (UID: \"17240c1b-4f70-4182-9b68-dac293e719ef\") " Nov 27 17:00:44 crc kubenswrapper[4954]: I1127 17:00:44.373436 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkhhh\" (UniqueName: \"kubernetes.io/projected/17240c1b-4f70-4182-9b68-dac293e719ef-kube-api-access-kkhhh\") pod \"17240c1b-4f70-4182-9b68-dac293e719ef\" (UID: \"17240c1b-4f70-4182-9b68-dac293e719ef\") " Nov 27 17:00:44 crc kubenswrapper[4954]: I1127 17:00:44.373485 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/528d738e-43f9-4b32-be5a-b557c9d94d63-etc-machine-id\") pod \"528d738e-43f9-4b32-be5a-b557c9d94d63\" (UID: \"528d738e-43f9-4b32-be5a-b557c9d94d63\") " Nov 27 17:00:44 crc kubenswrapper[4954]: I1127 17:00:44.373533 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17240c1b-4f70-4182-9b68-dac293e719ef-run-httpd\") pod \"17240c1b-4f70-4182-9b68-dac293e719ef\" (UID: \"17240c1b-4f70-4182-9b68-dac293e719ef\") " Nov 27 17:00:44 crc kubenswrapper[4954]: I1127 17:00:44.373557 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17240c1b-4f70-4182-9b68-dac293e719ef-combined-ca-bundle\") pod \"17240c1b-4f70-4182-9b68-dac293e719ef\" (UID: \"17240c1b-4f70-4182-9b68-dac293e719ef\") " Nov 27 17:00:44 crc kubenswrapper[4954]: I1127 17:00:44.373631 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/528d738e-43f9-4b32-be5a-b557c9d94d63-config-data-custom\") pod \"528d738e-43f9-4b32-be5a-b557c9d94d63\" (UID: \"528d738e-43f9-4b32-be5a-b557c9d94d63\") " Nov 27 17:00:44 crc kubenswrapper[4954]: I1127 17:00:44.373685 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/528d738e-43f9-4b32-be5a-b557c9d94d63-logs\") pod \"528d738e-43f9-4b32-be5a-b557c9d94d63\" (UID: \"528d738e-43f9-4b32-be5a-b557c9d94d63\") " Nov 27 17:00:44 crc kubenswrapper[4954]: I1127 17:00:44.373764 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/528d738e-43f9-4b32-be5a-b557c9d94d63-combined-ca-bundle\") pod \"528d738e-43f9-4b32-be5a-b557c9d94d63\" (UID: \"528d738e-43f9-4b32-be5a-b557c9d94d63\") " Nov 27 17:00:44 crc kubenswrapper[4954]: I1127 17:00:44.373791 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/528d738e-43f9-4b32-be5a-b557c9d94d63-scripts\") pod \"528d738e-43f9-4b32-be5a-b557c9d94d63\" (UID: \"528d738e-43f9-4b32-be5a-b557c9d94d63\") " Nov 27 17:00:44 crc kubenswrapper[4954]: I1127 17:00:44.373817 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17240c1b-4f70-4182-9b68-dac293e719ef-log-httpd\") pod \"17240c1b-4f70-4182-9b68-dac293e719ef\" (UID: \"17240c1b-4f70-4182-9b68-dac293e719ef\") 
" Nov 27 17:00:44 crc kubenswrapper[4954]: I1127 17:00:44.373842 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/17240c1b-4f70-4182-9b68-dac293e719ef-sg-core-conf-yaml\") pod \"17240c1b-4f70-4182-9b68-dac293e719ef\" (UID: \"17240c1b-4f70-4182-9b68-dac293e719ef\") " Nov 27 17:00:44 crc kubenswrapper[4954]: I1127 17:00:44.373867 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlzqm\" (UniqueName: \"kubernetes.io/projected/528d738e-43f9-4b32-be5a-b557c9d94d63-kube-api-access-qlzqm\") pod \"528d738e-43f9-4b32-be5a-b557c9d94d63\" (UID: \"528d738e-43f9-4b32-be5a-b557c9d94d63\") " Nov 27 17:00:44 crc kubenswrapper[4954]: I1127 17:00:44.373974 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/528d738e-43f9-4b32-be5a-b557c9d94d63-config-data\") pod \"528d738e-43f9-4b32-be5a-b557c9d94d63\" (UID: \"528d738e-43f9-4b32-be5a-b557c9d94d63\") " Nov 27 17:00:44 crc kubenswrapper[4954]: I1127 17:00:44.373997 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17240c1b-4f70-4182-9b68-dac293e719ef-scripts\") pod \"17240c1b-4f70-4182-9b68-dac293e719ef\" (UID: \"17240c1b-4f70-4182-9b68-dac293e719ef\") " Nov 27 17:00:44 crc kubenswrapper[4954]: I1127 17:00:44.375493 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/528d738e-43f9-4b32-be5a-b557c9d94d63-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "528d738e-43f9-4b32-be5a-b557c9d94d63" (UID: "528d738e-43f9-4b32-be5a-b557c9d94d63"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 17:00:44 crc kubenswrapper[4954]: I1127 17:00:44.376304 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17240c1b-4f70-4182-9b68-dac293e719ef-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "17240c1b-4f70-4182-9b68-dac293e719ef" (UID: "17240c1b-4f70-4182-9b68-dac293e719ef"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:00:44 crc kubenswrapper[4954]: I1127 17:00:44.376561 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17240c1b-4f70-4182-9b68-dac293e719ef-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "17240c1b-4f70-4182-9b68-dac293e719ef" (UID: "17240c1b-4f70-4182-9b68-dac293e719ef"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:00:44 crc kubenswrapper[4954]: I1127 17:00:44.377656 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/528d738e-43f9-4b32-be5a-b557c9d94d63-logs" (OuterVolumeSpecName: "logs") pod "528d738e-43f9-4b32-be5a-b557c9d94d63" (UID: "528d738e-43f9-4b32-be5a-b557c9d94d63"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:00:44 crc kubenswrapper[4954]: I1127 17:00:44.381935 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/528d738e-43f9-4b32-be5a-b557c9d94d63-kube-api-access-qlzqm" (OuterVolumeSpecName: "kube-api-access-qlzqm") pod "528d738e-43f9-4b32-be5a-b557c9d94d63" (UID: "528d738e-43f9-4b32-be5a-b557c9d94d63"). InnerVolumeSpecName "kube-api-access-qlzqm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:00:44 crc kubenswrapper[4954]: I1127 17:00:44.382142 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17240c1b-4f70-4182-9b68-dac293e719ef-scripts" (OuterVolumeSpecName: "scripts") pod "17240c1b-4f70-4182-9b68-dac293e719ef" (UID: "17240c1b-4f70-4182-9b68-dac293e719ef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:00:44 crc kubenswrapper[4954]: I1127 17:00:44.382512 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/528d738e-43f9-4b32-be5a-b557c9d94d63-scripts" (OuterVolumeSpecName: "scripts") pod "528d738e-43f9-4b32-be5a-b557c9d94d63" (UID: "528d738e-43f9-4b32-be5a-b557c9d94d63"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:00:44 crc kubenswrapper[4954]: I1127 17:00:44.383893 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/528d738e-43f9-4b32-be5a-b557c9d94d63-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "528d738e-43f9-4b32-be5a-b557c9d94d63" (UID: "528d738e-43f9-4b32-be5a-b557c9d94d63"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:00:44 crc kubenswrapper[4954]: I1127 17:00:44.389316 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17240c1b-4f70-4182-9b68-dac293e719ef-kube-api-access-kkhhh" (OuterVolumeSpecName: "kube-api-access-kkhhh") pod "17240c1b-4f70-4182-9b68-dac293e719ef" (UID: "17240c1b-4f70-4182-9b68-dac293e719ef"). InnerVolumeSpecName "kube-api-access-kkhhh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:00:44 crc kubenswrapper[4954]: I1127 17:00:44.414041 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17240c1b-4f70-4182-9b68-dac293e719ef-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "17240c1b-4f70-4182-9b68-dac293e719ef" (UID: "17240c1b-4f70-4182-9b68-dac293e719ef"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:00:44 crc kubenswrapper[4954]: I1127 17:00:44.414254 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/528d738e-43f9-4b32-be5a-b557c9d94d63-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "528d738e-43f9-4b32-be5a-b557c9d94d63" (UID: "528d738e-43f9-4b32-be5a-b557c9d94d63"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:00:44 crc kubenswrapper[4954]: I1127 17:00:44.452051 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/528d738e-43f9-4b32-be5a-b557c9d94d63-config-data" (OuterVolumeSpecName: "config-data") pod "528d738e-43f9-4b32-be5a-b557c9d94d63" (UID: "528d738e-43f9-4b32-be5a-b557c9d94d63"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:00:44 crc kubenswrapper[4954]: I1127 17:00:44.485769 4954 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17240c1b-4f70-4182-9b68-dac293e719ef-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:44 crc kubenswrapper[4954]: I1127 17:00:44.485838 4954 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/528d738e-43f9-4b32-be5a-b557c9d94d63-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:44 crc kubenswrapper[4954]: I1127 17:00:44.485856 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/528d738e-43f9-4b32-be5a-b557c9d94d63-logs\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:44 crc kubenswrapper[4954]: I1127 17:00:44.485866 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/528d738e-43f9-4b32-be5a-b557c9d94d63-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:44 crc kubenswrapper[4954]: I1127 17:00:44.485880 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/528d738e-43f9-4b32-be5a-b557c9d94d63-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:44 crc kubenswrapper[4954]: I1127 17:00:44.485890 4954 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17240c1b-4f70-4182-9b68-dac293e719ef-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:44 crc kubenswrapper[4954]: I1127 17:00:44.485901 4954 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/17240c1b-4f70-4182-9b68-dac293e719ef-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:44 crc kubenswrapper[4954]: I1127 17:00:44.485912 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlzqm\" (UniqueName: \"kubernetes.io/projected/528d738e-43f9-4b32-be5a-b557c9d94d63-kube-api-access-qlzqm\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:44 crc kubenswrapper[4954]: I1127 17:00:44.486696 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/528d738e-43f9-4b32-be5a-b557c9d94d63-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:44 crc kubenswrapper[4954]: I1127 17:00:44.486753 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17240c1b-4f70-4182-9b68-dac293e719ef-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:44 crc kubenswrapper[4954]: I1127 17:00:44.486766 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkhhh\" (UniqueName: \"kubernetes.io/projected/17240c1b-4f70-4182-9b68-dac293e719ef-kube-api-access-kkhhh\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:44 crc kubenswrapper[4954]: I1127 17:00:44.486813 4954 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/528d738e-43f9-4b32-be5a-b557c9d94d63-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:44 crc kubenswrapper[4954]: I1127 17:00:44.499059 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17240c1b-4f70-4182-9b68-dac293e719ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "17240c1b-4f70-4182-9b68-dac293e719ef" (UID: 
"17240c1b-4f70-4182-9b68-dac293e719ef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:00:44 crc kubenswrapper[4954]: I1127 17:00:44.514151 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17240c1b-4f70-4182-9b68-dac293e719ef-config-data" (OuterVolumeSpecName: "config-data") pod "17240c1b-4f70-4182-9b68-dac293e719ef" (UID: "17240c1b-4f70-4182-9b68-dac293e719ef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:00:44 crc kubenswrapper[4954]: I1127 17:00:44.588145 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17240c1b-4f70-4182-9b68-dac293e719ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:44 crc kubenswrapper[4954]: I1127 17:00:44.588189 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17240c1b-4f70-4182-9b68-dac293e719ef-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.064630 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17240c1b-4f70-4182-9b68-dac293e719ef","Type":"ContainerDied","Data":"13dfcd5d65844e7994519832b449846122a5dd2b06008a3efc635205dc2e0612"} Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.064678 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.064714 4954 scope.go:117] "RemoveContainer" containerID="2fee71cb3e89f8003fca463e9dcf7f53c3d706a8908b14bb805576f60944fe1c" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.069091 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.069098 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"528d738e-43f9-4b32-be5a-b557c9d94d63","Type":"ContainerDied","Data":"78ff8f7feaf561f28485b56b07b8c5921c2a63310d7a9eb735c08179717c5139"} Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.069345 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-85cf58799f-l72lc" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.069385 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-85cf58799f-l72lc" Nov 27 17:00:45 crc kubenswrapper[4954]: E1127 17:00:45.075073 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstackclient\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified\\\"\"" pod="openstack/openstackclient" podUID="871d6a1f-a817-45c5-a3f5-3f0e47ef9bf3" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.084807 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-85cf58799f-l72lc" podUID="9de053dc-d10c-4999-9019-f7221fb9e237" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.100323 4954 scope.go:117] "RemoveContainer" containerID="12df96fa1529840c2542da2f12b4b5888999179ebfe4e624d0c2f43b34579a56" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.108210 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-85cf58799f-l72lc" podStartSLOduration=14.108185299 podStartE2EDuration="14.108185299s" podCreationTimestamp="2025-11-27 17:00:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 17:00:45.09999809 +0000 UTC m=+1357.117438400" watchObservedRunningTime="2025-11-27 17:00:45.108185299 +0000 UTC m=+1357.125625619" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.129253 4954 scope.go:117] "RemoveContainer" containerID="8e96b0218c2672d5d2090beae4454f978aebd79bd9a90af1b6fd7218d671a402" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.150171 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.160542 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.168863 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.177893 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.185838 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Nov 27 17:00:45 crc kubenswrapper[4954]: E1127 17:00:45.186319 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17240c1b-4f70-4182-9b68-dac293e719ef" containerName="ceilometer-central-agent" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.186341 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="17240c1b-4f70-4182-9b68-dac293e719ef" containerName="ceilometer-central-agent" Nov 27 17:00:45 crc kubenswrapper[4954]: E1127 17:00:45.186357 4954 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17240c1b-4f70-4182-9b68-dac293e719ef" containerName="proxy-httpd" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.186365 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="17240c1b-4f70-4182-9b68-dac293e719ef" containerName="proxy-httpd" Nov 27 17:00:45 crc kubenswrapper[4954]: E1127 17:00:45.186404 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="528d738e-43f9-4b32-be5a-b557c9d94d63" containerName="cinder-api-log" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.186411 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="528d738e-43f9-4b32-be5a-b557c9d94d63" containerName="cinder-api-log" Nov 27 17:00:45 crc kubenswrapper[4954]: E1127 17:00:45.186424 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17240c1b-4f70-4182-9b68-dac293e719ef" containerName="ceilometer-notification-agent" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.186432 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="17240c1b-4f70-4182-9b68-dac293e719ef" containerName="ceilometer-notification-agent" Nov 27 17:00:45 crc kubenswrapper[4954]: E1127 17:00:45.186445 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17240c1b-4f70-4182-9b68-dac293e719ef" containerName="sg-core" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.186453 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="17240c1b-4f70-4182-9b68-dac293e719ef" containerName="sg-core" Nov 27 17:00:45 crc kubenswrapper[4954]: E1127 17:00:45.186465 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="528d738e-43f9-4b32-be5a-b557c9d94d63" containerName="cinder-api" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.186473 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="528d738e-43f9-4b32-be5a-b557c9d94d63" containerName="cinder-api" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.186685 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="17240c1b-4f70-4182-9b68-dac293e719ef" containerName="ceilometer-notification-agent" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.186705 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="17240c1b-4f70-4182-9b68-dac293e719ef" containerName="sg-core" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.186715 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="528d738e-43f9-4b32-be5a-b557c9d94d63" containerName="cinder-api-log" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.186727 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="17240c1b-4f70-4182-9b68-dac293e719ef" containerName="ceilometer-central-agent" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.186747 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="528d738e-43f9-4b32-be5a-b557c9d94d63" containerName="cinder-api" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.186758 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="17240c1b-4f70-4182-9b68-dac293e719ef" containerName="proxy-httpd" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.187963 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.190482 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.192282 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.192451 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.207242 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.225885 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.228690 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.232394 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.235799 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.239266 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.301048 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d6609b2-5156-4d39-b4fd-05cb39b98915-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"9d6609b2-5156-4d39-b4fd-05cb39b98915\") " pod="openstack/cinder-api-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.301115 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d6609b2-5156-4d39-b4fd-05cb39b98915-scripts\") pod \"cinder-api-0\" (UID: \"9d6609b2-5156-4d39-b4fd-05cb39b98915\") " pod="openstack/cinder-api-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.301339 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a37d925e-cb07-4949-b184-eb778ffb662f-config-data\") pod \"ceilometer-0\" (UID: \"a37d925e-cb07-4949-b184-eb778ffb662f\") " pod="openstack/ceilometer-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.301382 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztmb2\" (UniqueName: \"kubernetes.io/projected/9d6609b2-5156-4d39-b4fd-05cb39b98915-kube-api-access-ztmb2\") pod \"cinder-api-0\" (UID: \"9d6609b2-5156-4d39-b4fd-05cb39b98915\") " pod="openstack/cinder-api-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.301406 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d6609b2-5156-4d39-b4fd-05cb39b98915-logs\") pod \"cinder-api-0\" (UID: \"9d6609b2-5156-4d39-b4fd-05cb39b98915\") " pod="openstack/cinder-api-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.301499 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zbrl\" (UniqueName: 
\"kubernetes.io/projected/a37d925e-cb07-4949-b184-eb778ffb662f-kube-api-access-9zbrl\") pod \"ceilometer-0\" (UID: \"a37d925e-cb07-4949-b184-eb778ffb662f\") " pod="openstack/ceilometer-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.301559 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d6609b2-5156-4d39-b4fd-05cb39b98915-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9d6609b2-5156-4d39-b4fd-05cb39b98915\") " pod="openstack/cinder-api-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.301599 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d6609b2-5156-4d39-b4fd-05cb39b98915-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9d6609b2-5156-4d39-b4fd-05cb39b98915\") " pod="openstack/cinder-api-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.301672 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a37d925e-cb07-4949-b184-eb778ffb662f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a37d925e-cb07-4949-b184-eb778ffb662f\") " pod="openstack/ceilometer-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.301705 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a37d925e-cb07-4949-b184-eb778ffb662f-run-httpd\") pod \"ceilometer-0\" (UID: \"a37d925e-cb07-4949-b184-eb778ffb662f\") " pod="openstack/ceilometer-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.301727 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9d6609b2-5156-4d39-b4fd-05cb39b98915-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9d6609b2-5156-4d39-b4fd-05cb39b98915\") " pod="openstack/cinder-api-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.301746 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9d6609b2-5156-4d39-b4fd-05cb39b98915-config-data-custom\") pod \"cinder-api-0\" (UID: \"9d6609b2-5156-4d39-b4fd-05cb39b98915\") " pod="openstack/cinder-api-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.301769 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a37d925e-cb07-4949-b184-eb778ffb662f-scripts\") pod \"ceilometer-0\" (UID: \"a37d925e-cb07-4949-b184-eb778ffb662f\") " pod="openstack/ceilometer-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.301818 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a37d925e-cb07-4949-b184-eb778ffb662f-log-httpd\") pod \"ceilometer-0\" (UID: \"a37d925e-cb07-4949-b184-eb778ffb662f\") " pod="openstack/ceilometer-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.301926 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d6609b2-5156-4d39-b4fd-05cb39b98915-config-data\") pod \"cinder-api-0\" (UID: \"9d6609b2-5156-4d39-b4fd-05cb39b98915\") " pod="openstack/cinder-api-0" Nov 27 
17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.302109 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a37d925e-cb07-4949-b184-eb778ffb662f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a37d925e-cb07-4949-b184-eb778ffb662f\") " pod="openstack/ceilometer-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.315878 4954 scope.go:117] "RemoveContainer" containerID="2967004ba4d0c484dac64a4095d2441b118048a9ca2019e6d66ce83a98affb2c" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.335160 4954 scope.go:117] "RemoveContainer" containerID="9deadd7a98d574b0e019378666104646d593d798502265f0e2fadceff0865304" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.360639 4954 scope.go:117] "RemoveContainer" containerID="6702c382089b2a2cf18100017564f33166df6fdf6628b9efdd555c3c01b55214" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.403938 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d6609b2-5156-4d39-b4fd-05cb39b98915-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9d6609b2-5156-4d39-b4fd-05cb39b98915\") " pod="openstack/cinder-api-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.404041 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d6609b2-5156-4d39-b4fd-05cb39b98915-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9d6609b2-5156-4d39-b4fd-05cb39b98915\") " pod="openstack/cinder-api-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.404119 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a37d925e-cb07-4949-b184-eb778ffb662f-run-httpd\") pod \"ceilometer-0\" (UID: \"a37d925e-cb07-4949-b184-eb778ffb662f\") " pod="openstack/ceilometer-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.404138 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a37d925e-cb07-4949-b184-eb778ffb662f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a37d925e-cb07-4949-b184-eb778ffb662f\") " pod="openstack/ceilometer-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.404158 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9d6609b2-5156-4d39-b4fd-05cb39b98915-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9d6609b2-5156-4d39-b4fd-05cb39b98915\") " pod="openstack/cinder-api-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.404175 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9d6609b2-5156-4d39-b4fd-05cb39b98915-config-data-custom\") pod \"cinder-api-0\" (UID: \"9d6609b2-5156-4d39-b4fd-05cb39b98915\") " pod="openstack/cinder-api-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.404197 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a37d925e-cb07-4949-b184-eb778ffb662f-scripts\") pod \"ceilometer-0\" (UID: \"a37d925e-cb07-4949-b184-eb778ffb662f\") " pod="openstack/ceilometer-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.404232 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/a37d925e-cb07-4949-b184-eb778ffb662f-log-httpd\") pod \"ceilometer-0\" (UID: \"a37d925e-cb07-4949-b184-eb778ffb662f\") " pod="openstack/ceilometer-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.404257 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d6609b2-5156-4d39-b4fd-05cb39b98915-config-data\") pod \"cinder-api-0\" (UID: \"9d6609b2-5156-4d39-b4fd-05cb39b98915\") " pod="openstack/cinder-api-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.404283 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a37d925e-cb07-4949-b184-eb778ffb662f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a37d925e-cb07-4949-b184-eb778ffb662f\") " pod="openstack/ceilometer-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.404315 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d6609b2-5156-4d39-b4fd-05cb39b98915-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"9d6609b2-5156-4d39-b4fd-05cb39b98915\") " pod="openstack/cinder-api-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.404335 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d6609b2-5156-4d39-b4fd-05cb39b98915-scripts\") pod \"cinder-api-0\" (UID: \"9d6609b2-5156-4d39-b4fd-05cb39b98915\") " pod="openstack/cinder-api-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.404371 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztmb2\" (UniqueName: \"kubernetes.io/projected/9d6609b2-5156-4d39-b4fd-05cb39b98915-kube-api-access-ztmb2\") pod \"cinder-api-0\" (UID: \"9d6609b2-5156-4d39-b4fd-05cb39b98915\") " pod="openstack/cinder-api-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.404389 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a37d925e-cb07-4949-b184-eb778ffb662f-config-data\") pod \"ceilometer-0\" (UID: \"a37d925e-cb07-4949-b184-eb778ffb662f\") " pod="openstack/ceilometer-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.404404 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d6609b2-5156-4d39-b4fd-05cb39b98915-logs\") pod \"cinder-api-0\" (UID: \"9d6609b2-5156-4d39-b4fd-05cb39b98915\") " pod="openstack/cinder-api-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.404428 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zbrl\" (UniqueName: \"kubernetes.io/projected/a37d925e-cb07-4949-b184-eb778ffb662f-kube-api-access-9zbrl\") pod \"ceilometer-0\" (UID: \"a37d925e-cb07-4949-b184-eb778ffb662f\") " pod="openstack/ceilometer-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.404753 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9d6609b2-5156-4d39-b4fd-05cb39b98915-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9d6609b2-5156-4d39-b4fd-05cb39b98915\") " pod="openstack/cinder-api-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.405156 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/a37d925e-cb07-4949-b184-eb778ffb662f-run-httpd\") pod \"ceilometer-0\" (UID: \"a37d925e-cb07-4949-b184-eb778ffb662f\") " pod="openstack/ceilometer-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.405573 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d6609b2-5156-4d39-b4fd-05cb39b98915-logs\") pod \"cinder-api-0\" (UID: \"9d6609b2-5156-4d39-b4fd-05cb39b98915\") " pod="openstack/cinder-api-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.405883 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a37d925e-cb07-4949-b184-eb778ffb662f-log-httpd\") pod \"ceilometer-0\" (UID: \"a37d925e-cb07-4949-b184-eb778ffb662f\") " pod="openstack/ceilometer-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.408187 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d6609b2-5156-4d39-b4fd-05cb39b98915-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9d6609b2-5156-4d39-b4fd-05cb39b98915\") " pod="openstack/cinder-api-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.408827 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d6609b2-5156-4d39-b4fd-05cb39b98915-scripts\") pod \"cinder-api-0\" (UID: \"9d6609b2-5156-4d39-b4fd-05cb39b98915\") " pod="openstack/cinder-api-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.408973 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d6609b2-5156-4d39-b4fd-05cb39b98915-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9d6609b2-5156-4d39-b4fd-05cb39b98915\") " pod="openstack/cinder-api-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.410131 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a37d925e-cb07-4949-b184-eb778ffb662f-scripts\") pod \"ceilometer-0\" (UID: \"a37d925e-cb07-4949-b184-eb778ffb662f\") " pod="openstack/ceilometer-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.410928 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9d6609b2-5156-4d39-b4fd-05cb39b98915-config-data-custom\") pod \"cinder-api-0\" (UID: \"9d6609b2-5156-4d39-b4fd-05cb39b98915\") " pod="openstack/cinder-api-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.412140 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d6609b2-5156-4d39-b4fd-05cb39b98915-config-data\") pod \"cinder-api-0\" (UID: \"9d6609b2-5156-4d39-b4fd-05cb39b98915\") " pod="openstack/cinder-api-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.414824 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a37d925e-cb07-4949-b184-eb778ffb662f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a37d925e-cb07-4949-b184-eb778ffb662f\") " pod="openstack/ceilometer-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.415359 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a37d925e-cb07-4949-b184-eb778ffb662f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"a37d925e-cb07-4949-b184-eb778ffb662f\") " pod="openstack/ceilometer-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.415848 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d6609b2-5156-4d39-b4fd-05cb39b98915-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"9d6609b2-5156-4d39-b4fd-05cb39b98915\") " pod="openstack/cinder-api-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.416189 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a37d925e-cb07-4949-b184-eb778ffb662f-config-data\") pod \"ceilometer-0\" (UID: \"a37d925e-cb07-4949-b184-eb778ffb662f\") " pod="openstack/ceilometer-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.426882 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztmb2\" (UniqueName: \"kubernetes.io/projected/9d6609b2-5156-4d39-b4fd-05cb39b98915-kube-api-access-ztmb2\") pod \"cinder-api-0\" (UID: \"9d6609b2-5156-4d39-b4fd-05cb39b98915\") " pod="openstack/cinder-api-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.433970 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zbrl\" (UniqueName: \"kubernetes.io/projected/a37d925e-cb07-4949-b184-eb778ffb662f-kube-api-access-9zbrl\") pod \"ceilometer-0\" (UID: \"a37d925e-cb07-4949-b184-eb778ffb662f\") " pod="openstack/ceilometer-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.482952 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-d8d4694bd-z9zk4" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.492263 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-d8d4694bd-z9zk4" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.555063 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 27 17:00:45 crc kubenswrapper[4954]: I1127 17:00:45.563656 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 27 17:00:46 crc kubenswrapper[4954]: I1127 17:00:46.094845 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-85cf58799f-l72lc" Nov 27 17:00:46 crc kubenswrapper[4954]: I1127 17:00:46.179651 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 27 17:00:46 crc kubenswrapper[4954]: W1127 17:00:46.182414 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda37d925e_cb07_4949_b184_eb778ffb662f.slice/crio-1bb2c3f5b54e624a3060bff22521f11d56f987ae6b411dc83d57d077ec813c23 WatchSource:0}: Error finding container 1bb2c3f5b54e624a3060bff22521f11d56f987ae6b411dc83d57d077ec813c23: Status 404 returned error can't find the container with id 1bb2c3f5b54e624a3060bff22521f11d56f987ae6b411dc83d57d077ec813c23 Nov 27 17:00:46 crc kubenswrapper[4954]: W1127 17:00:46.259354 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d6609b2_5156_4d39_b4fd_05cb39b98915.slice/crio-91f2d5f6687a8787f59d9b0fa5fbeb4c21fdef12036d625c3e224adac1eb984e WatchSource:0}: Error finding container 91f2d5f6687a8787f59d9b0fa5fbeb4c21fdef12036d625c3e224adac1eb984e: Status 404 returned error can't find the container with id 91f2d5f6687a8787f59d9b0fa5fbeb4c21fdef12036d625c3e224adac1eb984e Nov 27 17:00:46 crc kubenswrapper[4954]: I1127 17:00:46.261181 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 27 17:00:46 crc kubenswrapper[4954]: I1127 17:00:46.684251 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17240c1b-4f70-4182-9b68-dac293e719ef" path="/var/lib/kubelet/pods/17240c1b-4f70-4182-9b68-dac293e719ef/volumes" Nov 27 17:00:46 crc kubenswrapper[4954]: I1127 17:00:46.685399 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="528d738e-43f9-4b32-be5a-b557c9d94d63" path="/var/lib/kubelet/pods/528d738e-43f9-4b32-be5a-b557c9d94d63/volumes" Nov 27 17:00:47 crc kubenswrapper[4954]: I1127 17:00:47.098330 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a37d925e-cb07-4949-b184-eb778ffb662f","Type":"ContainerStarted","Data":"1bb2c3f5b54e624a3060bff22521f11d56f987ae6b411dc83d57d077ec813c23"} Nov 27 17:00:47 crc kubenswrapper[4954]: I1127 17:00:47.100825 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9d6609b2-5156-4d39-b4fd-05cb39b98915","Type":"ContainerStarted","Data":"91f2d5f6687a8787f59d9b0fa5fbeb4c21fdef12036d625c3e224adac1eb984e"} Nov 27 17:00:48 crc kubenswrapper[4954]: I1127 17:00:48.103028 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6549c6cdd4-szxmh" Nov 27 17:00:48 crc kubenswrapper[4954]: I1127 17:00:48.120537 4954 generic.go:334] "Generic (PLEG): container finished" podID="8a9e455d-383c-460b-897e-2234c0611a83" containerID="4b3a9c94ec8c6148f1f0656db217d02cc6a5f9806343ef93871772f9909f3226" exitCode=137 Nov 27 17:00:48 crc kubenswrapper[4954]: I1127 17:00:48.120622 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6549c6cdd4-szxmh" event={"ID":"8a9e455d-383c-460b-897e-2234c0611a83","Type":"ContainerDied","Data":"4b3a9c94ec8c6148f1f0656db217d02cc6a5f9806343ef93871772f9909f3226"} Nov 27 17:00:48 crc kubenswrapper[4954]: I1127 17:00:48.120704 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6549c6cdd4-szxmh" event={"ID":"8a9e455d-383c-460b-897e-2234c0611a83","Type":"ContainerDied","Data":"486d8412009a68bdc35ae95e26ee40d6f77d4a5c03e7e5b470ef0632abe3bea0"} Nov 27 17:00:48 crc kubenswrapper[4954]: I1127 17:00:48.120746 4954 scope.go:117] "RemoveContainer" containerID="51fc083b73e2dbbfc048368e65a84834c859cc6a3b10dd95d2a2cc01a0184dbe" Nov 27 17:00:48 crc kubenswrapper[4954]: I1127 17:00:48.120654 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6549c6cdd4-szxmh" Nov 27 17:00:48 crc kubenswrapper[4954]: I1127 17:00:48.128204 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9d6609b2-5156-4d39-b4fd-05cb39b98915","Type":"ContainerStarted","Data":"1451bc48b8f8883fffaba60fa34f02d1a77eb82113f03e095b619bad220e55fd"} Nov 27 17:00:48 crc kubenswrapper[4954]: I1127 17:00:48.159819 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a9e455d-383c-460b-897e-2234c0611a83-combined-ca-bundle\") pod \"8a9e455d-383c-460b-897e-2234c0611a83\" (UID: \"8a9e455d-383c-460b-897e-2234c0611a83\") " Nov 27 17:00:48 crc kubenswrapper[4954]: I1127 17:00:48.159897 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a9e455d-383c-460b-897e-2234c0611a83-logs\") pod \"8a9e455d-383c-460b-897e-2234c0611a83\" (UID: \"8a9e455d-383c-460b-897e-2234c0611a83\") " Nov 27 17:00:48 crc kubenswrapper[4954]: I1127 17:00:48.159918 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8a9e455d-383c-460b-897e-2234c0611a83-horizon-secret-key\") pod \"8a9e455d-383c-460b-897e-2234c0611a83\" (UID: \"8a9e455d-383c-460b-897e-2234c0611a83\") " Nov 27 17:00:48 crc kubenswrapper[4954]: I1127 17:00:48.160009 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a9e455d-383c-460b-897e-2234c0611a83-horizon-tls-certs\") pod \"8a9e455d-383c-460b-897e-2234c0611a83\" (UID: \"8a9e455d-383c-460b-897e-2234c0611a83\") " Nov 27 17:00:48 crc kubenswrapper[4954]: I1127 17:00:48.160046 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8a9e455d-383c-460b-897e-2234c0611a83-config-data\") pod \"8a9e455d-383c-460b-897e-2234c0611a83\" (UID: \"8a9e455d-383c-460b-897e-2234c0611a83\") " Nov 27 17:00:48 crc kubenswrapper[4954]: I1127 17:00:48.160160 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhgz8\" (UniqueName: 
\"kubernetes.io/projected/8a9e455d-383c-460b-897e-2234c0611a83-kube-api-access-lhgz8\") pod \"8a9e455d-383c-460b-897e-2234c0611a83\" (UID: \"8a9e455d-383c-460b-897e-2234c0611a83\") " Nov 27 17:00:48 crc kubenswrapper[4954]: I1127 17:00:48.160254 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a9e455d-383c-460b-897e-2234c0611a83-scripts\") pod \"8a9e455d-383c-460b-897e-2234c0611a83\" (UID: \"8a9e455d-383c-460b-897e-2234c0611a83\") " Nov 27 17:00:48 crc kubenswrapper[4954]: I1127 17:00:48.161757 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a9e455d-383c-460b-897e-2234c0611a83-logs" (OuterVolumeSpecName: "logs") pod "8a9e455d-383c-460b-897e-2234c0611a83" (UID: "8a9e455d-383c-460b-897e-2234c0611a83"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:00:48 crc kubenswrapper[4954]: I1127 17:00:48.166915 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a9e455d-383c-460b-897e-2234c0611a83-kube-api-access-lhgz8" (OuterVolumeSpecName: "kube-api-access-lhgz8") pod "8a9e455d-383c-460b-897e-2234c0611a83" (UID: "8a9e455d-383c-460b-897e-2234c0611a83"). InnerVolumeSpecName "kube-api-access-lhgz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:00:48 crc kubenswrapper[4954]: I1127 17:00:48.176736 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a9e455d-383c-460b-897e-2234c0611a83-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "8a9e455d-383c-460b-897e-2234c0611a83" (UID: "8a9e455d-383c-460b-897e-2234c0611a83"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:00:48 crc kubenswrapper[4954]: I1127 17:00:48.199398 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a9e455d-383c-460b-897e-2234c0611a83-scripts" (OuterVolumeSpecName: "scripts") pod "8a9e455d-383c-460b-897e-2234c0611a83" (UID: "8a9e455d-383c-460b-897e-2234c0611a83"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:00:48 crc kubenswrapper[4954]: I1127 17:00:48.200744 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a9e455d-383c-460b-897e-2234c0611a83-config-data" (OuterVolumeSpecName: "config-data") pod "8a9e455d-383c-460b-897e-2234c0611a83" (UID: "8a9e455d-383c-460b-897e-2234c0611a83"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:00:48 crc kubenswrapper[4954]: I1127 17:00:48.201796 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a9e455d-383c-460b-897e-2234c0611a83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a9e455d-383c-460b-897e-2234c0611a83" (UID: "8a9e455d-383c-460b-897e-2234c0611a83"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:00:48 crc kubenswrapper[4954]: I1127 17:00:48.232439 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a9e455d-383c-460b-897e-2234c0611a83-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "8a9e455d-383c-460b-897e-2234c0611a83" (UID: "8a9e455d-383c-460b-897e-2234c0611a83"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:00:48 crc kubenswrapper[4954]: I1127 17:00:48.263531 4954 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a9e455d-383c-460b-897e-2234c0611a83-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:48 crc kubenswrapper[4954]: I1127 17:00:48.263596 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8a9e455d-383c-460b-897e-2234c0611a83-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:48 crc kubenswrapper[4954]: I1127 17:00:48.263608 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhgz8\" (UniqueName: \"kubernetes.io/projected/8a9e455d-383c-460b-897e-2234c0611a83-kube-api-access-lhgz8\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:48 crc kubenswrapper[4954]: I1127 17:00:48.263621 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a9e455d-383c-460b-897e-2234c0611a83-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:48 crc kubenswrapper[4954]: I1127 17:00:48.263632 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a9e455d-383c-460b-897e-2234c0611a83-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:48 crc kubenswrapper[4954]: I1127 17:00:48.263643 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a9e455d-383c-460b-897e-2234c0611a83-logs\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:48 crc kubenswrapper[4954]: I1127 17:00:48.263655 4954 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8a9e455d-383c-460b-897e-2234c0611a83-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:48 crc kubenswrapper[4954]: I1127 17:00:48.452455 4954 scope.go:117] "RemoveContainer" containerID="4b3a9c94ec8c6148f1f0656db217d02cc6a5f9806343ef93871772f9909f3226" Nov 27 17:00:48 crc kubenswrapper[4954]: I1127 17:00:48.531712 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6549c6cdd4-szxmh"] Nov 27 17:00:48 crc kubenswrapper[4954]: I1127 17:00:48.542801 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6549c6cdd4-szxmh"] Nov 27 17:00:48 crc kubenswrapper[4954]: I1127 17:00:48.676247 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a9e455d-383c-460b-897e-2234c0611a83" path="/var/lib/kubelet/pods/8a9e455d-383c-460b-897e-2234c0611a83/volumes" Nov 27 17:00:49 crc kubenswrapper[4954]: I1127 17:00:49.076490 4954 scope.go:117] "RemoveContainer" containerID="51fc083b73e2dbbfc048368e65a84834c859cc6a3b10dd95d2a2cc01a0184dbe" Nov 27 17:00:49 crc kubenswrapper[4954]: E1127 17:00:49.077572 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51fc083b73e2dbbfc048368e65a84834c859cc6a3b10dd95d2a2cc01a0184dbe\": container with ID starting with 51fc083b73e2dbbfc048368e65a84834c859cc6a3b10dd95d2a2cc01a0184dbe not found: ID does not exist" containerID="51fc083b73e2dbbfc048368e65a84834c859cc6a3b10dd95d2a2cc01a0184dbe" Nov 27 17:00:49 crc kubenswrapper[4954]: I1127 17:00:49.077659 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51fc083b73e2dbbfc048368e65a84834c859cc6a3b10dd95d2a2cc01a0184dbe"} err="failed to 
get container status \"51fc083b73e2dbbfc048368e65a84834c859cc6a3b10dd95d2a2cc01a0184dbe\": rpc error: code = NotFound desc = could not find container \"51fc083b73e2dbbfc048368e65a84834c859cc6a3b10dd95d2a2cc01a0184dbe\": container with ID starting with 51fc083b73e2dbbfc048368e65a84834c859cc6a3b10dd95d2a2cc01a0184dbe not found: ID does not exist" Nov 27 17:00:49 crc kubenswrapper[4954]: I1127 17:00:49.077703 4954 scope.go:117] "RemoveContainer" containerID="4b3a9c94ec8c6148f1f0656db217d02cc6a5f9806343ef93871772f9909f3226" Nov 27 17:00:49 crc kubenswrapper[4954]: E1127 17:00:49.079305 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b3a9c94ec8c6148f1f0656db217d02cc6a5f9806343ef93871772f9909f3226\": container with ID starting with 4b3a9c94ec8c6148f1f0656db217d02cc6a5f9806343ef93871772f9909f3226 not found: ID does not exist" containerID="4b3a9c94ec8c6148f1f0656db217d02cc6a5f9806343ef93871772f9909f3226" Nov 27 17:00:49 crc kubenswrapper[4954]: I1127 17:00:49.079363 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b3a9c94ec8c6148f1f0656db217d02cc6a5f9806343ef93871772f9909f3226"} err="failed to get container status \"4b3a9c94ec8c6148f1f0656db217d02cc6a5f9806343ef93871772f9909f3226\": rpc error: code = NotFound desc = could not find container \"4b3a9c94ec8c6148f1f0656db217d02cc6a5f9806343ef93871772f9909f3226\": container with ID starting with 4b3a9c94ec8c6148f1f0656db217d02cc6a5f9806343ef93871772f9909f3226 not found: ID does not exist" Nov 27 17:00:49 crc kubenswrapper[4954]: I1127 17:00:49.157178 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9d6609b2-5156-4d39-b4fd-05cb39b98915","Type":"ContainerStarted","Data":"5d0da6224e35527ea9308487879a073ef551267a816be4d6687bd3f4ae44e810"} Nov 27 17:00:49 crc kubenswrapper[4954]: I1127 17:00:49.161795 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Nov 27 17:00:49 crc kubenswrapper[4954]: I1127 17:00:49.202687 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.202657276 podStartE2EDuration="4.202657276s" podCreationTimestamp="2025-11-27 17:00:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 17:00:49.187163929 +0000 UTC m=+1361.204604259" watchObservedRunningTime="2025-11-27 17:00:49.202657276 +0000 UTC m=+1361.220097576" Nov 27 17:00:50 crc kubenswrapper[4954]: I1127 17:00:50.169869 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a37d925e-cb07-4949-b184-eb778ffb662f","Type":"ContainerStarted","Data":"d5ed237b8c4d92b060bd2e3da6867546776c2bf9dc62aa1191f8538b971c66f0"} Nov 27 17:00:51 crc kubenswrapper[4954]: I1127 17:00:51.182725 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a37d925e-cb07-4949-b184-eb778ffb662f","Type":"ContainerStarted","Data":"6953f4b86194ad91176128d46312e54852405a783db3c45d77c8b016cc223764"} Nov 27 17:00:52 crc kubenswrapper[4954]: I1127 17:00:52.013120 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-85cf58799f-l72lc" Nov 27 17:00:52 crc kubenswrapper[4954]: I1127 17:00:52.195504 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"a37d925e-cb07-4949-b184-eb778ffb662f","Type":"ContainerStarted","Data":"ca6de7a5c11ede7a9be5ee0264e071fd4812a10b1bc0979f5978455579672204"} Nov 27 17:00:52 crc kubenswrapper[4954]: I1127 17:00:52.392242 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 27 17:00:57 crc kubenswrapper[4954]: I1127 17:00:57.465649 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Nov 27 17:00:58 crc kubenswrapper[4954]: I1127 17:00:58.262131 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a37d925e-cb07-4949-b184-eb778ffb662f","Type":"ContainerStarted","Data":"ee5862dea77d36bbea6640dc3def534c730933dc9c6db3051d75128d3f3ed744"} Nov 27 17:00:58 crc kubenswrapper[4954]: I1127 17:00:58.262311 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a37d925e-cb07-4949-b184-eb778ffb662f" containerName="ceilometer-central-agent" containerID="cri-o://d5ed237b8c4d92b060bd2e3da6867546776c2bf9dc62aa1191f8538b971c66f0" gracePeriod=30 Nov 27 17:00:58 crc kubenswrapper[4954]: I1127 17:00:58.262415 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a37d925e-cb07-4949-b184-eb778ffb662f" containerName="ceilometer-notification-agent" containerID="cri-o://6953f4b86194ad91176128d46312e54852405a783db3c45d77c8b016cc223764" gracePeriod=30 Nov 27 17:00:58 crc kubenswrapper[4954]: I1127 17:00:58.262467 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a37d925e-cb07-4949-b184-eb778ffb662f" containerName="sg-core" containerID="cri-o://ca6de7a5c11ede7a9be5ee0264e071fd4812a10b1bc0979f5978455579672204" gracePeriod=30 Nov 27 17:00:58 crc kubenswrapper[4954]: I1127 17:00:58.262576 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 27 17:00:58 crc kubenswrapper[4954]: I1127 17:00:58.262694 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a37d925e-cb07-4949-b184-eb778ffb662f" containerName="proxy-httpd" containerID="cri-o://ee5862dea77d36bbea6640dc3def534c730933dc9c6db3051d75128d3f3ed744" gracePeriod=30 Nov 27 17:00:58 crc kubenswrapper[4954]: I1127 17:00:58.295607 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.627303913 podStartE2EDuration="13.295551286s" podCreationTimestamp="2025-11-27 17:00:45 +0000 UTC" firstStartedPulling="2025-11-27 17:00:46.185079427 +0000 UTC m=+1358.202519727" lastFinishedPulling="2025-11-27 17:00:54.8533268 +0000 UTC m=+1366.870767100" observedRunningTime="2025-11-27 17:00:58.2854576 +0000 UTC m=+1370.302897910" watchObservedRunningTime="2025-11-27 17:00:58.295551286 +0000 UTC m=+1370.312991576" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.270926 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.275175 4954 generic.go:334] "Generic (PLEG): container finished" podID="a37d925e-cb07-4949-b184-eb778ffb662f" containerID="ee5862dea77d36bbea6640dc3def534c730933dc9c6db3051d75128d3f3ed744" exitCode=0 Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.275213 4954 generic.go:334] "Generic (PLEG): container finished" podID="a37d925e-cb07-4949-b184-eb778ffb662f" containerID="ca6de7a5c11ede7a9be5ee0264e071fd4812a10b1bc0979f5978455579672204" exitCode=2 Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.275223 4954 generic.go:334] "Generic (PLEG): container finished" podID="a37d925e-cb07-4949-b184-eb778ffb662f" containerID="6953f4b86194ad91176128d46312e54852405a783db3c45d77c8b016cc223764" exitCode=0 Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.275233 4954 generic.go:334] "Generic (PLEG): container finished" podID="a37d925e-cb07-4949-b184-eb778ffb662f" containerID="d5ed237b8c4d92b060bd2e3da6867546776c2bf9dc62aa1191f8538b971c66f0" exitCode=0 Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.275261 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a37d925e-cb07-4949-b184-eb778ffb662f","Type":"ContainerDied","Data":"ee5862dea77d36bbea6640dc3def534c730933dc9c6db3051d75128d3f3ed744"} Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.275327 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a37d925e-cb07-4949-b184-eb778ffb662f","Type":"ContainerDied","Data":"ca6de7a5c11ede7a9be5ee0264e071fd4812a10b1bc0979f5978455579672204"} Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.275344 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a37d925e-cb07-4949-b184-eb778ffb662f","Type":"ContainerDied","Data":"6953f4b86194ad91176128d46312e54852405a783db3c45d77c8b016cc223764"} Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.275354 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a37d925e-cb07-4949-b184-eb778ffb662f","Type":"ContainerDied","Data":"d5ed237b8c4d92b060bd2e3da6867546776c2bf9dc62aa1191f8538b971c66f0"} Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.275363 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a37d925e-cb07-4949-b184-eb778ffb662f","Type":"ContainerDied","Data":"1bb2c3f5b54e624a3060bff22521f11d56f987ae6b411dc83d57d077ec813c23"} Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.275382 4954 scope.go:117] "RemoveContainer" containerID="ee5862dea77d36bbea6640dc3def534c730933dc9c6db3051d75128d3f3ed744" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.306107 4954 scope.go:117] "RemoveContainer" containerID="ca6de7a5c11ede7a9be5ee0264e071fd4812a10b1bc0979f5978455579672204" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.327764 4954 scope.go:117] "RemoveContainer" containerID="6953f4b86194ad91176128d46312e54852405a783db3c45d77c8b016cc223764" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.349949 4954 scope.go:117] "RemoveContainer" containerID="d5ed237b8c4d92b060bd2e3da6867546776c2bf9dc62aa1191f8538b971c66f0" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.376864 4954 scope.go:117] "RemoveContainer" containerID="ee5862dea77d36bbea6640dc3def534c730933dc9c6db3051d75128d3f3ed744" Nov 27 17:00:59 crc kubenswrapper[4954]: E1127 17:00:59.391757 4954 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee5862dea77d36bbea6640dc3def534c730933dc9c6db3051d75128d3f3ed744\": container with ID starting with ee5862dea77d36bbea6640dc3def534c730933dc9c6db3051d75128d3f3ed744 not found: ID does not exist" containerID="ee5862dea77d36bbea6640dc3def534c730933dc9c6db3051d75128d3f3ed744" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.391802 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee5862dea77d36bbea6640dc3def534c730933dc9c6db3051d75128d3f3ed744"} err="failed to get container status \"ee5862dea77d36bbea6640dc3def534c730933dc9c6db3051d75128d3f3ed744\": rpc error: code = NotFound desc = could not find container \"ee5862dea77d36bbea6640dc3def534c730933dc9c6db3051d75128d3f3ed744\": container with ID starting with ee5862dea77d36bbea6640dc3def534c730933dc9c6db3051d75128d3f3ed744 not found: ID does not exist" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.391832 4954 scope.go:117] "RemoveContainer" containerID="ca6de7a5c11ede7a9be5ee0264e071fd4812a10b1bc0979f5978455579672204" Nov 27 17:00:59 crc kubenswrapper[4954]: E1127 17:00:59.392100 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca6de7a5c11ede7a9be5ee0264e071fd4812a10b1bc0979f5978455579672204\": container with ID starting with ca6de7a5c11ede7a9be5ee0264e071fd4812a10b1bc0979f5978455579672204 not found: ID does not exist" containerID="ca6de7a5c11ede7a9be5ee0264e071fd4812a10b1bc0979f5978455579672204" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.392115 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca6de7a5c11ede7a9be5ee0264e071fd4812a10b1bc0979f5978455579672204"} err="failed to get container status \"ca6de7a5c11ede7a9be5ee0264e071fd4812a10b1bc0979f5978455579672204\": rpc error: code = NotFound desc = could not find container \"ca6de7a5c11ede7a9be5ee0264e071fd4812a10b1bc0979f5978455579672204\": container with ID starting with ca6de7a5c11ede7a9be5ee0264e071fd4812a10b1bc0979f5978455579672204 not found: ID does not exist" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.392128 4954 scope.go:117] "RemoveContainer" containerID="6953f4b86194ad91176128d46312e54852405a783db3c45d77c8b016cc223764" Nov 27 17:00:59 crc kubenswrapper[4954]: E1127 17:00:59.392335 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6953f4b86194ad91176128d46312e54852405a783db3c45d77c8b016cc223764\": container with ID starting with 6953f4b86194ad91176128d46312e54852405a783db3c45d77c8b016cc223764 not found: ID does not exist" containerID="6953f4b86194ad91176128d46312e54852405a783db3c45d77c8b016cc223764" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.392365 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6953f4b86194ad91176128d46312e54852405a783db3c45d77c8b016cc223764"} err="failed to get container status \"6953f4b86194ad91176128d46312e54852405a783db3c45d77c8b016cc223764\": rpc error: code = NotFound desc = could not find container \"6953f4b86194ad91176128d46312e54852405a783db3c45d77c8b016cc223764\": container with ID starting with 6953f4b86194ad91176128d46312e54852405a783db3c45d77c8b016cc223764 not found: ID does not exist" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.392379 4954 scope.go:117] "RemoveContainer" 
containerID="d5ed237b8c4d92b060bd2e3da6867546776c2bf9dc62aa1191f8538b971c66f0" Nov 27 17:00:59 crc kubenswrapper[4954]: E1127 17:00:59.392602 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5ed237b8c4d92b060bd2e3da6867546776c2bf9dc62aa1191f8538b971c66f0\": container with ID starting with d5ed237b8c4d92b060bd2e3da6867546776c2bf9dc62aa1191f8538b971c66f0 not found: ID does not exist" containerID="d5ed237b8c4d92b060bd2e3da6867546776c2bf9dc62aa1191f8538b971c66f0" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.392622 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5ed237b8c4d92b060bd2e3da6867546776c2bf9dc62aa1191f8538b971c66f0"} err="failed to get container status \"d5ed237b8c4d92b060bd2e3da6867546776c2bf9dc62aa1191f8538b971c66f0\": rpc error: code = NotFound desc = could not find container \"d5ed237b8c4d92b060bd2e3da6867546776c2bf9dc62aa1191f8538b971c66f0\": container with ID starting with d5ed237b8c4d92b060bd2e3da6867546776c2bf9dc62aa1191f8538b971c66f0 not found: ID does not exist" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.392634 4954 scope.go:117] "RemoveContainer" containerID="ee5862dea77d36bbea6640dc3def534c730933dc9c6db3051d75128d3f3ed744" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.392846 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee5862dea77d36bbea6640dc3def534c730933dc9c6db3051d75128d3f3ed744"} err="failed to get container status \"ee5862dea77d36bbea6640dc3def534c730933dc9c6db3051d75128d3f3ed744\": rpc error: code = NotFound desc = could not find container \"ee5862dea77d36bbea6640dc3def534c730933dc9c6db3051d75128d3f3ed744\": container with ID starting with ee5862dea77d36bbea6640dc3def534c730933dc9c6db3051d75128d3f3ed744 not found: ID does not exist" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.392867 4954 scope.go:117] "RemoveContainer" containerID="ca6de7a5c11ede7a9be5ee0264e071fd4812a10b1bc0979f5978455579672204" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.393068 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca6de7a5c11ede7a9be5ee0264e071fd4812a10b1bc0979f5978455579672204"} err="failed to get container status \"ca6de7a5c11ede7a9be5ee0264e071fd4812a10b1bc0979f5978455579672204\": rpc error: code = NotFound desc = could not find container \"ca6de7a5c11ede7a9be5ee0264e071fd4812a10b1bc0979f5978455579672204\": container with ID starting with ca6de7a5c11ede7a9be5ee0264e071fd4812a10b1bc0979f5978455579672204 not found: ID does not exist" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.393088 4954 scope.go:117] "RemoveContainer" containerID="6953f4b86194ad91176128d46312e54852405a783db3c45d77c8b016cc223764" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.393420 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6953f4b86194ad91176128d46312e54852405a783db3c45d77c8b016cc223764"} err="failed to get container status \"6953f4b86194ad91176128d46312e54852405a783db3c45d77c8b016cc223764\": rpc error: code = NotFound desc = could not find container \"6953f4b86194ad91176128d46312e54852405a783db3c45d77c8b016cc223764\": container with ID starting with 6953f4b86194ad91176128d46312e54852405a783db3c45d77c8b016cc223764 not found: ID does not exist" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.393480 4954 scope.go:117] "RemoveContainer" 
containerID="d5ed237b8c4d92b060bd2e3da6867546776c2bf9dc62aa1191f8538b971c66f0" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.393772 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a37d925e-cb07-4949-b184-eb778ffb662f-scripts\") pod \"a37d925e-cb07-4949-b184-eb778ffb662f\" (UID: \"a37d925e-cb07-4949-b184-eb778ffb662f\") " Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.393858 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zbrl\" (UniqueName: \"kubernetes.io/projected/a37d925e-cb07-4949-b184-eb778ffb662f-kube-api-access-9zbrl\") pod \"a37d925e-cb07-4949-b184-eb778ffb662f\" (UID: \"a37d925e-cb07-4949-b184-eb778ffb662f\") " Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.393967 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a37d925e-cb07-4949-b184-eb778ffb662f-log-httpd\") pod \"a37d925e-cb07-4949-b184-eb778ffb662f\" (UID: \"a37d925e-cb07-4949-b184-eb778ffb662f\") " Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.394023 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a37d925e-cb07-4949-b184-eb778ffb662f-config-data\") pod \"a37d925e-cb07-4949-b184-eb778ffb662f\" (UID: \"a37d925e-cb07-4949-b184-eb778ffb662f\") " Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.394082 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a37d925e-cb07-4949-b184-eb778ffb662f-combined-ca-bundle\") pod \"a37d925e-cb07-4949-b184-eb778ffb662f\" (UID: \"a37d925e-cb07-4949-b184-eb778ffb662f\") " Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.394119 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a37d925e-cb07-4949-b184-eb778ffb662f-run-httpd\") pod \"a37d925e-cb07-4949-b184-eb778ffb662f\" (UID: \"a37d925e-cb07-4949-b184-eb778ffb662f\") " Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.394143 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a37d925e-cb07-4949-b184-eb778ffb662f-sg-core-conf-yaml\") pod \"a37d925e-cb07-4949-b184-eb778ffb662f\" (UID: \"a37d925e-cb07-4949-b184-eb778ffb662f\") " Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.395225 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5ed237b8c4d92b060bd2e3da6867546776c2bf9dc62aa1191f8538b971c66f0"} err="failed to get container status \"d5ed237b8c4d92b060bd2e3da6867546776c2bf9dc62aa1191f8538b971c66f0\": rpc error: code = NotFound desc = could not find container \"d5ed237b8c4d92b060bd2e3da6867546776c2bf9dc62aa1191f8538b971c66f0\": container with ID starting with d5ed237b8c4d92b060bd2e3da6867546776c2bf9dc62aa1191f8538b971c66f0 not found: ID does not exist" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.395261 4954 scope.go:117] "RemoveContainer" containerID="ee5862dea77d36bbea6640dc3def534c730933dc9c6db3051d75128d3f3ed744" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.395310 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a37d925e-cb07-4949-b184-eb778ffb662f-log-httpd" (OuterVolumeSpecName: "log-httpd") 
pod "a37d925e-cb07-4949-b184-eb778ffb662f" (UID: "a37d925e-cb07-4949-b184-eb778ffb662f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.395336 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a37d925e-cb07-4949-b184-eb778ffb662f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a37d925e-cb07-4949-b184-eb778ffb662f" (UID: "a37d925e-cb07-4949-b184-eb778ffb662f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.395980 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee5862dea77d36bbea6640dc3def534c730933dc9c6db3051d75128d3f3ed744"} err="failed to get container status \"ee5862dea77d36bbea6640dc3def534c730933dc9c6db3051d75128d3f3ed744\": rpc error: code = NotFound desc = could not find container \"ee5862dea77d36bbea6640dc3def534c730933dc9c6db3051d75128d3f3ed744\": container with ID starting with ee5862dea77d36bbea6640dc3def534c730933dc9c6db3051d75128d3f3ed744 not found: ID does not exist" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.396021 4954 scope.go:117] "RemoveContainer" containerID="ca6de7a5c11ede7a9be5ee0264e071fd4812a10b1bc0979f5978455579672204" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.397542 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca6de7a5c11ede7a9be5ee0264e071fd4812a10b1bc0979f5978455579672204"} err="failed to get container status \"ca6de7a5c11ede7a9be5ee0264e071fd4812a10b1bc0979f5978455579672204\": rpc error: code = NotFound desc = could not find container \"ca6de7a5c11ede7a9be5ee0264e071fd4812a10b1bc0979f5978455579672204\": container with ID starting with ca6de7a5c11ede7a9be5ee0264e071fd4812a10b1bc0979f5978455579672204 not found: ID does not exist" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.397590 4954 scope.go:117] "RemoveContainer" containerID="6953f4b86194ad91176128d46312e54852405a783db3c45d77c8b016cc223764" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.399692 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6953f4b86194ad91176128d46312e54852405a783db3c45d77c8b016cc223764"} err="failed to get container status \"6953f4b86194ad91176128d46312e54852405a783db3c45d77c8b016cc223764\": rpc error: code = NotFound desc = could not find container \"6953f4b86194ad91176128d46312e54852405a783db3c45d77c8b016cc223764\": container with ID starting with 6953f4b86194ad91176128d46312e54852405a783db3c45d77c8b016cc223764 not found: ID does not exist" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.399746 4954 scope.go:117] "RemoveContainer" containerID="d5ed237b8c4d92b060bd2e3da6867546776c2bf9dc62aa1191f8538b971c66f0" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.400398 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a37d925e-cb07-4949-b184-eb778ffb662f-scripts" (OuterVolumeSpecName: "scripts") pod "a37d925e-cb07-4949-b184-eb778ffb662f" (UID: "a37d925e-cb07-4949-b184-eb778ffb662f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.400474 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a37d925e-cb07-4949-b184-eb778ffb662f-kube-api-access-9zbrl" (OuterVolumeSpecName: "kube-api-access-9zbrl") pod "a37d925e-cb07-4949-b184-eb778ffb662f" (UID: "a37d925e-cb07-4949-b184-eb778ffb662f"). InnerVolumeSpecName "kube-api-access-9zbrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.405241 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5ed237b8c4d92b060bd2e3da6867546776c2bf9dc62aa1191f8538b971c66f0"} err="failed to get container status \"d5ed237b8c4d92b060bd2e3da6867546776c2bf9dc62aa1191f8538b971c66f0\": rpc error: code = NotFound desc = could not find container \"d5ed237b8c4d92b060bd2e3da6867546776c2bf9dc62aa1191f8538b971c66f0\": container with ID starting with d5ed237b8c4d92b060bd2e3da6867546776c2bf9dc62aa1191f8538b971c66f0 not found: ID does not exist" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.405287 4954 scope.go:117] "RemoveContainer" containerID="ee5862dea77d36bbea6640dc3def534c730933dc9c6db3051d75128d3f3ed744" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.405763 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee5862dea77d36bbea6640dc3def534c730933dc9c6db3051d75128d3f3ed744"} err="failed to get container status \"ee5862dea77d36bbea6640dc3def534c730933dc9c6db3051d75128d3f3ed744\": rpc error: code = NotFound desc = could not find container \"ee5862dea77d36bbea6640dc3def534c730933dc9c6db3051d75128d3f3ed744\": container with ID starting with ee5862dea77d36bbea6640dc3def534c730933dc9c6db3051d75128d3f3ed744 not found: ID does not exist" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.405821 4954 scope.go:117] "RemoveContainer" containerID="ca6de7a5c11ede7a9be5ee0264e071fd4812a10b1bc0979f5978455579672204" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.408000 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca6de7a5c11ede7a9be5ee0264e071fd4812a10b1bc0979f5978455579672204"} err="failed to get container status \"ca6de7a5c11ede7a9be5ee0264e071fd4812a10b1bc0979f5978455579672204\": rpc error: code = NotFound desc = could not find container \"ca6de7a5c11ede7a9be5ee0264e071fd4812a10b1bc0979f5978455579672204\": container with ID starting with ca6de7a5c11ede7a9be5ee0264e071fd4812a10b1bc0979f5978455579672204 not found: ID does not exist" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.408040 4954 scope.go:117] "RemoveContainer" containerID="6953f4b86194ad91176128d46312e54852405a783db3c45d77c8b016cc223764" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.408356 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6953f4b86194ad91176128d46312e54852405a783db3c45d77c8b016cc223764"} err="failed to get container status \"6953f4b86194ad91176128d46312e54852405a783db3c45d77c8b016cc223764\": rpc error: code = NotFound desc = could not find container \"6953f4b86194ad91176128d46312e54852405a783db3c45d77c8b016cc223764\": container with ID starting with 6953f4b86194ad91176128d46312e54852405a783db3c45d77c8b016cc223764 not found: ID does not exist" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.408373 4954 scope.go:117] "RemoveContainer" 
containerID="d5ed237b8c4d92b060bd2e3da6867546776c2bf9dc62aa1191f8538b971c66f0" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.409851 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5ed237b8c4d92b060bd2e3da6867546776c2bf9dc62aa1191f8538b971c66f0"} err="failed to get container status \"d5ed237b8c4d92b060bd2e3da6867546776c2bf9dc62aa1191f8538b971c66f0\": rpc error: code = NotFound desc = could not find container \"d5ed237b8c4d92b060bd2e3da6867546776c2bf9dc62aa1191f8538b971c66f0\": container with ID starting with d5ed237b8c4d92b060bd2e3da6867546776c2bf9dc62aa1191f8538b971c66f0 not found: ID does not exist" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.432829 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a37d925e-cb07-4949-b184-eb778ffb662f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a37d925e-cb07-4949-b184-eb778ffb662f" (UID: "a37d925e-cb07-4949-b184-eb778ffb662f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.475048 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a37d925e-cb07-4949-b184-eb778ffb662f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a37d925e-cb07-4949-b184-eb778ffb662f" (UID: "a37d925e-cb07-4949-b184-eb778ffb662f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.503445 4954 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a37d925e-cb07-4949-b184-eb778ffb662f-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.503483 4954 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a37d925e-cb07-4949-b184-eb778ffb662f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.503496 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a37d925e-cb07-4949-b184-eb778ffb662f-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.503512 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zbrl\" (UniqueName: \"kubernetes.io/projected/a37d925e-cb07-4949-b184-eb778ffb662f-kube-api-access-9zbrl\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.503524 4954 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a37d925e-cb07-4949-b184-eb778ffb662f-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.503536 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a37d925e-cb07-4949-b184-eb778ffb662f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.512047 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a37d925e-cb07-4949-b184-eb778ffb662f-config-data" (OuterVolumeSpecName: "config-data") pod "a37d925e-cb07-4949-b184-eb778ffb662f" (UID: "a37d925e-cb07-4949-b184-eb778ffb662f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:00:59 crc kubenswrapper[4954]: I1127 17:00:59.605552 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a37d925e-cb07-4949-b184-eb778ffb662f-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.147365 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29404381-h5mc2"] Nov 27 17:01:00 crc kubenswrapper[4954]: E1127 17:01:00.149185 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a37d925e-cb07-4949-b184-eb778ffb662f" containerName="ceilometer-notification-agent" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.149210 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="a37d925e-cb07-4949-b184-eb778ffb662f" containerName="ceilometer-notification-agent" Nov 27 17:01:00 crc kubenswrapper[4954]: E1127 17:01:00.149220 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a9e455d-383c-460b-897e-2234c0611a83" containerName="horizon-log" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.149229 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a9e455d-383c-460b-897e-2234c0611a83" containerName="horizon-log" Nov 27 17:01:00 crc kubenswrapper[4954]: E1127 17:01:00.149259 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a37d925e-cb07-4949-b184-eb778ffb662f" containerName="ceilometer-central-agent" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.149268 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="a37d925e-cb07-4949-b184-eb778ffb662f" containerName="ceilometer-central-agent" Nov 27 17:01:00 crc kubenswrapper[4954]: E1127 17:01:00.149286 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a37d925e-cb07-4949-b184-eb778ffb662f" containerName="proxy-httpd" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.149294 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="a37d925e-cb07-4949-b184-eb778ffb662f" containerName="proxy-httpd" Nov 27 17:01:00 crc kubenswrapper[4954]: E1127 17:01:00.149316 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a9e455d-383c-460b-897e-2234c0611a83" containerName="horizon" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.149325 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a9e455d-383c-460b-897e-2234c0611a83" containerName="horizon" Nov 27 17:01:00 crc kubenswrapper[4954]: E1127 17:01:00.149333 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a37d925e-cb07-4949-b184-eb778ffb662f" containerName="sg-core" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.149341 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="a37d925e-cb07-4949-b184-eb778ffb662f" containerName="sg-core" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.149563 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="a37d925e-cb07-4949-b184-eb778ffb662f" containerName="ceilometer-central-agent" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.149600 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="a37d925e-cb07-4949-b184-eb778ffb662f" containerName="proxy-httpd" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.149616 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="a37d925e-cb07-4949-b184-eb778ffb662f" containerName="sg-core" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.149629 4954 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="8a9e455d-383c-460b-897e-2234c0611a83" containerName="horizon" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.149640 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="a37d925e-cb07-4949-b184-eb778ffb662f" containerName="ceilometer-notification-agent" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.149659 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a9e455d-383c-460b-897e-2234c0611a83" containerName="horizon-log" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.150460 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29404381-h5mc2" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.156836 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29404381-h5mc2"] Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.256660 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfcf7\" (UniqueName: \"kubernetes.io/projected/fa450761-82d0-4005-aee7-bcb56c03a5fd-kube-api-access-sfcf7\") pod \"keystone-cron-29404381-h5mc2\" (UID: \"fa450761-82d0-4005-aee7-bcb56c03a5fd\") " pod="openstack/keystone-cron-29404381-h5mc2" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.256752 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fa450761-82d0-4005-aee7-bcb56c03a5fd-fernet-keys\") pod \"keystone-cron-29404381-h5mc2\" (UID: \"fa450761-82d0-4005-aee7-bcb56c03a5fd\") " pod="openstack/keystone-cron-29404381-h5mc2" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.256775 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa450761-82d0-4005-aee7-bcb56c03a5fd-combined-ca-bundle\") pod \"keystone-cron-29404381-h5mc2\" (UID: \"fa450761-82d0-4005-aee7-bcb56c03a5fd\") " pod="openstack/keystone-cron-29404381-h5mc2" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.256807 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa450761-82d0-4005-aee7-bcb56c03a5fd-config-data\") pod \"keystone-cron-29404381-h5mc2\" (UID: \"fa450761-82d0-4005-aee7-bcb56c03a5fd\") " pod="openstack/keystone-cron-29404381-h5mc2" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.287132 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.328708 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.342178 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.353425 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.355677 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.364780 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.365269 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.365746 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/649d4e07-ae06-43ca-a97d-adaba927165c-config-data\") pod \"ceilometer-0\" (UID: \"649d4e07-ae06-43ca-a97d-adaba927165c\") " pod="openstack/ceilometer-0" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.365833 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fa450761-82d0-4005-aee7-bcb56c03a5fd-fernet-keys\") pod \"keystone-cron-29404381-h5mc2\" (UID: \"fa450761-82d0-4005-aee7-bcb56c03a5fd\") " pod="openstack/keystone-cron-29404381-h5mc2" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.365873 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa450761-82d0-4005-aee7-bcb56c03a5fd-combined-ca-bundle\") pod \"keystone-cron-29404381-h5mc2\" (UID: \"fa450761-82d0-4005-aee7-bcb56c03a5fd\") " pod="openstack/keystone-cron-29404381-h5mc2" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.365930 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa450761-82d0-4005-aee7-bcb56c03a5fd-config-data\") pod \"keystone-cron-29404381-h5mc2\" (UID: \"fa450761-82d0-4005-aee7-bcb56c03a5fd\") " pod="openstack/keystone-cron-29404381-h5mc2" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.366075 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/649d4e07-ae06-43ca-a97d-adaba927165c-run-httpd\") pod \"ceilometer-0\" (UID: \"649d4e07-ae06-43ca-a97d-adaba927165c\") " pod="openstack/ceilometer-0" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.366168 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f8fn\" (UniqueName: \"kubernetes.io/projected/649d4e07-ae06-43ca-a97d-adaba927165c-kube-api-access-6f8fn\") pod \"ceilometer-0\" (UID: \"649d4e07-ae06-43ca-a97d-adaba927165c\") " pod="openstack/ceilometer-0" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.366225 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/649d4e07-ae06-43ca-a97d-adaba927165c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"649d4e07-ae06-43ca-a97d-adaba927165c\") " pod="openstack/ceilometer-0" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.366285 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfcf7\" (UniqueName: \"kubernetes.io/projected/fa450761-82d0-4005-aee7-bcb56c03a5fd-kube-api-access-sfcf7\") pod \"keystone-cron-29404381-h5mc2\" (UID: \"fa450761-82d0-4005-aee7-bcb56c03a5fd\") " pod="openstack/keystone-cron-29404381-h5mc2" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.366331 4954 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/649d4e07-ae06-43ca-a97d-adaba927165c-scripts\") pod \"ceilometer-0\" (UID: \"649d4e07-ae06-43ca-a97d-adaba927165c\") " pod="openstack/ceilometer-0" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.366363 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/649d4e07-ae06-43ca-a97d-adaba927165c-log-httpd\") pod \"ceilometer-0\" (UID: \"649d4e07-ae06-43ca-a97d-adaba927165c\") " pod="openstack/ceilometer-0" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.366401 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/649d4e07-ae06-43ca-a97d-adaba927165c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"649d4e07-ae06-43ca-a97d-adaba927165c\") " pod="openstack/ceilometer-0" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.382299 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.382954 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa450761-82d0-4005-aee7-bcb56c03a5fd-config-data\") pod \"keystone-cron-29404381-h5mc2\" (UID: \"fa450761-82d0-4005-aee7-bcb56c03a5fd\") " pod="openstack/keystone-cron-29404381-h5mc2" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.382981 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa450761-82d0-4005-aee7-bcb56c03a5fd-combined-ca-bundle\") pod \"keystone-cron-29404381-h5mc2\" (UID: \"fa450761-82d0-4005-aee7-bcb56c03a5fd\") " pod="openstack/keystone-cron-29404381-h5mc2" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.390972 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fa450761-82d0-4005-aee7-bcb56c03a5fd-fernet-keys\") pod \"keystone-cron-29404381-h5mc2\" (UID: \"fa450761-82d0-4005-aee7-bcb56c03a5fd\") " pod="openstack/keystone-cron-29404381-h5mc2" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.395642 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfcf7\" (UniqueName: \"kubernetes.io/projected/fa450761-82d0-4005-aee7-bcb56c03a5fd-kube-api-access-sfcf7\") pod \"keystone-cron-29404381-h5mc2\" (UID: \"fa450761-82d0-4005-aee7-bcb56c03a5fd\") " pod="openstack/keystone-cron-29404381-h5mc2" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.468863 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/649d4e07-ae06-43ca-a97d-adaba927165c-run-httpd\") pod \"ceilometer-0\" (UID: \"649d4e07-ae06-43ca-a97d-adaba927165c\") " pod="openstack/ceilometer-0" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.468939 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f8fn\" (UniqueName: \"kubernetes.io/projected/649d4e07-ae06-43ca-a97d-adaba927165c-kube-api-access-6f8fn\") pod \"ceilometer-0\" (UID: \"649d4e07-ae06-43ca-a97d-adaba927165c\") " pod="openstack/ceilometer-0" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.468976 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/649d4e07-ae06-43ca-a97d-adaba927165c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"649d4e07-ae06-43ca-a97d-adaba927165c\") " pod="openstack/ceilometer-0" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.469011 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/649d4e07-ae06-43ca-a97d-adaba927165c-scripts\") pod \"ceilometer-0\" (UID: \"649d4e07-ae06-43ca-a97d-adaba927165c\") " pod="openstack/ceilometer-0" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.469033 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/649d4e07-ae06-43ca-a97d-adaba927165c-log-httpd\") pod \"ceilometer-0\" (UID: \"649d4e07-ae06-43ca-a97d-adaba927165c\") " pod="openstack/ceilometer-0" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.469057 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/649d4e07-ae06-43ca-a97d-adaba927165c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"649d4e07-ae06-43ca-a97d-adaba927165c\") " pod="openstack/ceilometer-0" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.469181 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/649d4e07-ae06-43ca-a97d-adaba927165c-config-data\") pod \"ceilometer-0\" (UID: \"649d4e07-ae06-43ca-a97d-adaba927165c\") " pod="openstack/ceilometer-0" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.470071 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/649d4e07-ae06-43ca-a97d-adaba927165c-run-httpd\") pod \"ceilometer-0\" (UID: \"649d4e07-ae06-43ca-a97d-adaba927165c\") " pod="openstack/ceilometer-0" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.471456 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/649d4e07-ae06-43ca-a97d-adaba927165c-log-httpd\") pod \"ceilometer-0\" (UID: \"649d4e07-ae06-43ca-a97d-adaba927165c\") " pod="openstack/ceilometer-0" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.473441 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/649d4e07-ae06-43ca-a97d-adaba927165c-scripts\") pod \"ceilometer-0\" (UID: \"649d4e07-ae06-43ca-a97d-adaba927165c\") " pod="openstack/ceilometer-0" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.473871 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/649d4e07-ae06-43ca-a97d-adaba927165c-config-data\") pod \"ceilometer-0\" (UID: \"649d4e07-ae06-43ca-a97d-adaba927165c\") " pod="openstack/ceilometer-0" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.473905 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/649d4e07-ae06-43ca-a97d-adaba927165c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"649d4e07-ae06-43ca-a97d-adaba927165c\") " pod="openstack/ceilometer-0" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.476163 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/649d4e07-ae06-43ca-a97d-adaba927165c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"649d4e07-ae06-43ca-a97d-adaba927165c\") " pod="openstack/ceilometer-0" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.485699 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f8fn\" (UniqueName: \"kubernetes.io/projected/649d4e07-ae06-43ca-a97d-adaba927165c-kube-api-access-6f8fn\") pod \"ceilometer-0\" (UID: \"649d4e07-ae06-43ca-a97d-adaba927165c\") " pod="openstack/ceilometer-0" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.575524 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29404381-h5mc2" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.683082 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a37d925e-cb07-4949-b184-eb778ffb662f" path="/var/lib/kubelet/pods/a37d925e-cb07-4949-b184-eb778ffb662f/volumes" Nov 27 17:01:00 crc kubenswrapper[4954]: I1127 17:01:00.753779 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 27 17:01:01 crc kubenswrapper[4954]: W1127 17:01:01.158563 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa450761_82d0_4005_aee7_bcb56c03a5fd.slice/crio-49d7e8bb378d5950daf58a0e72bf83dc9162e1f32fb10d50693d6242cb8390a4 WatchSource:0}: Error finding container 49d7e8bb378d5950daf58a0e72bf83dc9162e1f32fb10d50693d6242cb8390a4: Status 404 returned error can't find the container with id 49d7e8bb378d5950daf58a0e72bf83dc9162e1f32fb10d50693d6242cb8390a4 Nov 27 17:01:01 crc kubenswrapper[4954]: I1127 17:01:01.158793 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29404381-h5mc2"] Nov 27 17:01:01 crc kubenswrapper[4954]: W1127 17:01:01.297953 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod649d4e07_ae06_43ca_a97d_adaba927165c.slice/crio-75d9db3196ab9b05f3565c88cfc2d7fb6c0d33d9cca401ffa712e47dfb748855 WatchSource:0}: Error finding container 75d9db3196ab9b05f3565c88cfc2d7fb6c0d33d9cca401ffa712e47dfb748855: Status 404 returned error can't find the container with id 75d9db3196ab9b05f3565c88cfc2d7fb6c0d33d9cca401ffa712e47dfb748855 Nov 27 17:01:01 crc kubenswrapper[4954]: I1127 17:01:01.298193 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 27 17:01:01 crc kubenswrapper[4954]: I1127 17:01:01.299114 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"871d6a1f-a817-45c5-a3f5-3f0e47ef9bf3","Type":"ContainerStarted","Data":"d7c970eaeaca7a9ec12687317a6a7c7f65dd42ab8cc54ff357d6d0e030ff275c"} Nov 27 17:01:01 crc kubenswrapper[4954]: I1127 17:01:01.302869 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29404381-h5mc2" event={"ID":"fa450761-82d0-4005-aee7-bcb56c03a5fd","Type":"ContainerStarted","Data":"49d7e8bb378d5950daf58a0e72bf83dc9162e1f32fb10d50693d6242cb8390a4"} Nov 27 17:01:01 crc kubenswrapper[4954]: I1127 17:01:01.332722 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.53128655 podStartE2EDuration="35.332701844s" podCreationTimestamp="2025-11-27 17:00:26 +0000 UTC" firstStartedPulling="2025-11-27 17:00:27.933359633 +0000 UTC m=+1339.950799933" 
lastFinishedPulling="2025-11-27 17:01:00.734774927 +0000 UTC m=+1372.752215227" observedRunningTime="2025-11-27 17:01:01.316638324 +0000 UTC m=+1373.334078624" watchObservedRunningTime="2025-11-27 17:01:01.332701844 +0000 UTC m=+1373.350142144" Nov 27 17:01:02 crc kubenswrapper[4954]: I1127 17:01:02.323387 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"649d4e07-ae06-43ca-a97d-adaba927165c","Type":"ContainerStarted","Data":"fc0d222ace3e15267e435d26b30e0757f5eb01878f94c36868883abc4afe2bfd"} Nov 27 17:01:02 crc kubenswrapper[4954]: I1127 17:01:02.324027 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"649d4e07-ae06-43ca-a97d-adaba927165c","Type":"ContainerStarted","Data":"75d9db3196ab9b05f3565c88cfc2d7fb6c0d33d9cca401ffa712e47dfb748855"} Nov 27 17:01:02 crc kubenswrapper[4954]: I1127 17:01:02.325641 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29404381-h5mc2" event={"ID":"fa450761-82d0-4005-aee7-bcb56c03a5fd","Type":"ContainerStarted","Data":"c05c69b862c3595200f0daefda719fb754972797125ae0e02b9a59d9ca5a19a9"} Nov 27 17:01:02 crc kubenswrapper[4954]: I1127 17:01:02.344147 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29404381-h5mc2" podStartSLOduration=2.344116623 podStartE2EDuration="2.344116623s" podCreationTimestamp="2025-11-27 17:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 17:01:02.341690063 +0000 UTC m=+1374.359130363" watchObservedRunningTime="2025-11-27 17:01:02.344116623 +0000 UTC m=+1374.361556913" Nov 27 17:01:02 crc kubenswrapper[4954]: I1127 17:01:02.430139 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 27 17:01:02 crc kubenswrapper[4954]: I1127 17:01:02.975752 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 17:01:02 crc kubenswrapper[4954]: I1127 17:01:02.976300 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="631c9c91-60a4-48e3-aa9a-6333ae35bcf9" containerName="glance-log" containerID="cri-o://61de4b1700431f4f877eee1d3c201d33e230900ebdaea173f19e402aedb7df6f" gracePeriod=30 Nov 27 17:01:02 crc kubenswrapper[4954]: I1127 17:01:02.976397 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="631c9c91-60a4-48e3-aa9a-6333ae35bcf9" containerName="glance-httpd" containerID="cri-o://8c2b287a97e7a4c254ce5883c5b166be57a3c4746fa5c5e10345750028c6dc33" gracePeriod=30 Nov 27 17:01:03 crc kubenswrapper[4954]: I1127 17:01:03.336340 4954 generic.go:334] "Generic (PLEG): container finished" podID="631c9c91-60a4-48e3-aa9a-6333ae35bcf9" containerID="61de4b1700431f4f877eee1d3c201d33e230900ebdaea173f19e402aedb7df6f" exitCode=143 Nov 27 17:01:03 crc kubenswrapper[4954]: I1127 17:01:03.336448 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"631c9c91-60a4-48e3-aa9a-6333ae35bcf9","Type":"ContainerDied","Data":"61de4b1700431f4f877eee1d3c201d33e230900ebdaea173f19e402aedb7df6f"} Nov 27 17:01:03 crc kubenswrapper[4954]: I1127 17:01:03.339256 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"649d4e07-ae06-43ca-a97d-adaba927165c","Type":"ContainerStarted","Data":"80ef7e6c4072a41ac509b4180af504fd538ae9a77d325e0680e876782611f7f8"} Nov 27 17:01:03 crc kubenswrapper[4954]: I1127 17:01:03.947791 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-5dndx"] Nov 27 17:01:03 crc kubenswrapper[4954]: I1127 17:01:03.949476 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-5dndx" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.007172 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-5dndx"] Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.039363 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjbbc\" (UniqueName: \"kubernetes.io/projected/7c040041-36d3-4ba0-b7c4-5164dee45115-kube-api-access-fjbbc\") pod \"nova-api-db-create-5dndx\" (UID: \"7c040041-36d3-4ba0-b7c4-5164dee45115\") " pod="openstack/nova-api-db-create-5dndx" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.039448 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c040041-36d3-4ba0-b7c4-5164dee45115-operator-scripts\") pod \"nova-api-db-create-5dndx\" (UID: \"7c040041-36d3-4ba0-b7c4-5164dee45115\") " pod="openstack/nova-api-db-create-5dndx" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.066756 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-h4kmm"] Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.068405 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-h4kmm" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.078094 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-h4kmm"] Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.088180 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-5e54-account-create-update-x5fjh"] Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.089466 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-5e54-account-create-update-x5fjh" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.091540 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.097738 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-5e54-account-create-update-x5fjh"] Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.140921 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c040041-36d3-4ba0-b7c4-5164dee45115-operator-scripts\") pod \"nova-api-db-create-5dndx\" (UID: \"7c040041-36d3-4ba0-b7c4-5164dee45115\") " pod="openstack/nova-api-db-create-5dndx" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.140974 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzphv\" (UniqueName: \"kubernetes.io/projected/3caed139-7f27-4afa-b159-ba85dc64bd91-kube-api-access-jzphv\") pod \"nova-cell0-db-create-h4kmm\" (UID: \"3caed139-7f27-4afa-b159-ba85dc64bd91\") " pod="openstack/nova-cell0-db-create-h4kmm" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.141045 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3caed139-7f27-4afa-b159-ba85dc64bd91-operator-scripts\") pod \"nova-cell0-db-create-h4kmm\" (UID: \"3caed139-7f27-4afa-b159-ba85dc64bd91\") " pod="openstack/nova-cell0-db-create-h4kmm" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.141121 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f4hr\" (UniqueName: \"kubernetes.io/projected/750c1d74-a850-4e62-9680-cd65e44a254c-kube-api-access-8f4hr\") pod \"nova-api-5e54-account-create-update-x5fjh\" (UID: \"750c1d74-a850-4e62-9680-cd65e44a254c\") " pod="openstack/nova-api-5e54-account-create-update-x5fjh" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.141144 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjbbc\" (UniqueName: \"kubernetes.io/projected/7c040041-36d3-4ba0-b7c4-5164dee45115-kube-api-access-fjbbc\") pod \"nova-api-db-create-5dndx\" (UID: \"7c040041-36d3-4ba0-b7c4-5164dee45115\") " pod="openstack/nova-api-db-create-5dndx" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.141176 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/750c1d74-a850-4e62-9680-cd65e44a254c-operator-scripts\") pod \"nova-api-5e54-account-create-update-x5fjh\" (UID: \"750c1d74-a850-4e62-9680-cd65e44a254c\") " pod="openstack/nova-api-5e54-account-create-update-x5fjh" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.142660 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c040041-36d3-4ba0-b7c4-5164dee45115-operator-scripts\") pod \"nova-api-db-create-5dndx\" (UID: \"7c040041-36d3-4ba0-b7c4-5164dee45115\") " pod="openstack/nova-api-db-create-5dndx" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.168647 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjbbc\" (UniqueName: \"kubernetes.io/projected/7c040041-36d3-4ba0-b7c4-5164dee45115-kube-api-access-fjbbc\") 
pod \"nova-api-db-create-5dndx\" (UID: \"7c040041-36d3-4ba0-b7c4-5164dee45115\") " pod="openstack/nova-api-db-create-5dndx" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.243077 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3caed139-7f27-4afa-b159-ba85dc64bd91-operator-scripts\") pod \"nova-cell0-db-create-h4kmm\" (UID: \"3caed139-7f27-4afa-b159-ba85dc64bd91\") " pod="openstack/nova-cell0-db-create-h4kmm" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.243461 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8f4hr\" (UniqueName: \"kubernetes.io/projected/750c1d74-a850-4e62-9680-cd65e44a254c-kube-api-access-8f4hr\") pod \"nova-api-5e54-account-create-update-x5fjh\" (UID: \"750c1d74-a850-4e62-9680-cd65e44a254c\") " pod="openstack/nova-api-5e54-account-create-update-x5fjh" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.243800 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/750c1d74-a850-4e62-9680-cd65e44a254c-operator-scripts\") pod \"nova-api-5e54-account-create-update-x5fjh\" (UID: \"750c1d74-a850-4e62-9680-cd65e44a254c\") " pod="openstack/nova-api-5e54-account-create-update-x5fjh" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.243822 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3caed139-7f27-4afa-b159-ba85dc64bd91-operator-scripts\") pod \"nova-cell0-db-create-h4kmm\" (UID: \"3caed139-7f27-4afa-b159-ba85dc64bd91\") " pod="openstack/nova-cell0-db-create-h4kmm" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.243934 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzphv\" (UniqueName: \"kubernetes.io/projected/3caed139-7f27-4afa-b159-ba85dc64bd91-kube-api-access-jzphv\") pod \"nova-cell0-db-create-h4kmm\" (UID: \"3caed139-7f27-4afa-b159-ba85dc64bd91\") " pod="openstack/nova-cell0-db-create-h4kmm" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.244660 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/750c1d74-a850-4e62-9680-cd65e44a254c-operator-scripts\") pod \"nova-api-5e54-account-create-update-x5fjh\" (UID: \"750c1d74-a850-4e62-9680-cd65e44a254c\") " pod="openstack/nova-api-5e54-account-create-update-x5fjh" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.254409 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-kn544"] Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.255552 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-kn544" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.277963 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-kn544"] Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.282480 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f4hr\" (UniqueName: \"kubernetes.io/projected/750c1d74-a850-4e62-9680-cd65e44a254c-kube-api-access-8f4hr\") pod \"nova-api-5e54-account-create-update-x5fjh\" (UID: \"750c1d74-a850-4e62-9680-cd65e44a254c\") " pod="openstack/nova-api-5e54-account-create-update-x5fjh" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.300135 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzphv\" (UniqueName: \"kubernetes.io/projected/3caed139-7f27-4afa-b159-ba85dc64bd91-kube-api-access-jzphv\") pod \"nova-cell0-db-create-h4kmm\" (UID: \"3caed139-7f27-4afa-b159-ba85dc64bd91\") " pod="openstack/nova-cell0-db-create-h4kmm" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.300411 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-7dca-account-create-update-jwm8d"] Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.301635 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7dca-account-create-update-jwm8d" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.303913 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.313840 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-7dca-account-create-update-jwm8d"] Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.320832 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.321078 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="0db2964c-faef-4154-b502-1231f6762e37" containerName="glance-log" containerID="cri-o://0ab6aee1db3e5fab4616639290f436d8737ec84f6f9b45031bceb1eb2bd54c22" gracePeriod=30 Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.321256 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="0db2964c-faef-4154-b502-1231f6762e37" containerName="glance-httpd" containerID="cri-o://64362aa36b1fa29fc2a7979add106067232b5d5cd48dd9dcd3d2293580c21015" gracePeriod=30 Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.321837 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-5dndx" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.349678 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hznrp\" (UniqueName: \"kubernetes.io/projected/81fb41af-f5ea-444d-aea7-a9b50124e2b4-kube-api-access-hznrp\") pod \"nova-cell0-7dca-account-create-update-jwm8d\" (UID: \"81fb41af-f5ea-444d-aea7-a9b50124e2b4\") " pod="openstack/nova-cell0-7dca-account-create-update-jwm8d" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.349829 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a8f7cd7-b71b-4fa5-a4fa-83a528b78177-operator-scripts\") pod \"nova-cell1-db-create-kn544\" (UID: \"7a8f7cd7-b71b-4fa5-a4fa-83a528b78177\") " pod="openstack/nova-cell1-db-create-kn544" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.349875 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81fb41af-f5ea-444d-aea7-a9b50124e2b4-operator-scripts\") pod \"nova-cell0-7dca-account-create-update-jwm8d\" (UID: \"81fb41af-f5ea-444d-aea7-a9b50124e2b4\") " pod="openstack/nova-cell0-7dca-account-create-update-jwm8d" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.349960 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bskkv\" (UniqueName: \"kubernetes.io/projected/7a8f7cd7-b71b-4fa5-a4fa-83a528b78177-kube-api-access-bskkv\") pod \"nova-cell1-db-create-kn544\" (UID: \"7a8f7cd7-b71b-4fa5-a4fa-83a528b78177\") " pod="openstack/nova-cell1-db-create-kn544" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.366196 4954 generic.go:334] "Generic (PLEG): container finished" podID="fa450761-82d0-4005-aee7-bcb56c03a5fd" containerID="c05c69b862c3595200f0daefda719fb754972797125ae0e02b9a59d9ca5a19a9" exitCode=0 Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.366261 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29404381-h5mc2" event={"ID":"fa450761-82d0-4005-aee7-bcb56c03a5fd","Type":"ContainerDied","Data":"c05c69b862c3595200f0daefda719fb754972797125ae0e02b9a59d9ca5a19a9"} Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.381440 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"649d4e07-ae06-43ca-a97d-adaba927165c","Type":"ContainerStarted","Data":"99537b9b06f7e0f99005c529ea7be378ad8b52b0cab83f49b9bad645e4d59705"} Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.384151 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-h4kmm" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.405358 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-5e54-account-create-update-x5fjh" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.452456 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hznrp\" (UniqueName: \"kubernetes.io/projected/81fb41af-f5ea-444d-aea7-a9b50124e2b4-kube-api-access-hznrp\") pod \"nova-cell0-7dca-account-create-update-jwm8d\" (UID: \"81fb41af-f5ea-444d-aea7-a9b50124e2b4\") " pod="openstack/nova-cell0-7dca-account-create-update-jwm8d" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.452555 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a8f7cd7-b71b-4fa5-a4fa-83a528b78177-operator-scripts\") pod \"nova-cell1-db-create-kn544\" (UID: \"7a8f7cd7-b71b-4fa5-a4fa-83a528b78177\") " pod="openstack/nova-cell1-db-create-kn544" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.452613 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81fb41af-f5ea-444d-aea7-a9b50124e2b4-operator-scripts\") pod \"nova-cell0-7dca-account-create-update-jwm8d\" (UID: \"81fb41af-f5ea-444d-aea7-a9b50124e2b4\") " pod="openstack/nova-cell0-7dca-account-create-update-jwm8d" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.452657 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bskkv\" (UniqueName: \"kubernetes.io/projected/7a8f7cd7-b71b-4fa5-a4fa-83a528b78177-kube-api-access-bskkv\") pod \"nova-cell1-db-create-kn544\" (UID: \"7a8f7cd7-b71b-4fa5-a4fa-83a528b78177\") " pod="openstack/nova-cell1-db-create-kn544" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.454852 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a8f7cd7-b71b-4fa5-a4fa-83a528b78177-operator-scripts\") pod \"nova-cell1-db-create-kn544\" (UID: \"7a8f7cd7-b71b-4fa5-a4fa-83a528b78177\") " pod="openstack/nova-cell1-db-create-kn544" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.455961 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81fb41af-f5ea-444d-aea7-a9b50124e2b4-operator-scripts\") pod \"nova-cell0-7dca-account-create-update-jwm8d\" (UID: \"81fb41af-f5ea-444d-aea7-a9b50124e2b4\") " pod="openstack/nova-cell0-7dca-account-create-update-jwm8d" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.480358 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-4663-account-create-update-5ghgg"] Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.482246 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-4663-account-create-update-5ghgg" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.484461 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.489522 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hznrp\" (UniqueName: \"kubernetes.io/projected/81fb41af-f5ea-444d-aea7-a9b50124e2b4-kube-api-access-hznrp\") pod \"nova-cell0-7dca-account-create-update-jwm8d\" (UID: \"81fb41af-f5ea-444d-aea7-a9b50124e2b4\") " pod="openstack/nova-cell0-7dca-account-create-update-jwm8d" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.493464 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bskkv\" (UniqueName: \"kubernetes.io/projected/7a8f7cd7-b71b-4fa5-a4fa-83a528b78177-kube-api-access-bskkv\") pod \"nova-cell1-db-create-kn544\" (UID: \"7a8f7cd7-b71b-4fa5-a4fa-83a528b78177\") " pod="openstack/nova-cell1-db-create-kn544" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.506430 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-4663-account-create-update-5ghgg"] Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.524693 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7dca-account-create-update-jwm8d" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.562017 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dgw7\" (UniqueName: \"kubernetes.io/projected/53dbb9f3-5011-4342-a4df-bcfbe5991cbf-kube-api-access-6dgw7\") pod \"nova-cell1-4663-account-create-update-5ghgg\" (UID: \"53dbb9f3-5011-4342-a4df-bcfbe5991cbf\") " pod="openstack/nova-cell1-4663-account-create-update-5ghgg" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.562094 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53dbb9f3-5011-4342-a4df-bcfbe5991cbf-operator-scripts\") pod \"nova-cell1-4663-account-create-update-5ghgg\" (UID: \"53dbb9f3-5011-4342-a4df-bcfbe5991cbf\") " pod="openstack/nova-cell1-4663-account-create-update-5ghgg" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.575056 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-kn544" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.663659 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dgw7\" (UniqueName: \"kubernetes.io/projected/53dbb9f3-5011-4342-a4df-bcfbe5991cbf-kube-api-access-6dgw7\") pod \"nova-cell1-4663-account-create-update-5ghgg\" (UID: \"53dbb9f3-5011-4342-a4df-bcfbe5991cbf\") " pod="openstack/nova-cell1-4663-account-create-update-5ghgg" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.663708 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53dbb9f3-5011-4342-a4df-bcfbe5991cbf-operator-scripts\") pod \"nova-cell1-4663-account-create-update-5ghgg\" (UID: \"53dbb9f3-5011-4342-a4df-bcfbe5991cbf\") " pod="openstack/nova-cell1-4663-account-create-update-5ghgg" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.705034 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53dbb9f3-5011-4342-a4df-bcfbe5991cbf-operator-scripts\") pod \"nova-cell1-4663-account-create-update-5ghgg\" (UID: \"53dbb9f3-5011-4342-a4df-bcfbe5991cbf\") " pod="openstack/nova-cell1-4663-account-create-update-5ghgg" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.727290 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dgw7\" (UniqueName: \"kubernetes.io/projected/53dbb9f3-5011-4342-a4df-bcfbe5991cbf-kube-api-access-6dgw7\") pod \"nova-cell1-4663-account-create-update-5ghgg\" (UID: \"53dbb9f3-5011-4342-a4df-bcfbe5991cbf\") " pod="openstack/nova-cell1-4663-account-create-update-5ghgg" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.861859 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-4663-account-create-update-5ghgg" Nov 27 17:01:04 crc kubenswrapper[4954]: I1127 17:01:04.943262 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-5dndx"] Nov 27 17:01:05 crc kubenswrapper[4954]: I1127 17:01:05.083457 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-h4kmm"] Nov 27 17:01:05 crc kubenswrapper[4954]: W1127 17:01:05.089604 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3caed139_7f27_4afa_b159_ba85dc64bd91.slice/crio-d76830d0617896df80c9d9e2d12ec57a5cd16ab93d7e6988ddd6a413d497282d WatchSource:0}: Error finding container d76830d0617896df80c9d9e2d12ec57a5cd16ab93d7e6988ddd6a413d497282d: Status 404 returned error can't find the container with id d76830d0617896df80c9d9e2d12ec57a5cd16ab93d7e6988ddd6a413d497282d Nov 27 17:01:05 crc kubenswrapper[4954]: W1127 17:01:05.091665 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod750c1d74_a850_4e62_9680_cd65e44a254c.slice/crio-5d3786ff7d59a67ca0c1fd1e232b653f8f6c18fe8d91ad34ea3c2e8937482eab WatchSource:0}: Error finding container 5d3786ff7d59a67ca0c1fd1e232b653f8f6c18fe8d91ad34ea3c2e8937482eab: Status 404 returned error can't find the container with id 5d3786ff7d59a67ca0c1fd1e232b653f8f6c18fe8d91ad34ea3c2e8937482eab Nov 27 17:01:05 crc kubenswrapper[4954]: I1127 17:01:05.098689 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-5e54-account-create-update-x5fjh"] Nov 27 17:01:05 crc kubenswrapper[4954]: I1127 17:01:05.291567 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-7dca-account-create-update-jwm8d"] Nov 27 17:01:05 crc kubenswrapper[4954]: I1127 17:01:05.302800 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-kn544"] Nov 27 17:01:05 crc kubenswrapper[4954]: I1127 17:01:05.397263 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-h4kmm" event={"ID":"3caed139-7f27-4afa-b159-ba85dc64bd91","Type":"ContainerStarted","Data":"d76830d0617896df80c9d9e2d12ec57a5cd16ab93d7e6988ddd6a413d497282d"} Nov 27 17:01:05 crc kubenswrapper[4954]: I1127 17:01:05.398386 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-kn544" event={"ID":"7a8f7cd7-b71b-4fa5-a4fa-83a528b78177","Type":"ContainerStarted","Data":"4d0d8bbbbf4997c651f6d1e947c9ee4a9093e38d653bfd8bc752a6dd5d102629"} Nov 27 17:01:05 crc kubenswrapper[4954]: I1127 17:01:05.400444 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5e54-account-create-update-x5fjh" event={"ID":"750c1d74-a850-4e62-9680-cd65e44a254c","Type":"ContainerStarted","Data":"5d3786ff7d59a67ca0c1fd1e232b653f8f6c18fe8d91ad34ea3c2e8937482eab"} Nov 27 17:01:05 crc kubenswrapper[4954]: I1127 17:01:05.423325 4954 generic.go:334] "Generic (PLEG): container finished" podID="0db2964c-faef-4154-b502-1231f6762e37" containerID="0ab6aee1db3e5fab4616639290f436d8737ec84f6f9b45031bceb1eb2bd54c22" exitCode=143 Nov 27 17:01:05 crc kubenswrapper[4954]: I1127 17:01:05.423419 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0db2964c-faef-4154-b502-1231f6762e37","Type":"ContainerDied","Data":"0ab6aee1db3e5fab4616639290f436d8737ec84f6f9b45031bceb1eb2bd54c22"} Nov 27 17:01:05 crc 
kubenswrapper[4954]: I1127 17:01:05.426447 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7dca-account-create-update-jwm8d" event={"ID":"81fb41af-f5ea-444d-aea7-a9b50124e2b4","Type":"ContainerStarted","Data":"6ed26a78fd475f9eaf67253ca13458e6049575575859d4c44d465dbc15e1baa8"} Nov 27 17:01:05 crc kubenswrapper[4954]: I1127 17:01:05.427989 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-5dndx" event={"ID":"7c040041-36d3-4ba0-b7c4-5164dee45115","Type":"ContainerStarted","Data":"bd1d4a848e6006f105b3e13575985171c028af2dbffacf23bb139996cb1a193d"} Nov 27 17:01:05 crc kubenswrapper[4954]: I1127 17:01:05.428011 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-5dndx" event={"ID":"7c040041-36d3-4ba0-b7c4-5164dee45115","Type":"ContainerStarted","Data":"f18895504ceba503acea3aed816d97e5f7a1bafcb5e38a8abd1541f79abd5fcf"} Nov 27 17:01:05 crc kubenswrapper[4954]: I1127 17:01:05.445838 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-5dndx" podStartSLOduration=2.445806526 podStartE2EDuration="2.445806526s" podCreationTimestamp="2025-11-27 17:01:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 17:01:05.444752671 +0000 UTC m=+1377.462192991" watchObservedRunningTime="2025-11-27 17:01:05.445806526 +0000 UTC m=+1377.463246826" Nov 27 17:01:05 crc kubenswrapper[4954]: I1127 17:01:05.586488 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-4663-account-create-update-5ghgg"] Nov 27 17:01:05 crc kubenswrapper[4954]: W1127 17:01:05.616253 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53dbb9f3_5011_4342_a4df_bcfbe5991cbf.slice/crio-e51dd31e7dac5b73b9a1cd4cd95e1401639db6fd1e54bd36b0a57752d0f3c5ed WatchSource:0}: Error finding container e51dd31e7dac5b73b9a1cd4cd95e1401639db6fd1e54bd36b0a57752d0f3c5ed: Status 404 returned error can't find the container with id e51dd31e7dac5b73b9a1cd4cd95e1401639db6fd1e54bd36b0a57752d0f3c5ed Nov 27 17:01:05 crc kubenswrapper[4954]: I1127 17:01:05.829205 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29404381-h5mc2" Nov 27 17:01:05 crc kubenswrapper[4954]: I1127 17:01:05.888291 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa450761-82d0-4005-aee7-bcb56c03a5fd-config-data\") pod \"fa450761-82d0-4005-aee7-bcb56c03a5fd\" (UID: \"fa450761-82d0-4005-aee7-bcb56c03a5fd\") " Nov 27 17:01:05 crc kubenswrapper[4954]: I1127 17:01:05.888465 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fa450761-82d0-4005-aee7-bcb56c03a5fd-fernet-keys\") pod \"fa450761-82d0-4005-aee7-bcb56c03a5fd\" (UID: \"fa450761-82d0-4005-aee7-bcb56c03a5fd\") " Nov 27 17:01:05 crc kubenswrapper[4954]: I1127 17:01:05.888510 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfcf7\" (UniqueName: \"kubernetes.io/projected/fa450761-82d0-4005-aee7-bcb56c03a5fd-kube-api-access-sfcf7\") pod \"fa450761-82d0-4005-aee7-bcb56c03a5fd\" (UID: \"fa450761-82d0-4005-aee7-bcb56c03a5fd\") " Nov 27 17:01:05 crc kubenswrapper[4954]: I1127 17:01:05.888615 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa450761-82d0-4005-aee7-bcb56c03a5fd-combined-ca-bundle\") pod \"fa450761-82d0-4005-aee7-bcb56c03a5fd\" (UID: \"fa450761-82d0-4005-aee7-bcb56c03a5fd\") " Nov 27 17:01:05 crc kubenswrapper[4954]: I1127 17:01:05.898698 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa450761-82d0-4005-aee7-bcb56c03a5fd-kube-api-access-sfcf7" (OuterVolumeSpecName: "kube-api-access-sfcf7") pod "fa450761-82d0-4005-aee7-bcb56c03a5fd" (UID: "fa450761-82d0-4005-aee7-bcb56c03a5fd"). InnerVolumeSpecName "kube-api-access-sfcf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:01:05 crc kubenswrapper[4954]: I1127 17:01:05.899049 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa450761-82d0-4005-aee7-bcb56c03a5fd-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "fa450761-82d0-4005-aee7-bcb56c03a5fd" (UID: "fa450761-82d0-4005-aee7-bcb56c03a5fd"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:01:05 crc kubenswrapper[4954]: I1127 17:01:05.930005 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa450761-82d0-4005-aee7-bcb56c03a5fd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fa450761-82d0-4005-aee7-bcb56c03a5fd" (UID: "fa450761-82d0-4005-aee7-bcb56c03a5fd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:01:05 crc kubenswrapper[4954]: I1127 17:01:05.954796 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa450761-82d0-4005-aee7-bcb56c03a5fd-config-data" (OuterVolumeSpecName: "config-data") pod "fa450761-82d0-4005-aee7-bcb56c03a5fd" (UID: "fa450761-82d0-4005-aee7-bcb56c03a5fd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:01:05 crc kubenswrapper[4954]: I1127 17:01:05.990644 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa450761-82d0-4005-aee7-bcb56c03a5fd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:05 crc kubenswrapper[4954]: I1127 17:01:05.990677 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa450761-82d0-4005-aee7-bcb56c03a5fd-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:05 crc kubenswrapper[4954]: I1127 17:01:05.990686 4954 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fa450761-82d0-4005-aee7-bcb56c03a5fd-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:05 crc kubenswrapper[4954]: I1127 17:01:05.990695 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfcf7\" (UniqueName: \"kubernetes.io/projected/fa450761-82d0-4005-aee7-bcb56c03a5fd-kube-api-access-sfcf7\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.443971 4954 generic.go:334] "Generic (PLEG): container finished" podID="53dbb9f3-5011-4342-a4df-bcfbe5991cbf" containerID="93bdb24f947514315bcf3e987ce9bdc590ff9c36adf2e0f8088bb5039237603f" exitCode=0 Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.444088 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4663-account-create-update-5ghgg" event={"ID":"53dbb9f3-5011-4342-a4df-bcfbe5991cbf","Type":"ContainerDied","Data":"93bdb24f947514315bcf3e987ce9bdc590ff9c36adf2e0f8088bb5039237603f"} Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.444122 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4663-account-create-update-5ghgg" event={"ID":"53dbb9f3-5011-4342-a4df-bcfbe5991cbf","Type":"ContainerStarted","Data":"e51dd31e7dac5b73b9a1cd4cd95e1401639db6fd1e54bd36b0a57752d0f3c5ed"} Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.452241 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="649d4e07-ae06-43ca-a97d-adaba927165c" containerName="ceilometer-central-agent" containerID="cri-o://fc0d222ace3e15267e435d26b30e0757f5eb01878f94c36868883abc4afe2bfd" gracePeriod=30 Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.452350 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"649d4e07-ae06-43ca-a97d-adaba927165c","Type":"ContainerStarted","Data":"3dd49f03e2c989f7eceadb71fb0a841a4e7a7df6bec540b365fede09ca234942"} Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.452378 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="649d4e07-ae06-43ca-a97d-adaba927165c" containerName="sg-core" containerID="cri-o://99537b9b06f7e0f99005c529ea7be378ad8b52b0cab83f49b9bad645e4d59705" gracePeriod=30 Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.452357 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="649d4e07-ae06-43ca-a97d-adaba927165c" containerName="proxy-httpd" containerID="cri-o://3dd49f03e2c989f7eceadb71fb0a841a4e7a7df6bec540b365fede09ca234942" gracePeriod=30 Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.452494 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 27 17:01:06 
crc kubenswrapper[4954]: I1127 17:01:06.452427 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="649d4e07-ae06-43ca-a97d-adaba927165c" containerName="ceilometer-notification-agent" containerID="cri-o://80ef7e6c4072a41ac509b4180af504fd538ae9a77d325e0680e876782611f7f8" gracePeriod=30 Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.459259 4954 generic.go:334] "Generic (PLEG): container finished" podID="7a8f7cd7-b71b-4fa5-a4fa-83a528b78177" containerID="4ce23aac0304c81f40256141b44c41f821694dbae99e5ce9d8d082e60062581c" exitCode=0 Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.459413 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-kn544" event={"ID":"7a8f7cd7-b71b-4fa5-a4fa-83a528b78177","Type":"ContainerDied","Data":"4ce23aac0304c81f40256141b44c41f821694dbae99e5ce9d8d082e60062581c"} Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.469031 4954 generic.go:334] "Generic (PLEG): container finished" podID="750c1d74-a850-4e62-9680-cd65e44a254c" containerID="8b72a5b34aefc43205e6fe4ecb60c0b7db52ebe1820c50c915ac7d37e091f45b" exitCode=0 Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.469121 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5e54-account-create-update-x5fjh" event={"ID":"750c1d74-a850-4e62-9680-cd65e44a254c","Type":"ContainerDied","Data":"8b72a5b34aefc43205e6fe4ecb60c0b7db52ebe1820c50c915ac7d37e091f45b"} Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.495093 4954 generic.go:334] "Generic (PLEG): container finished" podID="81fb41af-f5ea-444d-aea7-a9b50124e2b4" containerID="8c825753331d40abf0426ceb20fe0cb284f245f8271eac4c33f5ac20c4c710c9" exitCode=0 Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.495273 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7dca-account-create-update-jwm8d" event={"ID":"81fb41af-f5ea-444d-aea7-a9b50124e2b4","Type":"ContainerDied","Data":"8c825753331d40abf0426ceb20fe0cb284f245f8271eac4c33f5ac20c4c710c9"} Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.501521 4954 generic.go:334] "Generic (PLEG): container finished" podID="3caed139-7f27-4afa-b159-ba85dc64bd91" containerID="3160d2b7a178b69dc55df9577811ca0bcbb9832b7fce48c4efd35afc6c0e7dfa" exitCode=0 Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.501690 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-h4kmm" event={"ID":"3caed139-7f27-4afa-b159-ba85dc64bd91","Type":"ContainerDied","Data":"3160d2b7a178b69dc55df9577811ca0bcbb9832b7fce48c4efd35afc6c0e7dfa"} Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.513252 4954 generic.go:334] "Generic (PLEG): container finished" podID="631c9c91-60a4-48e3-aa9a-6333ae35bcf9" containerID="8c2b287a97e7a4c254ce5883c5b166be57a3c4746fa5c5e10345750028c6dc33" exitCode=0 Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.513327 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"631c9c91-60a4-48e3-aa9a-6333ae35bcf9","Type":"ContainerDied","Data":"8c2b287a97e7a4c254ce5883c5b166be57a3c4746fa5c5e10345750028c6dc33"} Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.515382 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29404381-h5mc2" event={"ID":"fa450761-82d0-4005-aee7-bcb56c03a5fd","Type":"ContainerDied","Data":"49d7e8bb378d5950daf58a0e72bf83dc9162e1f32fb10d50693d6242cb8390a4"} Nov 27 17:01:06 crc 
kubenswrapper[4954]: I1127 17:01:06.515408 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49d7e8bb378d5950daf58a0e72bf83dc9162e1f32fb10d50693d6242cb8390a4" Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.515486 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29404381-h5mc2" Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.517993 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.2988415030000002 podStartE2EDuration="6.517970399s" podCreationTimestamp="2025-11-27 17:01:00 +0000 UTC" firstStartedPulling="2025-11-27 17:01:01.301370253 +0000 UTC m=+1373.318810553" lastFinishedPulling="2025-11-27 17:01:05.520499149 +0000 UTC m=+1377.537939449" observedRunningTime="2025-11-27 17:01:06.508408937 +0000 UTC m=+1378.525849237" watchObservedRunningTime="2025-11-27 17:01:06.517970399 +0000 UTC m=+1378.535410699" Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.526680 4954 generic.go:334] "Generic (PLEG): container finished" podID="7c040041-36d3-4ba0-b7c4-5164dee45115" containerID="bd1d4a848e6006f105b3e13575985171c028af2dbffacf23bb139996cb1a193d" exitCode=0 Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.526738 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-5dndx" event={"ID":"7c040041-36d3-4ba0-b7c4-5164dee45115","Type":"ContainerDied","Data":"bd1d4a848e6006f105b3e13575985171c028af2dbffacf23bb139996cb1a193d"} Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.712876 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.806034 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-547tq\" (UniqueName: \"kubernetes.io/projected/631c9c91-60a4-48e3-aa9a-6333ae35bcf9-kube-api-access-547tq\") pod \"631c9c91-60a4-48e3-aa9a-6333ae35bcf9\" (UID: \"631c9c91-60a4-48e3-aa9a-6333ae35bcf9\") " Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.806092 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/631c9c91-60a4-48e3-aa9a-6333ae35bcf9-httpd-run\") pod \"631c9c91-60a4-48e3-aa9a-6333ae35bcf9\" (UID: \"631c9c91-60a4-48e3-aa9a-6333ae35bcf9\") " Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.806153 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/631c9c91-60a4-48e3-aa9a-6333ae35bcf9-logs\") pod \"631c9c91-60a4-48e3-aa9a-6333ae35bcf9\" (UID: \"631c9c91-60a4-48e3-aa9a-6333ae35bcf9\") " Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.806172 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/631c9c91-60a4-48e3-aa9a-6333ae35bcf9-public-tls-certs\") pod \"631c9c91-60a4-48e3-aa9a-6333ae35bcf9\" (UID: \"631c9c91-60a4-48e3-aa9a-6333ae35bcf9\") " Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.806213 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"631c9c91-60a4-48e3-aa9a-6333ae35bcf9\" (UID: \"631c9c91-60a4-48e3-aa9a-6333ae35bcf9\") " Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.806255 4954 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/631c9c91-60a4-48e3-aa9a-6333ae35bcf9-combined-ca-bundle\") pod \"631c9c91-60a4-48e3-aa9a-6333ae35bcf9\" (UID: \"631c9c91-60a4-48e3-aa9a-6333ae35bcf9\") " Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.806275 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/631c9c91-60a4-48e3-aa9a-6333ae35bcf9-scripts\") pod \"631c9c91-60a4-48e3-aa9a-6333ae35bcf9\" (UID: \"631c9c91-60a4-48e3-aa9a-6333ae35bcf9\") " Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.806326 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/631c9c91-60a4-48e3-aa9a-6333ae35bcf9-config-data\") pod \"631c9c91-60a4-48e3-aa9a-6333ae35bcf9\" (UID: \"631c9c91-60a4-48e3-aa9a-6333ae35bcf9\") " Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.808439 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/631c9c91-60a4-48e3-aa9a-6333ae35bcf9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "631c9c91-60a4-48e3-aa9a-6333ae35bcf9" (UID: "631c9c91-60a4-48e3-aa9a-6333ae35bcf9"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.808763 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/631c9c91-60a4-48e3-aa9a-6333ae35bcf9-logs" (OuterVolumeSpecName: "logs") pod "631c9c91-60a4-48e3-aa9a-6333ae35bcf9" (UID: "631c9c91-60a4-48e3-aa9a-6333ae35bcf9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.817028 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "631c9c91-60a4-48e3-aa9a-6333ae35bcf9" (UID: "631c9c91-60a4-48e3-aa9a-6333ae35bcf9"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.828008 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/631c9c91-60a4-48e3-aa9a-6333ae35bcf9-kube-api-access-547tq" (OuterVolumeSpecName: "kube-api-access-547tq") pod "631c9c91-60a4-48e3-aa9a-6333ae35bcf9" (UID: "631c9c91-60a4-48e3-aa9a-6333ae35bcf9"). InnerVolumeSpecName "kube-api-access-547tq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.852746 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/631c9c91-60a4-48e3-aa9a-6333ae35bcf9-scripts" (OuterVolumeSpecName: "scripts") pod "631c9c91-60a4-48e3-aa9a-6333ae35bcf9" (UID: "631c9c91-60a4-48e3-aa9a-6333ae35bcf9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.862807 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/631c9c91-60a4-48e3-aa9a-6333ae35bcf9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "631c9c91-60a4-48e3-aa9a-6333ae35bcf9" (UID: "631c9c91-60a4-48e3-aa9a-6333ae35bcf9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.909297 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-547tq\" (UniqueName: \"kubernetes.io/projected/631c9c91-60a4-48e3-aa9a-6333ae35bcf9-kube-api-access-547tq\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.909334 4954 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/631c9c91-60a4-48e3-aa9a-6333ae35bcf9-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.909344 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/631c9c91-60a4-48e3-aa9a-6333ae35bcf9-logs\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.909370 4954 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.909380 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/631c9c91-60a4-48e3-aa9a-6333ae35bcf9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.909388 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/631c9c91-60a4-48e3-aa9a-6333ae35bcf9-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.928998 4954 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.942675 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/631c9c91-60a4-48e3-aa9a-6333ae35bcf9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "631c9c91-60a4-48e3-aa9a-6333ae35bcf9" (UID: "631c9c91-60a4-48e3-aa9a-6333ae35bcf9"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:01:06 crc kubenswrapper[4954]: I1127 17:01:06.967461 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/631c9c91-60a4-48e3-aa9a-6333ae35bcf9-config-data" (OuterVolumeSpecName: "config-data") pod "631c9c91-60a4-48e3-aa9a-6333ae35bcf9" (UID: "631c9c91-60a4-48e3-aa9a-6333ae35bcf9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.011304 4954 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/631c9c91-60a4-48e3-aa9a-6333ae35bcf9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.011334 4954 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.011343 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/631c9c91-60a4-48e3-aa9a-6333ae35bcf9-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.541756 4954 generic.go:334] "Generic (PLEG): container finished" podID="649d4e07-ae06-43ca-a97d-adaba927165c" containerID="3dd49f03e2c989f7eceadb71fb0a841a4e7a7df6bec540b365fede09ca234942" exitCode=0 Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.542932 4954 generic.go:334] "Generic (PLEG): container finished" podID="649d4e07-ae06-43ca-a97d-adaba927165c" containerID="99537b9b06f7e0f99005c529ea7be378ad8b52b0cab83f49b9bad645e4d59705" exitCode=2 Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.541832 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"649d4e07-ae06-43ca-a97d-adaba927165c","Type":"ContainerDied","Data":"3dd49f03e2c989f7eceadb71fb0a841a4e7a7df6bec540b365fede09ca234942"} Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.543005 4954 generic.go:334] "Generic (PLEG): container finished" podID="649d4e07-ae06-43ca-a97d-adaba927165c" containerID="80ef7e6c4072a41ac509b4180af504fd538ae9a77d325e0680e876782611f7f8" exitCode=0 Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.543120 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"649d4e07-ae06-43ca-a97d-adaba927165c","Type":"ContainerDied","Data":"99537b9b06f7e0f99005c529ea7be378ad8b52b0cab83f49b9bad645e4d59705"} Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.543280 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"649d4e07-ae06-43ca-a97d-adaba927165c","Type":"ContainerDied","Data":"80ef7e6c4072a41ac509b4180af504fd538ae9a77d325e0680e876782611f7f8"} Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.545670 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"631c9c91-60a4-48e3-aa9a-6333ae35bcf9","Type":"ContainerDied","Data":"c7cdbb42a093519a5bee22c256db4e19c57cf8f292fc41fe8f3ea328d08e4a51"} Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.545733 4954 scope.go:117] "RemoveContainer" containerID="8c2b287a97e7a4c254ce5883c5b166be57a3c4746fa5c5e10345750028c6dc33" Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.545954 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.700755 4954 scope.go:117] "RemoveContainer" containerID="61de4b1700431f4f877eee1d3c201d33e230900ebdaea173f19e402aedb7df6f" Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.758632 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.785437 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.798679 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 17:01:07 crc kubenswrapper[4954]: E1127 17:01:07.799125 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="631c9c91-60a4-48e3-aa9a-6333ae35bcf9" containerName="glance-log" Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.799138 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="631c9c91-60a4-48e3-aa9a-6333ae35bcf9" containerName="glance-log" Nov 27 17:01:07 crc kubenswrapper[4954]: E1127 17:01:07.799164 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa450761-82d0-4005-aee7-bcb56c03a5fd" containerName="keystone-cron" Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.799172 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa450761-82d0-4005-aee7-bcb56c03a5fd" containerName="keystone-cron" Nov 27 17:01:07 crc kubenswrapper[4954]: E1127 17:01:07.799200 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="631c9c91-60a4-48e3-aa9a-6333ae35bcf9" containerName="glance-httpd" Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.799206 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="631c9c91-60a4-48e3-aa9a-6333ae35bcf9" containerName="glance-httpd" Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.799368 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa450761-82d0-4005-aee7-bcb56c03a5fd" containerName="keystone-cron" Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.799379 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="631c9c91-60a4-48e3-aa9a-6333ae35bcf9" containerName="glance-httpd" Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.799406 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="631c9c91-60a4-48e3-aa9a-6333ae35bcf9" containerName="glance-log" Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.800406 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.803496 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.803702 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.816467 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.843989 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1301cc13-44b9-4a6e-b82d-cbea335ebc9a-logs\") pod \"glance-default-external-api-0\" (UID: \"1301cc13-44b9-4a6e-b82d-cbea335ebc9a\") " pod="openstack/glance-default-external-api-0" Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.844055 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"1301cc13-44b9-4a6e-b82d-cbea335ebc9a\") " pod="openstack/glance-default-external-api-0" Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.844101 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1301cc13-44b9-4a6e-b82d-cbea335ebc9a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1301cc13-44b9-4a6e-b82d-cbea335ebc9a\") " pod="openstack/glance-default-external-api-0" Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.844131 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgf4t\" (UniqueName: \"kubernetes.io/projected/1301cc13-44b9-4a6e-b82d-cbea335ebc9a-kube-api-access-tgf4t\") pod \"glance-default-external-api-0\" (UID: \"1301cc13-44b9-4a6e-b82d-cbea335ebc9a\") " pod="openstack/glance-default-external-api-0" Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.844226 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1301cc13-44b9-4a6e-b82d-cbea335ebc9a-scripts\") pod \"glance-default-external-api-0\" (UID: \"1301cc13-44b9-4a6e-b82d-cbea335ebc9a\") " pod="openstack/glance-default-external-api-0" Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.844253 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1301cc13-44b9-4a6e-b82d-cbea335ebc9a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1301cc13-44b9-4a6e-b82d-cbea335ebc9a\") " pod="openstack/glance-default-external-api-0" Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.844308 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1301cc13-44b9-4a6e-b82d-cbea335ebc9a-config-data\") pod \"glance-default-external-api-0\" (UID: \"1301cc13-44b9-4a6e-b82d-cbea335ebc9a\") " pod="openstack/glance-default-external-api-0" Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.844337 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1301cc13-44b9-4a6e-b82d-cbea335ebc9a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1301cc13-44b9-4a6e-b82d-cbea335ebc9a\") " pod="openstack/glance-default-external-api-0" Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.945968 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1301cc13-44b9-4a6e-b82d-cbea335ebc9a-logs\") pod \"glance-default-external-api-0\" (UID: \"1301cc13-44b9-4a6e-b82d-cbea335ebc9a\") " pod="openstack/glance-default-external-api-0" Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.946013 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"1301cc13-44b9-4a6e-b82d-cbea335ebc9a\") " pod="openstack/glance-default-external-api-0" Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.946042 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1301cc13-44b9-4a6e-b82d-cbea335ebc9a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1301cc13-44b9-4a6e-b82d-cbea335ebc9a\") " pod="openstack/glance-default-external-api-0" Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.946063 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgf4t\" (UniqueName: \"kubernetes.io/projected/1301cc13-44b9-4a6e-b82d-cbea335ebc9a-kube-api-access-tgf4t\") pod \"glance-default-external-api-0\" (UID: \"1301cc13-44b9-4a6e-b82d-cbea335ebc9a\") " pod="openstack/glance-default-external-api-0" Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.946127 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1301cc13-44b9-4a6e-b82d-cbea335ebc9a-scripts\") pod \"glance-default-external-api-0\" (UID: \"1301cc13-44b9-4a6e-b82d-cbea335ebc9a\") " pod="openstack/glance-default-external-api-0" Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.946151 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1301cc13-44b9-4a6e-b82d-cbea335ebc9a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1301cc13-44b9-4a6e-b82d-cbea335ebc9a\") " pod="openstack/glance-default-external-api-0" Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.946192 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1301cc13-44b9-4a6e-b82d-cbea335ebc9a-config-data\") pod \"glance-default-external-api-0\" (UID: \"1301cc13-44b9-4a6e-b82d-cbea335ebc9a\") " pod="openstack/glance-default-external-api-0" Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.946212 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1301cc13-44b9-4a6e-b82d-cbea335ebc9a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1301cc13-44b9-4a6e-b82d-cbea335ebc9a\") " pod="openstack/glance-default-external-api-0" Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.947090 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/1301cc13-44b9-4a6e-b82d-cbea335ebc9a-logs\") pod \"glance-default-external-api-0\" (UID: \"1301cc13-44b9-4a6e-b82d-cbea335ebc9a\") " pod="openstack/glance-default-external-api-0" Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.947764 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1301cc13-44b9-4a6e-b82d-cbea335ebc9a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1301cc13-44b9-4a6e-b82d-cbea335ebc9a\") " pod="openstack/glance-default-external-api-0" Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.948080 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"1301cc13-44b9-4a6e-b82d-cbea335ebc9a\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.956973 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1301cc13-44b9-4a6e-b82d-cbea335ebc9a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1301cc13-44b9-4a6e-b82d-cbea335ebc9a\") " pod="openstack/glance-default-external-api-0" Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.959469 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1301cc13-44b9-4a6e-b82d-cbea335ebc9a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1301cc13-44b9-4a6e-b82d-cbea335ebc9a\") " pod="openstack/glance-default-external-api-0" Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.960719 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1301cc13-44b9-4a6e-b82d-cbea335ebc9a-scripts\") pod \"glance-default-external-api-0\" (UID: \"1301cc13-44b9-4a6e-b82d-cbea335ebc9a\") " pod="openstack/glance-default-external-api-0" Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.963675 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1301cc13-44b9-4a6e-b82d-cbea335ebc9a-config-data\") pod \"glance-default-external-api-0\" (UID: \"1301cc13-44b9-4a6e-b82d-cbea335ebc9a\") " pod="openstack/glance-default-external-api-0" Nov 27 17:01:07 crc kubenswrapper[4954]: I1127 17:01:07.976336 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgf4t\" (UniqueName: \"kubernetes.io/projected/1301cc13-44b9-4a6e-b82d-cbea335ebc9a-kube-api-access-tgf4t\") pod \"glance-default-external-api-0\" (UID: \"1301cc13-44b9-4a6e-b82d-cbea335ebc9a\") " pod="openstack/glance-default-external-api-0" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.037331 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"1301cc13-44b9-4a6e-b82d-cbea335ebc9a\") " pod="openstack/glance-default-external-api-0" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.105998 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-h4kmm" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.120575 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.169763 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3caed139-7f27-4afa-b159-ba85dc64bd91-operator-scripts\") pod \"3caed139-7f27-4afa-b159-ba85dc64bd91\" (UID: \"3caed139-7f27-4afa-b159-ba85dc64bd91\") " Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.169925 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzphv\" (UniqueName: \"kubernetes.io/projected/3caed139-7f27-4afa-b159-ba85dc64bd91-kube-api-access-jzphv\") pod \"3caed139-7f27-4afa-b159-ba85dc64bd91\" (UID: \"3caed139-7f27-4afa-b159-ba85dc64bd91\") " Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.179129 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3caed139-7f27-4afa-b159-ba85dc64bd91-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3caed139-7f27-4afa-b159-ba85dc64bd91" (UID: "3caed139-7f27-4afa-b159-ba85dc64bd91"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.185426 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3caed139-7f27-4afa-b159-ba85dc64bd91-kube-api-access-jzphv" (OuterVolumeSpecName: "kube-api-access-jzphv") pod "3caed139-7f27-4afa-b159-ba85dc64bd91" (UID: "3caed139-7f27-4afa-b159-ba85dc64bd91"). InnerVolumeSpecName "kube-api-access-jzphv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.273013 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3caed139-7f27-4afa-b159-ba85dc64bd91-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.273052 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzphv\" (UniqueName: \"kubernetes.io/projected/3caed139-7f27-4afa-b159-ba85dc64bd91-kube-api-access-jzphv\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.372521 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-kn544" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.405820 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4663-account-create-update-5ghgg" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.406362 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-5dndx" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.422792 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7dca-account-create-update-jwm8d" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.438881 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-5e54-account-create-update-x5fjh" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.481760 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8f4hr\" (UniqueName: \"kubernetes.io/projected/750c1d74-a850-4e62-9680-cd65e44a254c-kube-api-access-8f4hr\") pod \"750c1d74-a850-4e62-9680-cd65e44a254c\" (UID: \"750c1d74-a850-4e62-9680-cd65e44a254c\") " Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.481815 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53dbb9f3-5011-4342-a4df-bcfbe5991cbf-operator-scripts\") pod \"53dbb9f3-5011-4342-a4df-bcfbe5991cbf\" (UID: \"53dbb9f3-5011-4342-a4df-bcfbe5991cbf\") " Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.481899 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/750c1d74-a850-4e62-9680-cd65e44a254c-operator-scripts\") pod \"750c1d74-a850-4e62-9680-cd65e44a254c\" (UID: \"750c1d74-a850-4e62-9680-cd65e44a254c\") " Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.482059 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hznrp\" (UniqueName: \"kubernetes.io/projected/81fb41af-f5ea-444d-aea7-a9b50124e2b4-kube-api-access-hznrp\") pod \"81fb41af-f5ea-444d-aea7-a9b50124e2b4\" (UID: \"81fb41af-f5ea-444d-aea7-a9b50124e2b4\") " Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.482079 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjbbc\" (UniqueName: \"kubernetes.io/projected/7c040041-36d3-4ba0-b7c4-5164dee45115-kube-api-access-fjbbc\") pod \"7c040041-36d3-4ba0-b7c4-5164dee45115\" (UID: \"7c040041-36d3-4ba0-b7c4-5164dee45115\") " Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.482256 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a8f7cd7-b71b-4fa5-a4fa-83a528b78177-operator-scripts\") pod \"7a8f7cd7-b71b-4fa5-a4fa-83a528b78177\" (UID: \"7a8f7cd7-b71b-4fa5-a4fa-83a528b78177\") " Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.482291 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81fb41af-f5ea-444d-aea7-a9b50124e2b4-operator-scripts\") pod \"81fb41af-f5ea-444d-aea7-a9b50124e2b4\" (UID: \"81fb41af-f5ea-444d-aea7-a9b50124e2b4\") " Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.482308 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c040041-36d3-4ba0-b7c4-5164dee45115-operator-scripts\") pod \"7c040041-36d3-4ba0-b7c4-5164dee45115\" (UID: \"7c040041-36d3-4ba0-b7c4-5164dee45115\") " Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.482332 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bskkv\" (UniqueName: \"kubernetes.io/projected/7a8f7cd7-b71b-4fa5-a4fa-83a528b78177-kube-api-access-bskkv\") pod \"7a8f7cd7-b71b-4fa5-a4fa-83a528b78177\" (UID: \"7a8f7cd7-b71b-4fa5-a4fa-83a528b78177\") " Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.482333 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/53dbb9f3-5011-4342-a4df-bcfbe5991cbf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "53dbb9f3-5011-4342-a4df-bcfbe5991cbf" (UID: "53dbb9f3-5011-4342-a4df-bcfbe5991cbf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.482348 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dgw7\" (UniqueName: \"kubernetes.io/projected/53dbb9f3-5011-4342-a4df-bcfbe5991cbf-kube-api-access-6dgw7\") pod \"53dbb9f3-5011-4342-a4df-bcfbe5991cbf\" (UID: \"53dbb9f3-5011-4342-a4df-bcfbe5991cbf\") " Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.483845 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c040041-36d3-4ba0-b7c4-5164dee45115-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7c040041-36d3-4ba0-b7c4-5164dee45115" (UID: "7c040041-36d3-4ba0-b7c4-5164dee45115"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.483862 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/750c1d74-a850-4e62-9680-cd65e44a254c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "750c1d74-a850-4e62-9680-cd65e44a254c" (UID: "750c1d74-a850-4e62-9680-cd65e44a254c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.484483 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a8f7cd7-b71b-4fa5-a4fa-83a528b78177-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7a8f7cd7-b71b-4fa5-a4fa-83a528b78177" (UID: "7a8f7cd7-b71b-4fa5-a4fa-83a528b78177"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.484796 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c040041-36d3-4ba0-b7c4-5164dee45115-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.484812 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53dbb9f3-5011-4342-a4df-bcfbe5991cbf-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.484820 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/750c1d74-a850-4e62-9680-cd65e44a254c-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.484829 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a8f7cd7-b71b-4fa5-a4fa-83a528b78177-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.486765 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81fb41af-f5ea-444d-aea7-a9b50124e2b4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "81fb41af-f5ea-444d-aea7-a9b50124e2b4" (UID: "81fb41af-f5ea-444d-aea7-a9b50124e2b4"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.488898 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/750c1d74-a850-4e62-9680-cd65e44a254c-kube-api-access-8f4hr" (OuterVolumeSpecName: "kube-api-access-8f4hr") pod "750c1d74-a850-4e62-9680-cd65e44a254c" (UID: "750c1d74-a850-4e62-9680-cd65e44a254c"). InnerVolumeSpecName "kube-api-access-8f4hr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.490956 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53dbb9f3-5011-4342-a4df-bcfbe5991cbf-kube-api-access-6dgw7" (OuterVolumeSpecName: "kube-api-access-6dgw7") pod "53dbb9f3-5011-4342-a4df-bcfbe5991cbf" (UID: "53dbb9f3-5011-4342-a4df-bcfbe5991cbf"). InnerVolumeSpecName "kube-api-access-6dgw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.494308 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c040041-36d3-4ba0-b7c4-5164dee45115-kube-api-access-fjbbc" (OuterVolumeSpecName: "kube-api-access-fjbbc") pod "7c040041-36d3-4ba0-b7c4-5164dee45115" (UID: "7c040041-36d3-4ba0-b7c4-5164dee45115"). InnerVolumeSpecName "kube-api-access-fjbbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.497002 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81fb41af-f5ea-444d-aea7-a9b50124e2b4-kube-api-access-hznrp" (OuterVolumeSpecName: "kube-api-access-hznrp") pod "81fb41af-f5ea-444d-aea7-a9b50124e2b4" (UID: "81fb41af-f5ea-444d-aea7-a9b50124e2b4"). InnerVolumeSpecName "kube-api-access-hznrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.497824 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a8f7cd7-b71b-4fa5-a4fa-83a528b78177-kube-api-access-bskkv" (OuterVolumeSpecName: "kube-api-access-bskkv") pod "7a8f7cd7-b71b-4fa5-a4fa-83a528b78177" (UID: "7a8f7cd7-b71b-4fa5-a4fa-83a528b78177"). InnerVolumeSpecName "kube-api-access-bskkv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.561252 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-5dndx" event={"ID":"7c040041-36d3-4ba0-b7c4-5164dee45115","Type":"ContainerDied","Data":"f18895504ceba503acea3aed816d97e5f7a1bafcb5e38a8abd1541f79abd5fcf"} Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.561294 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f18895504ceba503acea3aed816d97e5f7a1bafcb5e38a8abd1541f79abd5fcf" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.561349 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-5dndx" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.579833 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-h4kmm" event={"ID":"3caed139-7f27-4afa-b159-ba85dc64bd91","Type":"ContainerDied","Data":"d76830d0617896df80c9d9e2d12ec57a5cd16ab93d7e6988ddd6a413d497282d"} Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.579870 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d76830d0617896df80c9d9e2d12ec57a5cd16ab93d7e6988ddd6a413d497282d" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.580230 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-h4kmm" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.582210 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.583799 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-kn544" event={"ID":"7a8f7cd7-b71b-4fa5-a4fa-83a528b78177","Type":"ContainerDied","Data":"4d0d8bbbbf4997c651f6d1e947c9ee4a9093e38d653bfd8bc752a6dd5d102629"} Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.583838 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d0d8bbbbf4997c651f6d1e947c9ee4a9093e38d653bfd8bc752a6dd5d102629" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.583895 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-kn544" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.585875 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bskkv\" (UniqueName: \"kubernetes.io/projected/7a8f7cd7-b71b-4fa5-a4fa-83a528b78177-kube-api-access-bskkv\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.585900 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dgw7\" (UniqueName: \"kubernetes.io/projected/53dbb9f3-5011-4342-a4df-bcfbe5991cbf-kube-api-access-6dgw7\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.585911 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8f4hr\" (UniqueName: \"kubernetes.io/projected/750c1d74-a850-4e62-9680-cd65e44a254c-kube-api-access-8f4hr\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.585921 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hznrp\" (UniqueName: \"kubernetes.io/projected/81fb41af-f5ea-444d-aea7-a9b50124e2b4-kube-api-access-hznrp\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.585930 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjbbc\" (UniqueName: \"kubernetes.io/projected/7c040041-36d3-4ba0-b7c4-5164dee45115-kube-api-access-fjbbc\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.585939 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81fb41af-f5ea-444d-aea7-a9b50124e2b4-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.586210 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5e54-account-create-update-x5fjh" 
event={"ID":"750c1d74-a850-4e62-9680-cd65e44a254c","Type":"ContainerDied","Data":"5d3786ff7d59a67ca0c1fd1e232b653f8f6c18fe8d91ad34ea3c2e8937482eab"} Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.586253 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d3786ff7d59a67ca0c1fd1e232b653f8f6c18fe8d91ad34ea3c2e8937482eab" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.586322 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-5e54-account-create-update-x5fjh" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.594896 4954 generic.go:334] "Generic (PLEG): container finished" podID="0db2964c-faef-4154-b502-1231f6762e37" containerID="64362aa36b1fa29fc2a7979add106067232b5d5cd48dd9dcd3d2293580c21015" exitCode=0 Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.594984 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0db2964c-faef-4154-b502-1231f6762e37","Type":"ContainerDied","Data":"64362aa36b1fa29fc2a7979add106067232b5d5cd48dd9dcd3d2293580c21015"} Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.595001 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.595022 4954 scope.go:117] "RemoveContainer" containerID="64362aa36b1fa29fc2a7979add106067232b5d5cd48dd9dcd3d2293580c21015" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.595011 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0db2964c-faef-4154-b502-1231f6762e37","Type":"ContainerDied","Data":"89817459e4827cf5bb3e3f4f3fe112bb8c315811261e793a3eb92bf884d3fdef"} Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.598238 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7dca-account-create-update-jwm8d" event={"ID":"81fb41af-f5ea-444d-aea7-a9b50124e2b4","Type":"ContainerDied","Data":"6ed26a78fd475f9eaf67253ca13458e6049575575859d4c44d465dbc15e1baa8"} Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.598268 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ed26a78fd475f9eaf67253ca13458e6049575575859d4c44d465dbc15e1baa8" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.599250 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7dca-account-create-update-jwm8d" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.611134 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4663-account-create-update-5ghgg" event={"ID":"53dbb9f3-5011-4342-a4df-bcfbe5991cbf","Type":"ContainerDied","Data":"e51dd31e7dac5b73b9a1cd4cd95e1401639db6fd1e54bd36b0a57752d0f3c5ed"} Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.611178 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e51dd31e7dac5b73b9a1cd4cd95e1401639db6fd1e54bd36b0a57752d0f3c5ed" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.611243 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-4663-account-create-update-5ghgg" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.686569 4954 scope.go:117] "RemoveContainer" containerID="0ab6aee1db3e5fab4616639290f436d8737ec84f6f9b45031bceb1eb2bd54c22" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.687144 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0db2964c-faef-4154-b502-1231f6762e37-config-data\") pod \"0db2964c-faef-4154-b502-1231f6762e37\" (UID: \"0db2964c-faef-4154-b502-1231f6762e37\") " Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.687288 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9wk5\" (UniqueName: \"kubernetes.io/projected/0db2964c-faef-4154-b502-1231f6762e37-kube-api-access-f9wk5\") pod \"0db2964c-faef-4154-b502-1231f6762e37\" (UID: \"0db2964c-faef-4154-b502-1231f6762e37\") " Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.687342 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0db2964c-faef-4154-b502-1231f6762e37-scripts\") pod \"0db2964c-faef-4154-b502-1231f6762e37\" (UID: \"0db2964c-faef-4154-b502-1231f6762e37\") " Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.687403 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0db2964c-faef-4154-b502-1231f6762e37-logs\") pod \"0db2964c-faef-4154-b502-1231f6762e37\" (UID: \"0db2964c-faef-4154-b502-1231f6762e37\") " Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.687446 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"0db2964c-faef-4154-b502-1231f6762e37\" (UID: \"0db2964c-faef-4154-b502-1231f6762e37\") " Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.687502 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0db2964c-faef-4154-b502-1231f6762e37-httpd-run\") pod \"0db2964c-faef-4154-b502-1231f6762e37\" (UID: \"0db2964c-faef-4154-b502-1231f6762e37\") " Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.687706 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0db2964c-faef-4154-b502-1231f6762e37-combined-ca-bundle\") pod \"0db2964c-faef-4154-b502-1231f6762e37\" (UID: \"0db2964c-faef-4154-b502-1231f6762e37\") " Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.687837 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0db2964c-faef-4154-b502-1231f6762e37-internal-tls-certs\") pod \"0db2964c-faef-4154-b502-1231f6762e37\" (UID: \"0db2964c-faef-4154-b502-1231f6762e37\") " Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.690926 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0db2964c-faef-4154-b502-1231f6762e37-logs" (OuterVolumeSpecName: "logs") pod "0db2964c-faef-4154-b502-1231f6762e37" (UID: "0db2964c-faef-4154-b502-1231f6762e37"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.690967 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0db2964c-faef-4154-b502-1231f6762e37-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0db2964c-faef-4154-b502-1231f6762e37" (UID: "0db2964c-faef-4154-b502-1231f6762e37"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.703464 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0db2964c-faef-4154-b502-1231f6762e37-scripts" (OuterVolumeSpecName: "scripts") pod "0db2964c-faef-4154-b502-1231f6762e37" (UID: "0db2964c-faef-4154-b502-1231f6762e37"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.705127 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0db2964c-faef-4154-b502-1231f6762e37-kube-api-access-f9wk5" (OuterVolumeSpecName: "kube-api-access-f9wk5") pod "0db2964c-faef-4154-b502-1231f6762e37" (UID: "0db2964c-faef-4154-b502-1231f6762e37"). InnerVolumeSpecName "kube-api-access-f9wk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.709699 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "0db2964c-faef-4154-b502-1231f6762e37" (UID: "0db2964c-faef-4154-b502-1231f6762e37"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.743296 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0db2964c-faef-4154-b502-1231f6762e37-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0db2964c-faef-4154-b502-1231f6762e37" (UID: "0db2964c-faef-4154-b502-1231f6762e37"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.744872 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0db2964c-faef-4154-b502-1231f6762e37-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0db2964c-faef-4154-b502-1231f6762e37" (UID: "0db2964c-faef-4154-b502-1231f6762e37"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.755873 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0db2964c-faef-4154-b502-1231f6762e37-config-data" (OuterVolumeSpecName: "config-data") pod "0db2964c-faef-4154-b502-1231f6762e37" (UID: "0db2964c-faef-4154-b502-1231f6762e37"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.792539 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9wk5\" (UniqueName: \"kubernetes.io/projected/0db2964c-faef-4154-b502-1231f6762e37-kube-api-access-f9wk5\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.792574 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0db2964c-faef-4154-b502-1231f6762e37-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.792599 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0db2964c-faef-4154-b502-1231f6762e37-logs\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.792619 4954 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.792633 4954 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0db2964c-faef-4154-b502-1231f6762e37-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.792646 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0db2964c-faef-4154-b502-1231f6762e37-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.792657 4954 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0db2964c-faef-4154-b502-1231f6762e37-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.792666 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0db2964c-faef-4154-b502-1231f6762e37-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.793533 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="631c9c91-60a4-48e3-aa9a-6333ae35bcf9" path="/var/lib/kubelet/pods/631c9c91-60a4-48e3-aa9a-6333ae35bcf9/volumes" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.795417 4954 scope.go:117] "RemoveContainer" containerID="64362aa36b1fa29fc2a7979add106067232b5d5cd48dd9dcd3d2293580c21015" Nov 27 17:01:08 crc kubenswrapper[4954]: E1127 17:01:08.796079 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64362aa36b1fa29fc2a7979add106067232b5d5cd48dd9dcd3d2293580c21015\": container with ID starting with 64362aa36b1fa29fc2a7979add106067232b5d5cd48dd9dcd3d2293580c21015 not found: ID does not exist" containerID="64362aa36b1fa29fc2a7979add106067232b5d5cd48dd9dcd3d2293580c21015" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.796144 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64362aa36b1fa29fc2a7979add106067232b5d5cd48dd9dcd3d2293580c21015"} err="failed to get container status \"64362aa36b1fa29fc2a7979add106067232b5d5cd48dd9dcd3d2293580c21015\": rpc error: code = NotFound desc = could not find container \"64362aa36b1fa29fc2a7979add106067232b5d5cd48dd9dcd3d2293580c21015\": container with ID starting with 
64362aa36b1fa29fc2a7979add106067232b5d5cd48dd9dcd3d2293580c21015 not found: ID does not exist" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.796173 4954 scope.go:117] "RemoveContainer" containerID="0ab6aee1db3e5fab4616639290f436d8737ec84f6f9b45031bceb1eb2bd54c22" Nov 27 17:01:08 crc kubenswrapper[4954]: E1127 17:01:08.797737 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ab6aee1db3e5fab4616639290f436d8737ec84f6f9b45031bceb1eb2bd54c22\": container with ID starting with 0ab6aee1db3e5fab4616639290f436d8737ec84f6f9b45031bceb1eb2bd54c22 not found: ID does not exist" containerID="0ab6aee1db3e5fab4616639290f436d8737ec84f6f9b45031bceb1eb2bd54c22" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.797781 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ab6aee1db3e5fab4616639290f436d8737ec84f6f9b45031bceb1eb2bd54c22"} err="failed to get container status \"0ab6aee1db3e5fab4616639290f436d8737ec84f6f9b45031bceb1eb2bd54c22\": rpc error: code = NotFound desc = could not find container \"0ab6aee1db3e5fab4616639290f436d8737ec84f6f9b45031bceb1eb2bd54c22\": container with ID starting with 0ab6aee1db3e5fab4616639290f436d8737ec84f6f9b45031bceb1eb2bd54c22 not found: ID does not exist" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.802020 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.827838 4954 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.894929 4954 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:08 crc kubenswrapper[4954]: I1127 17:01:08.971000 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.008534 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.022814 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 27 17:01:09 crc kubenswrapper[4954]: E1127 17:01:09.023306 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="750c1d74-a850-4e62-9680-cd65e44a254c" containerName="mariadb-account-create-update" Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.023329 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="750c1d74-a850-4e62-9680-cd65e44a254c" containerName="mariadb-account-create-update" Nov 27 17:01:09 crc kubenswrapper[4954]: E1127 17:01:09.023343 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0db2964c-faef-4154-b502-1231f6762e37" containerName="glance-httpd" Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.023351 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="0db2964c-faef-4154-b502-1231f6762e37" containerName="glance-httpd" Nov 27 17:01:09 crc kubenswrapper[4954]: E1127 17:01:09.023366 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53dbb9f3-5011-4342-a4df-bcfbe5991cbf" containerName="mariadb-account-create-update" Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 
17:01:09.023373 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="53dbb9f3-5011-4342-a4df-bcfbe5991cbf" containerName="mariadb-account-create-update" Nov 27 17:01:09 crc kubenswrapper[4954]: E1127 17:01:09.023397 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c040041-36d3-4ba0-b7c4-5164dee45115" containerName="mariadb-database-create" Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.023405 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c040041-36d3-4ba0-b7c4-5164dee45115" containerName="mariadb-database-create" Nov 27 17:01:09 crc kubenswrapper[4954]: E1127 17:01:09.023422 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0db2964c-faef-4154-b502-1231f6762e37" containerName="glance-log" Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.023431 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="0db2964c-faef-4154-b502-1231f6762e37" containerName="glance-log" Nov 27 17:01:09 crc kubenswrapper[4954]: E1127 17:01:09.023449 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a8f7cd7-b71b-4fa5-a4fa-83a528b78177" containerName="mariadb-database-create" Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.023457 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a8f7cd7-b71b-4fa5-a4fa-83a528b78177" containerName="mariadb-database-create" Nov 27 17:01:09 crc kubenswrapper[4954]: E1127 17:01:09.023476 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3caed139-7f27-4afa-b159-ba85dc64bd91" containerName="mariadb-database-create" Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.023483 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="3caed139-7f27-4afa-b159-ba85dc64bd91" containerName="mariadb-database-create" Nov 27 17:01:09 crc kubenswrapper[4954]: E1127 17:01:09.023501 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81fb41af-f5ea-444d-aea7-a9b50124e2b4" containerName="mariadb-account-create-update" Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.023510 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="81fb41af-f5ea-444d-aea7-a9b50124e2b4" containerName="mariadb-account-create-update" Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.023744 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="81fb41af-f5ea-444d-aea7-a9b50124e2b4" containerName="mariadb-account-create-update" Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.023763 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="750c1d74-a850-4e62-9680-cd65e44a254c" containerName="mariadb-account-create-update" Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.023784 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="0db2964c-faef-4154-b502-1231f6762e37" containerName="glance-httpd" Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.023827 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="53dbb9f3-5011-4342-a4df-bcfbe5991cbf" containerName="mariadb-account-create-update" Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.023845 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a8f7cd7-b71b-4fa5-a4fa-83a528b78177" containerName="mariadb-database-create" Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.023864 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="0db2964c-faef-4154-b502-1231f6762e37" containerName="glance-log" Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.023883 4954 
memory_manager.go:354] "RemoveStaleState removing state" podUID="7c040041-36d3-4ba0-b7c4-5164dee45115" containerName="mariadb-database-create"
Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.023892 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="3caed139-7f27-4afa-b159-ba85dc64bd91" containerName="mariadb-database-create"
Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.025074 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.028272 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.028495 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.036649 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.109008 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07\") " pod="openstack/glance-default-internal-api-0"
Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.109144 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07\") " pod="openstack/glance-default-internal-api-0"
Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.109183 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07\") " pod="openstack/glance-default-internal-api-0"
Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.109219 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dhx7\" (UniqueName: \"kubernetes.io/projected/0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07-kube-api-access-5dhx7\") pod \"glance-default-internal-api-0\" (UID: \"0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07\") " pod="openstack/glance-default-internal-api-0"
Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.109255 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07\") " pod="openstack/glance-default-internal-api-0"
Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.109281 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07-logs\") pod \"glance-default-internal-api-0\" (UID: \"0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07\") " pod="openstack/glance-default-internal-api-0"
Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.109327 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07\") " pod="openstack/glance-default-internal-api-0"
Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.109356 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07\") " pod="openstack/glance-default-internal-api-0"
Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.210724 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07\") " pod="openstack/glance-default-internal-api-0"
Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.210772 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07-logs\") pod \"glance-default-internal-api-0\" (UID: \"0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07\") " pod="openstack/glance-default-internal-api-0"
Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.210807 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07\") " pod="openstack/glance-default-internal-api-0"
Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.210838 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07\") " pod="openstack/glance-default-internal-api-0"
Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.210872 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07\") " pod="openstack/glance-default-internal-api-0"
Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.210949 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07\") " pod="openstack/glance-default-internal-api-0"
Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.210973 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07\") " pod="openstack/glance-default-internal-api-0"
Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.211000 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dhx7\" (UniqueName: \"kubernetes.io/projected/0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07-kube-api-access-5dhx7\") pod \"glance-default-internal-api-0\" (UID: \"0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07\") " pod="openstack/glance-default-internal-api-0"
Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.212304 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07-logs\") pod \"glance-default-internal-api-0\" (UID: \"0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07\") " pod="openstack/glance-default-internal-api-0"
Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.212457 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0"
Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.219762 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07\") " pod="openstack/glance-default-internal-api-0"
Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.222511 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07\") " pod="openstack/glance-default-internal-api-0"
Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.224427 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07\") " pod="openstack/glance-default-internal-api-0"
Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.224669 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07\") " pod="openstack/glance-default-internal-api-0"
Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.227245 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07\") " pod="openstack/glance-default-internal-api-0"
Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.233275 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dhx7\" (UniqueName: \"kubernetes.io/projected/0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07-kube-api-access-5dhx7\") pod \"glance-default-internal-api-0\" (UID: \"0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07\") " pod="openstack/glance-default-internal-api-0"
Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.250419 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07\") " pod="openstack/glance-default-internal-api-0"
Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.346214 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.633178 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1301cc13-44b9-4a6e-b82d-cbea335ebc9a","Type":"ContainerStarted","Data":"324b3d4a1cf413bb3d9bee728a30cef086c23f09c66f81aeb88f0ad1a8991494"}
Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.633542 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1301cc13-44b9-4a6e-b82d-cbea335ebc9a","Type":"ContainerStarted","Data":"e480bac59f8d1cf417a75feb133ee0deeef4fae46eeb7d90c52660402d879423"}
Nov 27 17:01:09 crc kubenswrapper[4954]: I1127 17:01:09.901776 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Nov 27 17:01:10 crc kubenswrapper[4954]: I1127 17:01:10.656118 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07","Type":"ContainerStarted","Data":"1d5256e7067286a3d48d87da5128a8dabdb8cfca050c3480342045741e83968d"}
Nov 27 17:01:10 crc kubenswrapper[4954]: I1127 17:01:10.656617 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07","Type":"ContainerStarted","Data":"63790f9d2b311dc5dbf55e046c1f6bb3bd1915959772517e17f88db15788f071"}
Nov 27 17:01:10 crc kubenswrapper[4954]: I1127 17:01:10.658786 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1301cc13-44b9-4a6e-b82d-cbea335ebc9a","Type":"ContainerStarted","Data":"cf4e01715ab8c3caa6e5576b0d254e76c631e7f6ca78ebffadffd55c995bead8"}
Nov 27 17:01:10 crc kubenswrapper[4954]: I1127 17:01:10.679631 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0db2964c-faef-4154-b502-1231f6762e37" path="/var/lib/kubelet/pods/0db2964c-faef-4154-b502-1231f6762e37/volumes"
Nov 27 17:01:10 crc kubenswrapper[4954]: I1127 17:01:10.687940 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.687920741 podStartE2EDuration="3.687920741s" podCreationTimestamp="2025-11-27 17:01:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 17:01:10.684609051 +0000 UTC m=+1382.702049371" watchObservedRunningTime="2025-11-27 17:01:10.687920741 +0000 UTC m=+1382.705361041"
Nov 27 17:01:11 crc kubenswrapper[4954]: I1127 17:01:11.672006 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07","Type":"ContainerStarted","Data":"fb05df32c2b99a4e90e5934d7b515698e032437290aa3b93214f4b9f8f03e9d7"}
Nov 27 17:01:11 crc kubenswrapper[4954]: I1127 17:01:11.730498 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.7304788760000003 podStartE2EDuration="3.730478876s" podCreationTimestamp="2025-11-27 17:01:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 17:01:11.68904291 +0000 UTC m=+1383.706483220" watchObservedRunningTime="2025-11-27 17:01:11.730478876 +0000 UTC m=+1383.747919166"
Nov 27 17:01:13 crc kubenswrapper[4954]: I1127 17:01:13.693234 4954 generic.go:334] "Generic (PLEG): container finished" podID="649d4e07-ae06-43ca-a97d-adaba927165c" containerID="fc0d222ace3e15267e435d26b30e0757f5eb01878f94c36868883abc4afe2bfd" exitCode=0
Nov 27 17:01:13 crc kubenswrapper[4954]: I1127 17:01:13.693496 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"649d4e07-ae06-43ca-a97d-adaba927165c","Type":"ContainerDied","Data":"fc0d222ace3e15267e435d26b30e0757f5eb01878f94c36868883abc4afe2bfd"}
Nov 27 17:01:13 crc kubenswrapper[4954]: E1127 17:01:13.708427 4954 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0db2964c_faef_4154_b502_1231f6762e37.slice\": RecentStats: unable to find data in memory cache]"
Nov 27 17:01:13 crc kubenswrapper[4954]: I1127 17:01:13.828200 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 27 17:01:13 crc kubenswrapper[4954]: I1127 17:01:13.905068 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/649d4e07-ae06-43ca-a97d-adaba927165c-scripts\") pod \"649d4e07-ae06-43ca-a97d-adaba927165c\" (UID: \"649d4e07-ae06-43ca-a97d-adaba927165c\") "
Nov 27 17:01:13 crc kubenswrapper[4954]: I1127 17:01:13.905111 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/649d4e07-ae06-43ca-a97d-adaba927165c-combined-ca-bundle\") pod \"649d4e07-ae06-43ca-a97d-adaba927165c\" (UID: \"649d4e07-ae06-43ca-a97d-adaba927165c\") "
Nov 27 17:01:13 crc kubenswrapper[4954]: I1127 17:01:13.905155 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/649d4e07-ae06-43ca-a97d-adaba927165c-config-data\") pod \"649d4e07-ae06-43ca-a97d-adaba927165c\" (UID: \"649d4e07-ae06-43ca-a97d-adaba927165c\") "
Nov 27 17:01:13 crc kubenswrapper[4954]: I1127 17:01:13.905197 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/649d4e07-ae06-43ca-a97d-adaba927165c-log-httpd\") pod \"649d4e07-ae06-43ca-a97d-adaba927165c\" (UID: \"649d4e07-ae06-43ca-a97d-adaba927165c\") "
Nov 27 17:01:13 crc kubenswrapper[4954]: I1127 17:01:13.905276 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/649d4e07-ae06-43ca-a97d-adaba927165c-sg-core-conf-yaml\") pod \"649d4e07-ae06-43ca-a97d-adaba927165c\" (UID: \"649d4e07-ae06-43ca-a97d-adaba927165c\") "
Nov 27 17:01:13 crc kubenswrapper[4954]: I1127 17:01:13.905314 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6f8fn\" (UniqueName: \"kubernetes.io/projected/649d4e07-ae06-43ca-a97d-adaba927165c-kube-api-access-6f8fn\") pod \"649d4e07-ae06-43ca-a97d-adaba927165c\" (UID: \"649d4e07-ae06-43ca-a97d-adaba927165c\") "
Nov 27 17:01:13 crc kubenswrapper[4954]: I1127 17:01:13.905362 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/649d4e07-ae06-43ca-a97d-adaba927165c-run-httpd\") pod \"649d4e07-ae06-43ca-a97d-adaba927165c\" (UID: \"649d4e07-ae06-43ca-a97d-adaba927165c\") "
Nov 27 17:01:13 crc kubenswrapper[4954]: I1127 17:01:13.905901 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/649d4e07-ae06-43ca-a97d-adaba927165c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "649d4e07-ae06-43ca-a97d-adaba927165c" (UID: "649d4e07-ae06-43ca-a97d-adaba927165c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 27 17:01:13 crc kubenswrapper[4954]: I1127 17:01:13.905999 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/649d4e07-ae06-43ca-a97d-adaba927165c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "649d4e07-ae06-43ca-a97d-adaba927165c" (UID: "649d4e07-ae06-43ca-a97d-adaba927165c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 27 17:01:13 crc kubenswrapper[4954]: I1127 17:01:13.910471 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/649d4e07-ae06-43ca-a97d-adaba927165c-scripts" (OuterVolumeSpecName: "scripts") pod "649d4e07-ae06-43ca-a97d-adaba927165c" (UID: "649d4e07-ae06-43ca-a97d-adaba927165c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 17:01:13 crc kubenswrapper[4954]: I1127 17:01:13.919893 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/649d4e07-ae06-43ca-a97d-adaba927165c-kube-api-access-6f8fn" (OuterVolumeSpecName: "kube-api-access-6f8fn") pod "649d4e07-ae06-43ca-a97d-adaba927165c" (UID: "649d4e07-ae06-43ca-a97d-adaba927165c"). InnerVolumeSpecName "kube-api-access-6f8fn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 27 17:01:13 crc kubenswrapper[4954]: I1127 17:01:13.931239 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/649d4e07-ae06-43ca-a97d-adaba927165c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "649d4e07-ae06-43ca-a97d-adaba927165c" (UID: "649d4e07-ae06-43ca-a97d-adaba927165c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 17:01:13 crc kubenswrapper[4954]: I1127 17:01:13.968780 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/649d4e07-ae06-43ca-a97d-adaba927165c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "649d4e07-ae06-43ca-a97d-adaba927165c" (UID: "649d4e07-ae06-43ca-a97d-adaba927165c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 17:01:13 crc kubenswrapper[4954]: I1127 17:01:13.992714 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/649d4e07-ae06-43ca-a97d-adaba927165c-config-data" (OuterVolumeSpecName: "config-data") pod "649d4e07-ae06-43ca-a97d-adaba927165c" (UID: "649d4e07-ae06-43ca-a97d-adaba927165c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.007012 4954 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/649d4e07-ae06-43ca-a97d-adaba927165c-run-httpd\") on node \"crc\" DevicePath \"\""
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.007046 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/649d4e07-ae06-43ca-a97d-adaba927165c-scripts\") on node \"crc\" DevicePath \"\""
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.007056 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/649d4e07-ae06-43ca-a97d-adaba927165c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.007065 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/649d4e07-ae06-43ca-a97d-adaba927165c-config-data\") on node \"crc\" DevicePath \"\""
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.007074 4954 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/649d4e07-ae06-43ca-a97d-adaba927165c-log-httpd\") on node \"crc\" DevicePath \"\""
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.007082 4954 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/649d4e07-ae06-43ca-a97d-adaba927165c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.007090 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6f8fn\" (UniqueName: \"kubernetes.io/projected/649d4e07-ae06-43ca-a97d-adaba927165c-kube-api-access-6f8fn\") on node \"crc\" DevicePath \"\""
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.104152 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="17240c1b-4f70-4182-9b68-dac293e719ef" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.164:3000/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.595889 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zsv66"]
Nov 27 17:01:14 crc kubenswrapper[4954]: E1127 17:01:14.597811 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="649d4e07-ae06-43ca-a97d-adaba927165c" containerName="ceilometer-notification-agent"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.597840 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="649d4e07-ae06-43ca-a97d-adaba927165c" containerName="ceilometer-notification-agent"
Nov 27 17:01:14 crc kubenswrapper[4954]: E1127 17:01:14.597866 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="649d4e07-ae06-43ca-a97d-adaba927165c" containerName="ceilometer-central-agent"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.597875 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="649d4e07-ae06-43ca-a97d-adaba927165c" containerName="ceilometer-central-agent"
Nov 27 17:01:14 crc kubenswrapper[4954]: E1127 17:01:14.597900 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="649d4e07-ae06-43ca-a97d-adaba927165c" containerName="proxy-httpd"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.597908 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="649d4e07-ae06-43ca-a97d-adaba927165c" containerName="proxy-httpd"
Nov 27 17:01:14 crc kubenswrapper[4954]: E1127 17:01:14.597932 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="649d4e07-ae06-43ca-a97d-adaba927165c" containerName="sg-core"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.597939 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="649d4e07-ae06-43ca-a97d-adaba927165c" containerName="sg-core"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.598197 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="649d4e07-ae06-43ca-a97d-adaba927165c" containerName="proxy-httpd"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.598220 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="649d4e07-ae06-43ca-a97d-adaba927165c" containerName="ceilometer-central-agent"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.598231 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="649d4e07-ae06-43ca-a97d-adaba927165c" containerName="sg-core"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.598242 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="649d4e07-ae06-43ca-a97d-adaba927165c" containerName="ceilometer-notification-agent"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.598971 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zsv66"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.600816 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.601332 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-t2xkq"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.603421 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.613917 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zsv66"]
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.614994 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w4gq\" (UniqueName: \"kubernetes.io/projected/aef92b7c-4671-4ac0-9a4e-f76233eb4c8e-kube-api-access-4w4gq\") pod \"nova-cell0-conductor-db-sync-zsv66\" (UID: \"aef92b7c-4671-4ac0-9a4e-f76233eb4c8e\") " pod="openstack/nova-cell0-conductor-db-sync-zsv66"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.615100 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aef92b7c-4671-4ac0-9a4e-f76233eb4c8e-scripts\") pod \"nova-cell0-conductor-db-sync-zsv66\" (UID: \"aef92b7c-4671-4ac0-9a4e-f76233eb4c8e\") " pod="openstack/nova-cell0-conductor-db-sync-zsv66"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.615158 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aef92b7c-4671-4ac0-9a4e-f76233eb4c8e-config-data\") pod \"nova-cell0-conductor-db-sync-zsv66\" (UID: \"aef92b7c-4671-4ac0-9a4e-f76233eb4c8e\") " pod="openstack/nova-cell0-conductor-db-sync-zsv66"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.615191 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aef92b7c-4671-4ac0-9a4e-f76233eb4c8e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-zsv66\" (UID: \"aef92b7c-4671-4ac0-9a4e-f76233eb4c8e\") " pod="openstack/nova-cell0-conductor-db-sync-zsv66"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.707087 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"649d4e07-ae06-43ca-a97d-adaba927165c","Type":"ContainerDied","Data":"75d9db3196ab9b05f3565c88cfc2d7fb6c0d33d9cca401ffa712e47dfb748855"}
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.707440 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.707771 4954 scope.go:117] "RemoveContainer" containerID="3dd49f03e2c989f7eceadb71fb0a841a4e7a7df6bec540b365fede09ca234942"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.716932 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aef92b7c-4671-4ac0-9a4e-f76233eb4c8e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-zsv66\" (UID: \"aef92b7c-4671-4ac0-9a4e-f76233eb4c8e\") " pod="openstack/nova-cell0-conductor-db-sync-zsv66"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.717086 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w4gq\" (UniqueName: \"kubernetes.io/projected/aef92b7c-4671-4ac0-9a4e-f76233eb4c8e-kube-api-access-4w4gq\") pod \"nova-cell0-conductor-db-sync-zsv66\" (UID: \"aef92b7c-4671-4ac0-9a4e-f76233eb4c8e\") " pod="openstack/nova-cell0-conductor-db-sync-zsv66"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.717130 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aef92b7c-4671-4ac0-9a4e-f76233eb4c8e-scripts\") pod \"nova-cell0-conductor-db-sync-zsv66\" (UID: \"aef92b7c-4671-4ac0-9a4e-f76233eb4c8e\") " pod="openstack/nova-cell0-conductor-db-sync-zsv66"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.717184 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aef92b7c-4671-4ac0-9a4e-f76233eb4c8e-config-data\") pod \"nova-cell0-conductor-db-sync-zsv66\" (UID: \"aef92b7c-4671-4ac0-9a4e-f76233eb4c8e\") " pod="openstack/nova-cell0-conductor-db-sync-zsv66"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.722168 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aef92b7c-4671-4ac0-9a4e-f76233eb4c8e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-zsv66\" (UID: \"aef92b7c-4671-4ac0-9a4e-f76233eb4c8e\") " pod="openstack/nova-cell0-conductor-db-sync-zsv66"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.722316 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aef92b7c-4671-4ac0-9a4e-f76233eb4c8e-scripts\") pod \"nova-cell0-conductor-db-sync-zsv66\" (UID: \"aef92b7c-4671-4ac0-9a4e-f76233eb4c8e\") " pod="openstack/nova-cell0-conductor-db-sync-zsv66"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.729518 4954 scope.go:117] "RemoveContainer" containerID="99537b9b06f7e0f99005c529ea7be378ad8b52b0cab83f49b9bad645e4d59705"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.744323 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aef92b7c-4671-4ac0-9a4e-f76233eb4c8e-config-data\") pod \"nova-cell0-conductor-db-sync-zsv66\" (UID: \"aef92b7c-4671-4ac0-9a4e-f76233eb4c8e\") " pod="openstack/nova-cell0-conductor-db-sync-zsv66"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.750827 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w4gq\" (UniqueName: \"kubernetes.io/projected/aef92b7c-4671-4ac0-9a4e-f76233eb4c8e-kube-api-access-4w4gq\") pod \"nova-cell0-conductor-db-sync-zsv66\" (UID: \"aef92b7c-4671-4ac0-9a4e-f76233eb4c8e\") " pod="openstack/nova-cell0-conductor-db-sync-zsv66"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.779540 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.793650 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.802489 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.804861 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.807565 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.807839 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.813711 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.823875 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/84b84d83-c981-4800-b508-f856f5abd02b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"84b84d83-c981-4800-b508-f856f5abd02b\") " pod="openstack/ceilometer-0"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.827854 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84b84d83-c981-4800-b508-f856f5abd02b-config-data\") pod \"ceilometer-0\" (UID: \"84b84d83-c981-4800-b508-f856f5abd02b\") " pod="openstack/ceilometer-0"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.828147 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84b84d83-c981-4800-b508-f856f5abd02b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"84b84d83-c981-4800-b508-f856f5abd02b\") " pod="openstack/ceilometer-0"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.828338 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w226j\" (UniqueName: \"kubernetes.io/projected/84b84d83-c981-4800-b508-f856f5abd02b-kube-api-access-w226j\") pod \"ceilometer-0\" (UID: \"84b84d83-c981-4800-b508-f856f5abd02b\") " pod="openstack/ceilometer-0"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.828475 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84b84d83-c981-4800-b508-f856f5abd02b-scripts\") pod \"ceilometer-0\" (UID: \"84b84d83-c981-4800-b508-f856f5abd02b\") " pod="openstack/ceilometer-0"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.828679 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84b84d83-c981-4800-b508-f856f5abd02b-run-httpd\") pod \"ceilometer-0\" (UID: \"84b84d83-c981-4800-b508-f856f5abd02b\") " pod="openstack/ceilometer-0"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.828801 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84b84d83-c981-4800-b508-f856f5abd02b-log-httpd\") pod \"ceilometer-0\" (UID: \"84b84d83-c981-4800-b508-f856f5abd02b\") " pod="openstack/ceilometer-0"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.830509 4954 scope.go:117] "RemoveContainer" containerID="80ef7e6c4072a41ac509b4180af504fd538ae9a77d325e0680e876782611f7f8"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.856260 4954 scope.go:117] "RemoveContainer" containerID="fc0d222ace3e15267e435d26b30e0757f5eb01878f94c36868883abc4afe2bfd"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.918685 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zsv66"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.930258 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84b84d83-c981-4800-b508-f856f5abd02b-scripts\") pod \"ceilometer-0\" (UID: \"84b84d83-c981-4800-b508-f856f5abd02b\") " pod="openstack/ceilometer-0"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.930323 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84b84d83-c981-4800-b508-f856f5abd02b-run-httpd\") pod \"ceilometer-0\" (UID: \"84b84d83-c981-4800-b508-f856f5abd02b\") " pod="openstack/ceilometer-0"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.930354 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84b84d83-c981-4800-b508-f856f5abd02b-log-httpd\") pod \"ceilometer-0\" (UID: \"84b84d83-c981-4800-b508-f856f5abd02b\") " pod="openstack/ceilometer-0"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.930385 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/84b84d83-c981-4800-b508-f856f5abd02b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"84b84d83-c981-4800-b508-f856f5abd02b\") " pod="openstack/ceilometer-0"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.930425 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84b84d83-c981-4800-b508-f856f5abd02b-config-data\") pod \"ceilometer-0\" (UID: \"84b84d83-c981-4800-b508-f856f5abd02b\") " pod="openstack/ceilometer-0"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.930467 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84b84d83-c981-4800-b508-f856f5abd02b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"84b84d83-c981-4800-b508-f856f5abd02b\") " pod="openstack/ceilometer-0"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.930496 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w226j\" (UniqueName: \"kubernetes.io/projected/84b84d83-c981-4800-b508-f856f5abd02b-kube-api-access-w226j\") pod \"ceilometer-0\" (UID: \"84b84d83-c981-4800-b508-f856f5abd02b\") " pod="openstack/ceilometer-0"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.933213 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84b84d83-c981-4800-b508-f856f5abd02b-run-httpd\") pod \"ceilometer-0\" (UID: \"84b84d83-c981-4800-b508-f856f5abd02b\") " pod="openstack/ceilometer-0"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.934356 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84b84d83-c981-4800-b508-f856f5abd02b-log-httpd\") pod \"ceilometer-0\" (UID: \"84b84d83-c981-4800-b508-f856f5abd02b\") " pod="openstack/ceilometer-0"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.935804 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/84b84d83-c981-4800-b508-f856f5abd02b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"84b84d83-c981-4800-b508-f856f5abd02b\") " pod="openstack/ceilometer-0"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.936466 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84b84d83-c981-4800-b508-f856f5abd02b-scripts\") pod \"ceilometer-0\" (UID: \"84b84d83-c981-4800-b508-f856f5abd02b\") " pod="openstack/ceilometer-0"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.936596 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84b84d83-c981-4800-b508-f856f5abd02b-config-data\") pod \"ceilometer-0\" (UID: \"84b84d83-c981-4800-b508-f856f5abd02b\") " pod="openstack/ceilometer-0"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.938083 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84b84d83-c981-4800-b508-f856f5abd02b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"84b84d83-c981-4800-b508-f856f5abd02b\") " pod="openstack/ceilometer-0"
Nov 27 17:01:14 crc kubenswrapper[4954]: I1127 17:01:14.948464 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w226j\" (UniqueName: \"kubernetes.io/projected/84b84d83-c981-4800-b508-f856f5abd02b-kube-api-access-w226j\") pod \"ceilometer-0\" (UID: \"84b84d83-c981-4800-b508-f856f5abd02b\") " pod="openstack/ceilometer-0"
Nov 27 17:01:15 crc kubenswrapper[4954]: I1127 17:01:15.133408 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 27 17:01:15 crc kubenswrapper[4954]: I1127 17:01:15.414801 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zsv66"]
Nov 27 17:01:15 crc kubenswrapper[4954]: W1127 17:01:15.418475 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaef92b7c_4671_4ac0_9a4e_f76233eb4c8e.slice/crio-a33e54f05f3c2e0b4126d6c7fd3cc78b06b602876627a0f69cfb5b28bb5e6134 WatchSource:0}: Error finding container a33e54f05f3c2e0b4126d6c7fd3cc78b06b602876627a0f69cfb5b28bb5e6134: Status 404 returned error can't find the container with id a33e54f05f3c2e0b4126d6c7fd3cc78b06b602876627a0f69cfb5b28bb5e6134
Nov 27 17:01:15 crc kubenswrapper[4954]: I1127 17:01:15.609993 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Nov 27 17:01:15 crc kubenswrapper[4954]: I1127 17:01:15.716665 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84b84d83-c981-4800-b508-f856f5abd02b","Type":"ContainerStarted","Data":"dc2fbe873a1ddfc3ae6776dc53ce5eb58f9087bf3d28a560b70877a6369b7130"}
Nov 27 17:01:15 crc kubenswrapper[4954]: I1127 17:01:15.720098 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zsv66" event={"ID":"aef92b7c-4671-4ac0-9a4e-f76233eb4c8e","Type":"ContainerStarted","Data":"a33e54f05f3c2e0b4126d6c7fd3cc78b06b602876627a0f69cfb5b28bb5e6134"}
Nov 27 17:01:16 crc kubenswrapper[4954]: I1127 17:01:16.673980 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="649d4e07-ae06-43ca-a97d-adaba927165c" path="/var/lib/kubelet/pods/649d4e07-ae06-43ca-a97d-adaba927165c/volumes"
Nov 27 17:01:16 crc kubenswrapper[4954]: I1127 17:01:16.730166 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84b84d83-c981-4800-b508-f856f5abd02b","Type":"ContainerStarted","Data":"6d25f8ca7e86613d4319938e2d5ae3dca903f8fc0fa02e8c4838e1c850d48759"}
Nov 27 17:01:17 crc kubenswrapper[4954]: I1127 17:01:17.743242 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84b84d83-c981-4800-b508-f856f5abd02b","Type":"ContainerStarted","Data":"bc4691d97f7d1704035cac745f8c7650452ac5e77af6b2749f937c1ba82dafab"}
Nov 27 17:01:18 crc kubenswrapper[4954]: I1127 17:01:18.133008 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Nov 27 17:01:18 crc kubenswrapper[4954]: I1127 17:01:18.133513 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Nov 27 17:01:18 crc kubenswrapper[4954]: I1127 17:01:18.185075 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Nov 27 17:01:18 crc kubenswrapper[4954]: I1127 17:01:18.199354 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Nov 27 17:01:18 crc kubenswrapper[4954]: I1127 17:01:18.752426 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Nov 27 17:01:18 crc kubenswrapper[4954]: I1127 17:01:18.752471 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Nov 27 17:01:19 crc kubenswrapper[4954]: I1127 17:01:19.346861 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Nov 27 17:01:19 crc kubenswrapper[4954]: I1127 17:01:19.347458 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Nov 27 17:01:19 crc kubenswrapper[4954]: I1127 17:01:19.393908 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Nov 27 17:01:19 crc kubenswrapper[4954]: I1127 17:01:19.397863 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Nov 27 17:01:19 crc kubenswrapper[4954]: I1127 17:01:19.762789 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Nov 27 17:01:19 crc kubenswrapper[4954]: I1127 17:01:19.762844 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Nov 27 17:01:20 crc kubenswrapper[4954]: I1127 17:01:20.772841 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Nov 27 17:01:20 crc kubenswrapper[4954]: I1127 17:01:20.772992 4954 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Nov 27 17:01:20 crc kubenswrapper[4954]: I1127 17:01:20.869521 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Nov 27 17:01:21 crc kubenswrapper[4954]: I1127 17:01:21.782805 4954 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Nov 27 17:01:21 crc kubenswrapper[4954]: I1127 17:01:21.782839 4954 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Nov 27 17:01:21 crc kubenswrapper[4954]: I1127 17:01:21.932814 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Nov 27 17:01:22 crc kubenswrapper[4954]: I1127 17:01:22.165738 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Nov 27 17:01:23 crc kubenswrapper[4954]: I1127 17:01:23.687347 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 27 17:01:23 crc kubenswrapper[4954]: I1127 17:01:23.687914 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 27 17:01:23 crc kubenswrapper[4954]: E1127 17:01:23.940069 4954 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0db2964c_faef_4154_b502_1231f6762e37.slice\": RecentStats: unable to find data in memory cache]"
Nov 27 17:01:24 crc kubenswrapper[4954]: I1127 17:01:24.540237 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Nov 27 17:01:25 crc kubenswrapper[4954]: I1127 17:01:25.821555 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zsv66" event={"ID":"aef92b7c-4671-4ac0-9a4e-f76233eb4c8e","Type":"ContainerStarted","Data":"8d8e3a9508023c2041d5b56d00f6ecb39ac3bfe4529a587f6bd42d04185d2f05"}
Nov 27 17:01:25 crc kubenswrapper[4954]: I1127 17:01:25.825598 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84b84d83-c981-4800-b508-f856f5abd02b","Type":"ContainerStarted","Data":"6fc73ffa866ae248f9047aecc39f1937daf3fb9aed187a9fbe0a0105e4ebf762"}
Nov 27 17:01:25 crc kubenswrapper[4954]: I1127 17:01:25.851350 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-zsv66" podStartSLOduration=1.7447945009999999 podStartE2EDuration="11.851323378s" podCreationTimestamp="2025-11-27 17:01:14 +0000 UTC" firstStartedPulling="2025-11-27 17:01:15.420452853 +0000 UTC m=+1387.437893143" lastFinishedPulling="2025-11-27 17:01:25.52698171 +0000 UTC m=+1397.544422020" observedRunningTime="2025-11-27 17:01:25.837849102 +0000 UTC m=+1397.855289432" watchObservedRunningTime="2025-11-27 17:01:25.851323378 +0000 UTC m=+1397.868763678"
Nov 27 17:01:27 crc kubenswrapper[4954]: I1127 17:01:27.847197 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84b84d83-c981-4800-b508-f856f5abd02b","Type":"ContainerStarted","Data":"00390434b1ae9ea92847994662ab149f7120796038c73bd89091184c9a19bfd2"}
Nov 27 17:01:27 crc kubenswrapper[4954]: I1127 17:01:27.847810 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Nov 27 17:01:27 crc kubenswrapper[4954]: I1127 17:01:27.847401 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="84b84d83-c981-4800-b508-f856f5abd02b" containerName="proxy-httpd" containerID="cri-o://00390434b1ae9ea92847994662ab149f7120796038c73bd89091184c9a19bfd2" gracePeriod=30
Nov 27 17:01:27 crc kubenswrapper[4954]: I1127 17:01:27.847336 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="84b84d83-c981-4800-b508-f856f5abd02b" containerName="ceilometer-central-agent" containerID="cri-o://6d25f8ca7e86613d4319938e2d5ae3dca903f8fc0fa02e8c4838e1c850d48759" gracePeriod=30
Nov 27 17:01:27 crc kubenswrapper[4954]: I1127 17:01:27.847456 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="84b84d83-c981-4800-b508-f856f5abd02b" containerName="ceilometer-notification-agent" containerID="cri-o://bc4691d97f7d1704035cac745f8c7650452ac5e77af6b2749f937c1ba82dafab" gracePeriod=30
Nov 27 17:01:27 crc kubenswrapper[4954]: I1127 17:01:27.847448 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="84b84d83-c981-4800-b508-f856f5abd02b" containerName="sg-core" containerID="cri-o://6fc73ffa866ae248f9047aecc39f1937daf3fb9aed187a9fbe0a0105e4ebf762" gracePeriod=30
Nov 27 17:01:27 crc kubenswrapper[4954]: I1127 17:01:27.873946 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.286074295 podStartE2EDuration="13.873925961s" podCreationTimestamp="2025-11-27 17:01:14 +0000 UTC" firstStartedPulling="2025-11-27 17:01:15.632512348 +0000 UTC m=+1387.649952648" lastFinishedPulling="2025-11-27 17:01:27.220364014 +0000 UTC m=+1399.237804314" observedRunningTime="2025-11-27 17:01:27.872766173 +0000 UTC m=+1399.890206473" watchObservedRunningTime="2025-11-27 17:01:27.873925961 +0000 UTC m=+1399.891366261"
Nov 27 17:01:28 crc kubenswrapper[4954]: I1127 17:01:28.867915 4954 generic.go:334] "Generic (PLEG): container finished" podID="84b84d83-c981-4800-b508-f856f5abd02b" containerID="00390434b1ae9ea92847994662ab149f7120796038c73bd89091184c9a19bfd2" exitCode=0
Nov 27 17:01:28 crc kubenswrapper[4954]: I1127 17:01:28.867973 4954 generic.go:334] "Generic (PLEG): container finished" podID="84b84d83-c981-4800-b508-f856f5abd02b" containerID="6fc73ffa866ae248f9047aecc39f1937daf3fb9aed187a9fbe0a0105e4ebf762" exitCode=2
Nov 27 17:01:28 crc kubenswrapper[4954]: I1127 17:01:28.867994 4954 generic.go:334] "Generic (PLEG): container finished" podID="84b84d83-c981-4800-b508-f856f5abd02b" containerID="6d25f8ca7e86613d4319938e2d5ae3dca903f8fc0fa02e8c4838e1c850d48759" exitCode=0
Nov 27 17:01:28 crc kubenswrapper[4954]: I1127 17:01:28.868005 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84b84d83-c981-4800-b508-f856f5abd02b","Type":"ContainerDied","Data":"00390434b1ae9ea92847994662ab149f7120796038c73bd89091184c9a19bfd2"}
Nov 27 17:01:28 crc kubenswrapper[4954]: I1127 17:01:28.868059 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84b84d83-c981-4800-b508-f856f5abd02b","Type":"ContainerDied","Data":"6fc73ffa866ae248f9047aecc39f1937daf3fb9aed187a9fbe0a0105e4ebf762"}
Nov 27 17:01:28 crc kubenswrapper[4954]: I1127 17:01:28.868079 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84b84d83-c981-4800-b508-f856f5abd02b","Type":"ContainerDied","Data":"6d25f8ca7e86613d4319938e2d5ae3dca903f8fc0fa02e8c4838e1c850d48759"}
Nov 27 17:01:29 crc kubenswrapper[4954]: I1127 17:01:29.893354 4954 generic.go:334] "Generic (PLEG): container finished" podID="84b84d83-c981-4800-b508-f856f5abd02b" containerID="bc4691d97f7d1704035cac745f8c7650452ac5e77af6b2749f937c1ba82dafab" exitCode=0
Nov 27 17:01:29 crc kubenswrapper[4954]: I1127 17:01:29.893429 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84b84d83-c981-4800-b508-f856f5abd02b","Type":"ContainerDied","Data":"bc4691d97f7d1704035cac745f8c7650452ac5e77af6b2749f937c1ba82dafab"}
Nov 27 17:01:30 crc kubenswrapper[4954]: I1127 17:01:30.035515 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 27 17:01:30 crc kubenswrapper[4954]: I1127 17:01:30.180854 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84b84d83-c981-4800-b508-f856f5abd02b-run-httpd\") pod \"84b84d83-c981-4800-b508-f856f5abd02b\" (UID: \"84b84d83-c981-4800-b508-f856f5abd02b\") "
Nov 27 17:01:30 crc kubenswrapper[4954]: I1127 17:01:30.181013 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84b84d83-c981-4800-b508-f856f5abd02b-config-data\") pod \"84b84d83-c981-4800-b508-f856f5abd02b\" (UID: \"84b84d83-c981-4800-b508-f856f5abd02b\") "
Nov 27 17:01:30 crc kubenswrapper[4954]: I1127 17:01:30.181057 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84b84d83-c981-4800-b508-f856f5abd02b-log-httpd\") pod \"84b84d83-c981-4800-b508-f856f5abd02b\" (UID: \"84b84d83-c981-4800-b508-f856f5abd02b\") "
Nov 27 17:01:30 crc kubenswrapper[4954]: I1127 17:01:30.181115 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84b84d83-c981-4800-b508-f856f5abd02b-combined-ca-bundle\") pod \"84b84d83-c981-4800-b508-f856f5abd02b\" (UID: \"84b84d83-c981-4800-b508-f856f5abd02b\") "
Nov 27 17:01:30 crc kubenswrapper[4954]: I1127 17:01:30.181150 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w226j\" (UniqueName: \"kubernetes.io/projected/84b84d83-c981-4800-b508-f856f5abd02b-kube-api-access-w226j\") pod \"84b84d83-c981-4800-b508-f856f5abd02b\" (UID: \"84b84d83-c981-4800-b508-f856f5abd02b\") "
Nov 27 17:01:30 crc kubenswrapper[4954]: I1127 17:01:30.181235 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84b84d83-c981-4800-b508-f856f5abd02b-scripts\") pod \"84b84d83-c981-4800-b508-f856f5abd02b\" (UID: \"84b84d83-c981-4800-b508-f856f5abd02b\") "
Nov 27 17:01:30 crc kubenswrapper[4954]: I1127 17:01:30.181266 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/84b84d83-c981-4800-b508-f856f5abd02b-sg-core-conf-yaml\") pod \"84b84d83-c981-4800-b508-f856f5abd02b\" (UID: \"84b84d83-c981-4800-b508-f856f5abd02b\") "
Nov 27 17:01:30 crc kubenswrapper[4954]: I1127 17:01:30.181342 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84b84d83-c981-4800-b508-f856f5abd02b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "84b84d83-c981-4800-b508-f856f5abd02b" (UID: "84b84d83-c981-4800-b508-f856f5abd02b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 27 17:01:30 crc kubenswrapper[4954]: I1127 17:01:30.181727 4954 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84b84d83-c981-4800-b508-f856f5abd02b-run-httpd\") on node \"crc\" DevicePath \"\""
Nov 27 17:01:30 crc kubenswrapper[4954]: I1127 17:01:30.182492 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84b84d83-c981-4800-b508-f856f5abd02b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "84b84d83-c981-4800-b508-f856f5abd02b" (UID: "84b84d83-c981-4800-b508-f856f5abd02b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 27 17:01:30 crc kubenswrapper[4954]: I1127 17:01:30.186412 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84b84d83-c981-4800-b508-f856f5abd02b-kube-api-access-w226j" (OuterVolumeSpecName: "kube-api-access-w226j") pod "84b84d83-c981-4800-b508-f856f5abd02b" (UID: "84b84d83-c981-4800-b508-f856f5abd02b"). InnerVolumeSpecName "kube-api-access-w226j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 27 17:01:30 crc kubenswrapper[4954]: I1127 17:01:30.189389 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84b84d83-c981-4800-b508-f856f5abd02b-scripts" (OuterVolumeSpecName: "scripts") pod "84b84d83-c981-4800-b508-f856f5abd02b" (UID: "84b84d83-c981-4800-b508-f856f5abd02b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 17:01:30 crc kubenswrapper[4954]: I1127 17:01:30.218881 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84b84d83-c981-4800-b508-f856f5abd02b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "84b84d83-c981-4800-b508-f856f5abd02b" (UID: "84b84d83-c981-4800-b508-f856f5abd02b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 17:01:30 crc kubenswrapper[4954]: I1127 17:01:30.250783 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84b84d83-c981-4800-b508-f856f5abd02b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84b84d83-c981-4800-b508-f856f5abd02b" (UID: "84b84d83-c981-4800-b508-f856f5abd02b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 17:01:30 crc kubenswrapper[4954]: I1127 17:01:30.269946 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84b84d83-c981-4800-b508-f856f5abd02b-config-data" (OuterVolumeSpecName: "config-data") pod "84b84d83-c981-4800-b508-f856f5abd02b" (UID: "84b84d83-c981-4800-b508-f856f5abd02b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 17:01:30 crc kubenswrapper[4954]: I1127 17:01:30.283906 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84b84d83-c981-4800-b508-f856f5abd02b-config-data\") on node \"crc\" DevicePath \"\""
Nov 27 17:01:30 crc kubenswrapper[4954]: I1127 17:01:30.283942 4954 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84b84d83-c981-4800-b508-f856f5abd02b-log-httpd\") on node \"crc\" DevicePath \"\""
Nov 27 17:01:30 crc kubenswrapper[4954]: I1127 17:01:30.283953 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84b84d83-c981-4800-b508-f856f5abd02b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 27 17:01:30 crc kubenswrapper[4954]: I1127 17:01:30.283966 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w226j\" (UniqueName: \"kubernetes.io/projected/84b84d83-c981-4800-b508-f856f5abd02b-kube-api-access-w226j\") on node \"crc\" DevicePath \"\""
Nov 27 17:01:30 crc kubenswrapper[4954]: I1127 17:01:30.283975 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84b84d83-c981-4800-b508-f856f5abd02b-scripts\") on node \"crc\" DevicePath \"\""
Nov 27 17:01:30 crc kubenswrapper[4954]: I1127 17:01:30.283983 4954 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/84b84d83-c981-4800-b508-f856f5abd02b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Nov 27 17:01:30 crc kubenswrapper[4954]: I1127 17:01:30.906771 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84b84d83-c981-4800-b508-f856f5abd02b","Type":"ContainerDied","Data":"dc2fbe873a1ddfc3ae6776dc53ce5eb58f9087bf3d28a560b70877a6369b7130"}
Nov 27 17:01:30 crc kubenswrapper[4954]: I1127 17:01:30.906847 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 27 17:01:30 crc kubenswrapper[4954]: I1127 17:01:30.907557 4954 scope.go:117] "RemoveContainer" containerID="00390434b1ae9ea92847994662ab149f7120796038c73bd89091184c9a19bfd2"
Nov 27 17:01:30 crc kubenswrapper[4954]: I1127 17:01:30.950276 4954 scope.go:117] "RemoveContainer" containerID="6fc73ffa866ae248f9047aecc39f1937daf3fb9aed187a9fbe0a0105e4ebf762"
Nov 27 17:01:30 crc kubenswrapper[4954]: I1127 17:01:30.951483 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Nov 27 17:01:30 crc kubenswrapper[4954]: I1127 17:01:30.973000 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Nov 27 17:01:30 crc kubenswrapper[4954]: I1127 17:01:30.981719 4954 scope.go:117] "RemoveContainer" containerID="bc4691d97f7d1704035cac745f8c7650452ac5e77af6b2749f937c1ba82dafab"
Nov 27 17:01:30 crc kubenswrapper[4954]: I1127 17:01:30.983517 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Nov 27 17:01:30 crc kubenswrapper[4954]: E1127 17:01:30.983875 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84b84d83-c981-4800-b508-f856f5abd02b" containerName="ceilometer-notification-agent"
Nov 27 17:01:30 crc kubenswrapper[4954]: I1127 17:01:30.983887 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="84b84d83-c981-4800-b508-f856f5abd02b" containerName="ceilometer-notification-agent"
Nov 27 17:01:30 crc kubenswrapper[4954]: E1127 17:01:30.983922 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84b84d83-c981-4800-b508-f856f5abd02b" containerName="proxy-httpd"
Nov 27 17:01:30 crc kubenswrapper[4954]: I1127 17:01:30.983929 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="84b84d83-c981-4800-b508-f856f5abd02b" containerName="proxy-httpd"
Nov 27 17:01:30 crc kubenswrapper[4954]: E1127 17:01:30.983958 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84b84d83-c981-4800-b508-f856f5abd02b" containerName="ceilometer-central-agent"
Nov 27 17:01:30 crc kubenswrapper[4954]: I1127 17:01:30.983965 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="84b84d83-c981-4800-b508-f856f5abd02b" containerName="ceilometer-central-agent"
Nov 27 17:01:30 crc kubenswrapper[4954]: E1127 17:01:30.983981 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84b84d83-c981-4800-b508-f856f5abd02b" containerName="sg-core"
Nov 27 17:01:30 crc kubenswrapper[4954]: I1127 17:01:30.983988 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="84b84d83-c981-4800-b508-f856f5abd02b" containerName="sg-core"
Nov 27 17:01:30 crc kubenswrapper[4954]: I1127 17:01:30.984166 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="84b84d83-c981-4800-b508-f856f5abd02b" containerName="sg-core"
Nov 27 17:01:30 crc kubenswrapper[4954]: I1127 17:01:30.984175 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="84b84d83-c981-4800-b508-f856f5abd02b" containerName="ceilometer-central-agent"
Nov 27 17:01:30 crc kubenswrapper[4954]: I1127 17:01:30.984184 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="84b84d83-c981-4800-b508-f856f5abd02b" containerName="proxy-httpd"
Nov 27 17:01:30 crc kubenswrapper[4954]: I1127 17:01:30.984203 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="84b84d83-c981-4800-b508-f856f5abd02b" containerName="ceilometer-notification-agent"
Nov 27 17:01:30 crc kubenswrapper[4954]: I1127 17:01:30.985770 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 27 17:01:30 crc kubenswrapper[4954]: I1127 17:01:30.988554 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Nov 27 17:01:30 crc kubenswrapper[4954]: I1127 17:01:30.988924 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Nov 27 17:01:31 crc kubenswrapper[4954]: I1127 17:01:31.013743 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Nov 27 17:01:31 crc kubenswrapper[4954]: I1127 17:01:31.028116 4954 scope.go:117] "RemoveContainer" containerID="6d25f8ca7e86613d4319938e2d5ae3dca903f8fc0fa02e8c4838e1c850d48759"
Nov 27 17:01:31 crc kubenswrapper[4954]: I1127 17:01:31.100849 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ad07ddc6-4615-4a60-a765-fc2313fb5d0b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ad07ddc6-4615-4a60-a765-fc2313fb5d0b\") " pod="openstack/ceilometer-0"
Nov 27 17:01:31 crc kubenswrapper[4954]: I1127 17:01:31.101685 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7vbg\" (UniqueName: \"kubernetes.io/projected/ad07ddc6-4615-4a60-a765-fc2313fb5d0b-kube-api-access-n7vbg\") pod \"ceilometer-0\" (UID: \"ad07ddc6-4615-4a60-a765-fc2313fb5d0b\") " pod="openstack/ceilometer-0"
Nov 27 17:01:31 crc kubenswrapper[4954]: I1127 17:01:31.101820 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad07ddc6-4615-4a60-a765-fc2313fb5d0b-scripts\") pod \"ceilometer-0\" (UID: \"ad07ddc6-4615-4a60-a765-fc2313fb5d0b\") " pod="openstack/ceilometer-0"
Nov 27 17:01:31 crc kubenswrapper[4954]: I1127 17:01:31.101911 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad07ddc6-4615-4a60-a765-fc2313fb5d0b-config-data\") pod \"ceilometer-0\" (UID: \"ad07ddc6-4615-4a60-a765-fc2313fb5d0b\") " pod="openstack/ceilometer-0"
Nov 27 17:01:31 crc kubenswrapper[4954]: I1127 17:01:31.102015 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad07ddc6-4615-4a60-a765-fc2313fb5d0b-run-httpd\") pod \"ceilometer-0\" (UID: \"ad07ddc6-4615-4a60-a765-fc2313fb5d0b\") " pod="openstack/ceilometer-0"
Nov 27 17:01:31 crc kubenswrapper[4954]: I1127 17:01:31.102157 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad07ddc6-4615-4a60-a765-fc2313fb5d0b-log-httpd\") pod \"ceilometer-0\" (UID: \"ad07ddc6-4615-4a60-a765-fc2313fb5d0b\") " pod="openstack/ceilometer-0"
Nov 27 17:01:31 crc kubenswrapper[4954]: I1127 17:01:31.102262 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad07ddc6-4615-4a60-a765-fc2313fb5d0b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ad07ddc6-4615-4a60-a765-fc2313fb5d0b\") " pod="openstack/ceilometer-0"
Nov 27 17:01:31 crc kubenswrapper[4954]: I1127 17:01:31.204613 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad07ddc6-4615-4a60-a765-fc2313fb5d0b-log-httpd\") pod \"ceilometer-0\" (UID: \"ad07ddc6-4615-4a60-a765-fc2313fb5d0b\") " pod="openstack/ceilometer-0"
Nov 27 17:01:31 crc kubenswrapper[4954]: I1127 17:01:31.204736 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad07ddc6-4615-4a60-a765-fc2313fb5d0b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ad07ddc6-4615-4a60-a765-fc2313fb5d0b\") " pod="openstack/ceilometer-0"
Nov 27 17:01:31 crc kubenswrapper[4954]: I1127 17:01:31.204823 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ad07ddc6-4615-4a60-a765-fc2313fb5d0b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ad07ddc6-4615-4a60-a765-fc2313fb5d0b\") " pod="openstack/ceilometer-0"
Nov 27 17:01:31 crc kubenswrapper[4954]: I1127 17:01:31.204893 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7vbg\" (UniqueName: \"kubernetes.io/projected/ad07ddc6-4615-4a60-a765-fc2313fb5d0b-kube-api-access-n7vbg\") pod \"ceilometer-0\" (UID: \"ad07ddc6-4615-4a60-a765-fc2313fb5d0b\") " pod="openstack/ceilometer-0"
Nov 27 17:01:31 crc kubenswrapper[4954]: I1127 17:01:31.204948 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad07ddc6-4615-4a60-a765-fc2313fb5d0b-scripts\") pod \"ceilometer-0\" (UID: \"ad07ddc6-4615-4a60-a765-fc2313fb5d0b\") " pod="openstack/ceilometer-0"
Nov 27 17:01:31 crc kubenswrapper[4954]: I1127 17:01:31.204999 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad07ddc6-4615-4a60-a765-fc2313fb5d0b-config-data\") pod \"ceilometer-0\" (UID: \"ad07ddc6-4615-4a60-a765-fc2313fb5d0b\") " pod="openstack/ceilometer-0"
Nov 27 17:01:31 crc kubenswrapper[4954]: I1127 17:01:31.205051 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad07ddc6-4615-4a60-a765-fc2313fb5d0b-run-httpd\") pod \"ceilometer-0\" (UID: \"ad07ddc6-4615-4a60-a765-fc2313fb5d0b\") " pod="openstack/ceilometer-0"
Nov 27 17:01:31 crc kubenswrapper[4954]: I1127 17:01:31.205933 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad07ddc6-4615-4a60-a765-fc2313fb5d0b-run-httpd\") pod \"ceilometer-0\" (UID: \"ad07ddc6-4615-4a60-a765-fc2313fb5d0b\") " pod="openstack/ceilometer-0"
Nov 27 17:01:31 crc kubenswrapper[4954]: I1127 17:01:31.206670 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad07ddc6-4615-4a60-a765-fc2313fb5d0b-log-httpd\") pod \"ceilometer-0\" (UID: \"ad07ddc6-4615-4a60-a765-fc2313fb5d0b\") " pod="openstack/ceilometer-0"
Nov 27 17:01:31 crc kubenswrapper[4954]: I1127 17:01:31.210325 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ad07ddc6-4615-4a60-a765-fc2313fb5d0b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ad07ddc6-4615-4a60-a765-fc2313fb5d0b\") " pod="openstack/ceilometer-0"
Nov 27 17:01:31 crc kubenswrapper[4954]: I1127 17:01:31.211342 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName:
\"kubernetes.io/secret/ad07ddc6-4615-4a60-a765-fc2313fb5d0b-scripts\") pod \"ceilometer-0\" (UID: \"ad07ddc6-4615-4a60-a765-fc2313fb5d0b\") " pod="openstack/ceilometer-0" Nov 27 17:01:31 crc kubenswrapper[4954]: I1127 17:01:31.212650 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad07ddc6-4615-4a60-a765-fc2313fb5d0b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ad07ddc6-4615-4a60-a765-fc2313fb5d0b\") " pod="openstack/ceilometer-0" Nov 27 17:01:31 crc kubenswrapper[4954]: I1127 17:01:31.214932 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad07ddc6-4615-4a60-a765-fc2313fb5d0b-config-data\") pod \"ceilometer-0\" (UID: \"ad07ddc6-4615-4a60-a765-fc2313fb5d0b\") " pod="openstack/ceilometer-0" Nov 27 17:01:31 crc kubenswrapper[4954]: I1127 17:01:31.224139 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7vbg\" (UniqueName: \"kubernetes.io/projected/ad07ddc6-4615-4a60-a765-fc2313fb5d0b-kube-api-access-n7vbg\") pod \"ceilometer-0\" (UID: \"ad07ddc6-4615-4a60-a765-fc2313fb5d0b\") " pod="openstack/ceilometer-0" Nov 27 17:01:31 crc kubenswrapper[4954]: I1127 17:01:31.314604 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 27 17:01:31 crc kubenswrapper[4954]: I1127 17:01:31.846836 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 27 17:01:31 crc kubenswrapper[4954]: I1127 17:01:31.925907 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad07ddc6-4615-4a60-a765-fc2313fb5d0b","Type":"ContainerStarted","Data":"4d9b44a2767dc779324c4dc42262ebad032b9d13d3e6347127c86508aa41cf55"} Nov 27 17:01:32 crc kubenswrapper[4954]: I1127 17:01:32.674092 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84b84d83-c981-4800-b508-f856f5abd02b" path="/var/lib/kubelet/pods/84b84d83-c981-4800-b508-f856f5abd02b/volumes" Nov 27 17:01:33 crc kubenswrapper[4954]: I1127 17:01:33.952807 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad07ddc6-4615-4a60-a765-fc2313fb5d0b","Type":"ContainerStarted","Data":"dea43d3fc6e5b98beb77744cf290e06b1a578674499fd5e7457b8b341711d050"} Nov 27 17:01:33 crc kubenswrapper[4954]: I1127 17:01:33.953500 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad07ddc6-4615-4a60-a765-fc2313fb5d0b","Type":"ContainerStarted","Data":"be47cea0f1aca23af8d7fbe5530f58883bdd7b2ab14e08fcfff0a56ff9d0777d"} Nov 27 17:01:34 crc kubenswrapper[4954]: E1127 17:01:34.206257 4954 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0db2964c_faef_4154_b502_1231f6762e37.slice\": RecentStats: unable to find data in memory cache]" Nov 27 17:01:34 crc kubenswrapper[4954]: I1127 17:01:34.965550 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad07ddc6-4615-4a60-a765-fc2313fb5d0b","Type":"ContainerStarted","Data":"1ab53fbb8e6c29dae625ebf7ba14e5864e381682f941b20ef2c0711095b0c431"} Nov 27 17:01:38 crc kubenswrapper[4954]: I1127 17:01:38.005995 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ad07ddc6-4615-4a60-a765-fc2313fb5d0b","Type":"ContainerStarted","Data":"8836e5bedf8b1797dd44f63e5e33cea2d080d221a62a3a1fc13dcbaea04eb114"} Nov 27 17:01:38 crc kubenswrapper[4954]: I1127 17:01:38.007311 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 27 17:01:38 crc kubenswrapper[4954]: I1127 17:01:38.012818 4954 generic.go:334] "Generic (PLEG): container finished" podID="aef92b7c-4671-4ac0-9a4e-f76233eb4c8e" containerID="8d8e3a9508023c2041d5b56d00f6ecb39ac3bfe4529a587f6bd42d04185d2f05" exitCode=0 Nov 27 17:01:38 crc kubenswrapper[4954]: I1127 17:01:38.013019 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zsv66" event={"ID":"aef92b7c-4671-4ac0-9a4e-f76233eb4c8e","Type":"ContainerDied","Data":"8d8e3a9508023c2041d5b56d00f6ecb39ac3bfe4529a587f6bd42d04185d2f05"} Nov 27 17:01:38 crc kubenswrapper[4954]: I1127 17:01:38.059347 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.665245758 podStartE2EDuration="8.059325791s" podCreationTimestamp="2025-11-27 17:01:30 +0000 UTC" firstStartedPulling="2025-11-27 17:01:31.853531995 +0000 UTC m=+1403.870972295" lastFinishedPulling="2025-11-27 17:01:37.247612028 +0000 UTC m=+1409.265052328" observedRunningTime="2025-11-27 17:01:38.040823362 +0000 UTC m=+1410.058263662" watchObservedRunningTime="2025-11-27 17:01:38.059325791 +0000 UTC m=+1410.076766091" Nov 27 17:01:39 crc kubenswrapper[4954]: I1127 17:01:39.438713 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zsv66" Nov 27 17:01:39 crc kubenswrapper[4954]: I1127 17:01:39.480177 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aef92b7c-4671-4ac0-9a4e-f76233eb4c8e-combined-ca-bundle\") pod \"aef92b7c-4671-4ac0-9a4e-f76233eb4c8e\" (UID: \"aef92b7c-4671-4ac0-9a4e-f76233eb4c8e\") " Nov 27 17:01:39 crc kubenswrapper[4954]: I1127 17:01:39.480269 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4w4gq\" (UniqueName: \"kubernetes.io/projected/aef92b7c-4671-4ac0-9a4e-f76233eb4c8e-kube-api-access-4w4gq\") pod \"aef92b7c-4671-4ac0-9a4e-f76233eb4c8e\" (UID: \"aef92b7c-4671-4ac0-9a4e-f76233eb4c8e\") " Nov 27 17:01:39 crc kubenswrapper[4954]: I1127 17:01:39.480373 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aef92b7c-4671-4ac0-9a4e-f76233eb4c8e-config-data\") pod \"aef92b7c-4671-4ac0-9a4e-f76233eb4c8e\" (UID: \"aef92b7c-4671-4ac0-9a4e-f76233eb4c8e\") " Nov 27 17:01:39 crc kubenswrapper[4954]: I1127 17:01:39.480447 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aef92b7c-4671-4ac0-9a4e-f76233eb4c8e-scripts\") pod \"aef92b7c-4671-4ac0-9a4e-f76233eb4c8e\" (UID: \"aef92b7c-4671-4ac0-9a4e-f76233eb4c8e\") " Nov 27 17:01:39 crc kubenswrapper[4954]: I1127 17:01:39.487628 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aef92b7c-4671-4ac0-9a4e-f76233eb4c8e-kube-api-access-4w4gq" (OuterVolumeSpecName: "kube-api-access-4w4gq") pod "aef92b7c-4671-4ac0-9a4e-f76233eb4c8e" (UID: "aef92b7c-4671-4ac0-9a4e-f76233eb4c8e"). InnerVolumeSpecName "kube-api-access-4w4gq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:01:39 crc kubenswrapper[4954]: I1127 17:01:39.489489 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aef92b7c-4671-4ac0-9a4e-f76233eb4c8e-scripts" (OuterVolumeSpecName: "scripts") pod "aef92b7c-4671-4ac0-9a4e-f76233eb4c8e" (UID: "aef92b7c-4671-4ac0-9a4e-f76233eb4c8e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:01:39 crc kubenswrapper[4954]: I1127 17:01:39.518483 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aef92b7c-4671-4ac0-9a4e-f76233eb4c8e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aef92b7c-4671-4ac0-9a4e-f76233eb4c8e" (UID: "aef92b7c-4671-4ac0-9a4e-f76233eb4c8e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:01:39 crc kubenswrapper[4954]: I1127 17:01:39.535478 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aef92b7c-4671-4ac0-9a4e-f76233eb4c8e-config-data" (OuterVolumeSpecName: "config-data") pod "aef92b7c-4671-4ac0-9a4e-f76233eb4c8e" (UID: "aef92b7c-4671-4ac0-9a4e-f76233eb4c8e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:01:39 crc kubenswrapper[4954]: I1127 17:01:39.583067 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aef92b7c-4671-4ac0-9a4e-f76233eb4c8e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:39 crc kubenswrapper[4954]: I1127 17:01:39.583100 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4w4gq\" (UniqueName: \"kubernetes.io/projected/aef92b7c-4671-4ac0-9a4e-f76233eb4c8e-kube-api-access-4w4gq\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:39 crc kubenswrapper[4954]: I1127 17:01:39.583113 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aef92b7c-4671-4ac0-9a4e-f76233eb4c8e-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:39 crc kubenswrapper[4954]: I1127 17:01:39.583121 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aef92b7c-4671-4ac0-9a4e-f76233eb4c8e-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:40 crc kubenswrapper[4954]: I1127 17:01:40.031527 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zsv66" event={"ID":"aef92b7c-4671-4ac0-9a4e-f76233eb4c8e","Type":"ContainerDied","Data":"a33e54f05f3c2e0b4126d6c7fd3cc78b06b602876627a0f69cfb5b28bb5e6134"} Nov 27 17:01:40 crc kubenswrapper[4954]: I1127 17:01:40.032041 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a33e54f05f3c2e0b4126d6c7fd3cc78b06b602876627a0f69cfb5b28bb5e6134" Nov 27 17:01:40 crc kubenswrapper[4954]: I1127 17:01:40.031607 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zsv66" Nov 27 17:01:40 crc kubenswrapper[4954]: I1127 17:01:40.196919 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 27 17:01:40 crc kubenswrapper[4954]: E1127 17:01:40.197745 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aef92b7c-4671-4ac0-9a4e-f76233eb4c8e" containerName="nova-cell0-conductor-db-sync" Nov 27 17:01:40 crc kubenswrapper[4954]: I1127 17:01:40.197774 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="aef92b7c-4671-4ac0-9a4e-f76233eb4c8e" containerName="nova-cell0-conductor-db-sync" Nov 27 17:01:40 crc kubenswrapper[4954]: I1127 17:01:40.198056 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="aef92b7c-4671-4ac0-9a4e-f76233eb4c8e" containerName="nova-cell0-conductor-db-sync" Nov 27 17:01:40 crc kubenswrapper[4954]: I1127 17:01:40.199208 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 27 17:01:40 crc kubenswrapper[4954]: I1127 17:01:40.203741 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-t2xkq" Nov 27 17:01:40 crc kubenswrapper[4954]: I1127 17:01:40.203867 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 27 17:01:40 crc kubenswrapper[4954]: I1127 17:01:40.228716 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 27 17:01:40 crc kubenswrapper[4954]: I1127 17:01:40.297624 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b181e3d6-4f0e-40f1-ac14-96bcbb17622a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b181e3d6-4f0e-40f1-ac14-96bcbb17622a\") " pod="openstack/nova-cell0-conductor-0" Nov 27 17:01:40 crc kubenswrapper[4954]: I1127 17:01:40.297745 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr5sv\" (UniqueName: \"kubernetes.io/projected/b181e3d6-4f0e-40f1-ac14-96bcbb17622a-kube-api-access-qr5sv\") pod \"nova-cell0-conductor-0\" (UID: \"b181e3d6-4f0e-40f1-ac14-96bcbb17622a\") " pod="openstack/nova-cell0-conductor-0" Nov 27 17:01:40 crc kubenswrapper[4954]: I1127 17:01:40.297779 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b181e3d6-4f0e-40f1-ac14-96bcbb17622a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b181e3d6-4f0e-40f1-ac14-96bcbb17622a\") " pod="openstack/nova-cell0-conductor-0" Nov 27 17:01:40 crc kubenswrapper[4954]: I1127 17:01:40.398996 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b181e3d6-4f0e-40f1-ac14-96bcbb17622a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b181e3d6-4f0e-40f1-ac14-96bcbb17622a\") " pod="openstack/nova-cell0-conductor-0" Nov 27 17:01:40 crc kubenswrapper[4954]: I1127 17:01:40.399130 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qr5sv\" (UniqueName: \"kubernetes.io/projected/b181e3d6-4f0e-40f1-ac14-96bcbb17622a-kube-api-access-qr5sv\") pod \"nova-cell0-conductor-0\" (UID: \"b181e3d6-4f0e-40f1-ac14-96bcbb17622a\") " pod="openstack/nova-cell0-conductor-0" Nov 27 17:01:40 crc kubenswrapper[4954]: 
I1127 17:01:40.399155 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b181e3d6-4f0e-40f1-ac14-96bcbb17622a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b181e3d6-4f0e-40f1-ac14-96bcbb17622a\") " pod="openstack/nova-cell0-conductor-0" Nov 27 17:01:40 crc kubenswrapper[4954]: I1127 17:01:40.405000 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b181e3d6-4f0e-40f1-ac14-96bcbb17622a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b181e3d6-4f0e-40f1-ac14-96bcbb17622a\") " pod="openstack/nova-cell0-conductor-0" Nov 27 17:01:40 crc kubenswrapper[4954]: I1127 17:01:40.406413 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b181e3d6-4f0e-40f1-ac14-96bcbb17622a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b181e3d6-4f0e-40f1-ac14-96bcbb17622a\") " pod="openstack/nova-cell0-conductor-0" Nov 27 17:01:40 crc kubenswrapper[4954]: I1127 17:01:40.419166 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr5sv\" (UniqueName: \"kubernetes.io/projected/b181e3d6-4f0e-40f1-ac14-96bcbb17622a-kube-api-access-qr5sv\") pod \"nova-cell0-conductor-0\" (UID: \"b181e3d6-4f0e-40f1-ac14-96bcbb17622a\") " pod="openstack/nova-cell0-conductor-0" Nov 27 17:01:40 crc kubenswrapper[4954]: I1127 17:01:40.529907 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 27 17:01:41 crc kubenswrapper[4954]: I1127 17:01:41.047953 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 27 17:01:42 crc kubenswrapper[4954]: I1127 17:01:42.051706 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b181e3d6-4f0e-40f1-ac14-96bcbb17622a","Type":"ContainerStarted","Data":"24795857a69e9c4246224cfb7b56ba28f75848ea4e93f75f37339f377e2473d3"} Nov 27 17:01:42 crc kubenswrapper[4954]: I1127 17:01:42.052056 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b181e3d6-4f0e-40f1-ac14-96bcbb17622a","Type":"ContainerStarted","Data":"5d002b2299d77031ee14b16d81a6fea04097ca500fb48261c69fef6c46a5f1f0"} Nov 27 17:01:42 crc kubenswrapper[4954]: I1127 17:01:42.052073 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Nov 27 17:01:42 crc kubenswrapper[4954]: I1127 17:01:42.090074 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.090045225 podStartE2EDuration="2.090045225s" podCreationTimestamp="2025-11-27 17:01:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 17:01:42.07171318 +0000 UTC m=+1414.089153520" watchObservedRunningTime="2025-11-27 17:01:42.090045225 +0000 UTC m=+1414.107485555" Nov 27 17:01:44 crc kubenswrapper[4954]: E1127 17:01:44.441390 4954 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0db2964c_faef_4154_b502_1231f6762e37.slice\": RecentStats: unable to find data in memory cache]" Nov 27 17:01:50 crc kubenswrapper[4954]: I1127 17:01:50.572745 4954 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.037328 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-tw6zj"] Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.038860 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-tw6zj" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.042784 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.042948 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.058478 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-tw6zj"] Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.150713 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec7940bb-124f-4c0f-b9fd-471a32e4c3ef-config-data\") pod \"nova-cell0-cell-mapping-tw6zj\" (UID: \"ec7940bb-124f-4c0f-b9fd-471a32e4c3ef\") " pod="openstack/nova-cell0-cell-mapping-tw6zj" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.150750 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec7940bb-124f-4c0f-b9fd-471a32e4c3ef-scripts\") pod \"nova-cell0-cell-mapping-tw6zj\" (UID: \"ec7940bb-124f-4c0f-b9fd-471a32e4c3ef\") " pod="openstack/nova-cell0-cell-mapping-tw6zj" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.150835 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrg6l\" (UniqueName: \"kubernetes.io/projected/ec7940bb-124f-4c0f-b9fd-471a32e4c3ef-kube-api-access-wrg6l\") pod \"nova-cell0-cell-mapping-tw6zj\" (UID: \"ec7940bb-124f-4c0f-b9fd-471a32e4c3ef\") " pod="openstack/nova-cell0-cell-mapping-tw6zj" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.150888 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec7940bb-124f-4c0f-b9fd-471a32e4c3ef-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-tw6zj\" (UID: \"ec7940bb-124f-4c0f-b9fd-471a32e4c3ef\") " pod="openstack/nova-cell0-cell-mapping-tw6zj" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.224734 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.226327 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.228932 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.251097 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.252051 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec7940bb-124f-4c0f-b9fd-471a32e4c3ef-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-tw6zj\" (UID: \"ec7940bb-124f-4c0f-b9fd-471a32e4c3ef\") " pod="openstack/nova-cell0-cell-mapping-tw6zj" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.252121 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec7940bb-124f-4c0f-b9fd-471a32e4c3ef-config-data\") pod \"nova-cell0-cell-mapping-tw6zj\" (UID: \"ec7940bb-124f-4c0f-b9fd-471a32e4c3ef\") " pod="openstack/nova-cell0-cell-mapping-tw6zj" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.252143 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec7940bb-124f-4c0f-b9fd-471a32e4c3ef-scripts\") pod \"nova-cell0-cell-mapping-tw6zj\" (UID: \"ec7940bb-124f-4c0f-b9fd-471a32e4c3ef\") " pod="openstack/nova-cell0-cell-mapping-tw6zj" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.252213 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrg6l\" (UniqueName: \"kubernetes.io/projected/ec7940bb-124f-4c0f-b9fd-471a32e4c3ef-kube-api-access-wrg6l\") pod \"nova-cell0-cell-mapping-tw6zj\" (UID: \"ec7940bb-124f-4c0f-b9fd-471a32e4c3ef\") " pod="openstack/nova-cell0-cell-mapping-tw6zj" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.259362 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec7940bb-124f-4c0f-b9fd-471a32e4c3ef-config-data\") pod \"nova-cell0-cell-mapping-tw6zj\" (UID: \"ec7940bb-124f-4c0f-b9fd-471a32e4c3ef\") " pod="openstack/nova-cell0-cell-mapping-tw6zj" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.260008 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec7940bb-124f-4c0f-b9fd-471a32e4c3ef-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-tw6zj\" (UID: \"ec7940bb-124f-4c0f-b9fd-471a32e4c3ef\") " pod="openstack/nova-cell0-cell-mapping-tw6zj" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.284162 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec7940bb-124f-4c0f-b9fd-471a32e4c3ef-scripts\") pod \"nova-cell0-cell-mapping-tw6zj\" (UID: \"ec7940bb-124f-4c0f-b9fd-471a32e4c3ef\") " pod="openstack/nova-cell0-cell-mapping-tw6zj" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.288694 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrg6l\" (UniqueName: \"kubernetes.io/projected/ec7940bb-124f-4c0f-b9fd-471a32e4c3ef-kube-api-access-wrg6l\") pod \"nova-cell0-cell-mapping-tw6zj\" (UID: \"ec7940bb-124f-4c0f-b9fd-471a32e4c3ef\") " pod="openstack/nova-cell0-cell-mapping-tw6zj" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.337659 4954 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-metadata-0"] Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.339274 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.344388 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.354720 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzwtf\" (UniqueName: \"kubernetes.io/projected/1817b937-0d8c-4409-b368-bbeb9482446a-kube-api-access-qzwtf\") pod \"nova-api-0\" (UID: \"1817b937-0d8c-4409-b368-bbeb9482446a\") " pod="openstack/nova-api-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.354774 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1817b937-0d8c-4409-b368-bbeb9482446a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1817b937-0d8c-4409-b368-bbeb9482446a\") " pod="openstack/nova-api-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.354803 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1817b937-0d8c-4409-b368-bbeb9482446a-config-data\") pod \"nova-api-0\" (UID: \"1817b937-0d8c-4409-b368-bbeb9482446a\") " pod="openstack/nova-api-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.354843 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1817b937-0d8c-4409-b368-bbeb9482446a-logs\") pod \"nova-api-0\" (UID: \"1817b937-0d8c-4409-b368-bbeb9482446a\") " pod="openstack/nova-api-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.356617 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.365196 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-tw6zj" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.464282 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74990131-c517-4f20-ba14-3b31d7adfe60-config-data\") pod \"nova-metadata-0\" (UID: \"74990131-c517-4f20-ba14-3b31d7adfe60\") " pod="openstack/nova-metadata-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.465198 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74990131-c517-4f20-ba14-3b31d7adfe60-logs\") pod \"nova-metadata-0\" (UID: \"74990131-c517-4f20-ba14-3b31d7adfe60\") " pod="openstack/nova-metadata-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.465467 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzwtf\" (UniqueName: \"kubernetes.io/projected/1817b937-0d8c-4409-b368-bbeb9482446a-kube-api-access-qzwtf\") pod \"nova-api-0\" (UID: \"1817b937-0d8c-4409-b368-bbeb9482446a\") " pod="openstack/nova-api-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.465561 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1817b937-0d8c-4409-b368-bbeb9482446a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1817b937-0d8c-4409-b368-bbeb9482446a\") " pod="openstack/nova-api-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.465668 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1817b937-0d8c-4409-b368-bbeb9482446a-config-data\") pod \"nova-api-0\" (UID: \"1817b937-0d8c-4409-b368-bbeb9482446a\") " pod="openstack/nova-api-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.465766 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9pcf\" (UniqueName: \"kubernetes.io/projected/74990131-c517-4f20-ba14-3b31d7adfe60-kube-api-access-v9pcf\") pod \"nova-metadata-0\" (UID: \"74990131-c517-4f20-ba14-3b31d7adfe60\") " pod="openstack/nova-metadata-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.465845 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1817b937-0d8c-4409-b368-bbeb9482446a-logs\") pod \"nova-api-0\" (UID: \"1817b937-0d8c-4409-b368-bbeb9482446a\") " pod="openstack/nova-api-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.465923 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74990131-c517-4f20-ba14-3b31d7adfe60-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"74990131-c517-4f20-ba14-3b31d7adfe60\") " pod="openstack/nova-metadata-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.466409 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1817b937-0d8c-4409-b368-bbeb9482446a-logs\") pod \"nova-api-0\" (UID: \"1817b937-0d8c-4409-b368-bbeb9482446a\") " pod="openstack/nova-api-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.472953 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1817b937-0d8c-4409-b368-bbeb9482446a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1817b937-0d8c-4409-b368-bbeb9482446a\") " pod="openstack/nova-api-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.476505 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1817b937-0d8c-4409-b368-bbeb9482446a-config-data\") pod \"nova-api-0\" (UID: \"1817b937-0d8c-4409-b368-bbeb9482446a\") " pod="openstack/nova-api-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.500971 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzwtf\" (UniqueName: \"kubernetes.io/projected/1817b937-0d8c-4409-b368-bbeb9482446a-kube-api-access-qzwtf\") pod \"nova-api-0\" (UID: \"1817b937-0d8c-4409-b368-bbeb9482446a\") " pod="openstack/nova-api-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.501041 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-jh528"] Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.502528 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-jh528" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.513656 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.515211 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.519650 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.529640 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.531203 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.537531 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.549568 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.557784 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-jh528"] Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.569554 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9pcf\" (UniqueName: \"kubernetes.io/projected/74990131-c517-4f20-ba14-3b31d7adfe60-kube-api-access-v9pcf\") pod \"nova-metadata-0\" (UID: \"74990131-c517-4f20-ba14-3b31d7adfe60\") " pod="openstack/nova-metadata-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.569623 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74990131-c517-4f20-ba14-3b31d7adfe60-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"74990131-c517-4f20-ba14-3b31d7adfe60\") " pod="openstack/nova-metadata-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.570882 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74990131-c517-4f20-ba14-3b31d7adfe60-config-data\") pod \"nova-metadata-0\" (UID: \"74990131-c517-4f20-ba14-3b31d7adfe60\") " pod="openstack/nova-metadata-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.570959 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74990131-c517-4f20-ba14-3b31d7adfe60-logs\") pod \"nova-metadata-0\" (UID: \"74990131-c517-4f20-ba14-3b31d7adfe60\") " pod="openstack/nova-metadata-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.571377 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.571442 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74990131-c517-4f20-ba14-3b31d7adfe60-logs\") pod \"nova-metadata-0\" (UID: \"74990131-c517-4f20-ba14-3b31d7adfe60\") " pod="openstack/nova-metadata-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.576227 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74990131-c517-4f20-ba14-3b31d7adfe60-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"74990131-c517-4f20-ba14-3b31d7adfe60\") " pod="openstack/nova-metadata-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.590432 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74990131-c517-4f20-ba14-3b31d7adfe60-config-data\") pod \"nova-metadata-0\" (UID: \"74990131-c517-4f20-ba14-3b31d7adfe60\") " pod="openstack/nova-metadata-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.618983 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9pcf\" (UniqueName: \"kubernetes.io/projected/74990131-c517-4f20-ba14-3b31d7adfe60-kube-api-access-v9pcf\") pod \"nova-metadata-0\" (UID: \"74990131-c517-4f20-ba14-3b31d7adfe60\") " pod="openstack/nova-metadata-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.623322 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.672031 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33b5ea73-fb76-4d0e-875c-c3c124364550-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-jh528\" (UID: \"33b5ea73-fb76-4d0e-875c-c3c124364550\") " pod="openstack/dnsmasq-dns-865f5d856f-jh528" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.672084 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bbm9\" (UniqueName: \"kubernetes.io/projected/33b5ea73-fb76-4d0e-875c-c3c124364550-kube-api-access-4bbm9\") pod \"dnsmasq-dns-865f5d856f-jh528\" (UID: \"33b5ea73-fb76-4d0e-875c-c3c124364550\") " pod="openstack/dnsmasq-dns-865f5d856f-jh528" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.672117 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9081a403-c3ea-4613-a218-ab1ac1f1ed42-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9081a403-c3ea-4613-a218-ab1ac1f1ed42\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.672151 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33b5ea73-fb76-4d0e-875c-c3c124364550-dns-svc\") pod \"dnsmasq-dns-865f5d856f-jh528\" (UID: \"33b5ea73-fb76-4d0e-875c-c3c124364550\") " pod="openstack/dnsmasq-dns-865f5d856f-jh528" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.672175 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33b5ea73-fb76-4d0e-875c-c3c124364550-config\") pod \"dnsmasq-dns-865f5d856f-jh528\" (UID: \"33b5ea73-fb76-4d0e-875c-c3c124364550\") " pod="openstack/dnsmasq-dns-865f5d856f-jh528" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.672196 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rfj8\" (UniqueName: \"kubernetes.io/projected/442636ac-c001-4f18-8a37-d09c9b6e0dfe-kube-api-access-6rfj8\") pod \"nova-scheduler-0\" (UID: \"442636ac-c001-4f18-8a37-d09c9b6e0dfe\") " pod="openstack/nova-scheduler-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.672221 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33b5ea73-fb76-4d0e-875c-c3c124364550-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-jh528\" (UID: \"33b5ea73-fb76-4d0e-875c-c3c124364550\") " pod="openstack/dnsmasq-dns-865f5d856f-jh528" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.672269 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33b5ea73-fb76-4d0e-875c-c3c124364550-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-jh528\" (UID: \"33b5ea73-fb76-4d0e-875c-c3c124364550\") " pod="openstack/dnsmasq-dns-865f5d856f-jh528" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.672301 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9081a403-c3ea-4613-a218-ab1ac1f1ed42-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9081a403-c3ea-4613-a218-ab1ac1f1ed42\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.672334 4954 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/442636ac-c001-4f18-8a37-d09c9b6e0dfe-config-data\") pod \"nova-scheduler-0\" (UID: \"442636ac-c001-4f18-8a37-d09c9b6e0dfe\") " pod="openstack/nova-scheduler-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.672726 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8gtn\" (UniqueName: \"kubernetes.io/projected/9081a403-c3ea-4613-a218-ab1ac1f1ed42-kube-api-access-x8gtn\") pod \"nova-cell1-novncproxy-0\" (UID: \"9081a403-c3ea-4613-a218-ab1ac1f1ed42\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.672763 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/442636ac-c001-4f18-8a37-d09c9b6e0dfe-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"442636ac-c001-4f18-8a37-d09c9b6e0dfe\") " pod="openstack/nova-scheduler-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.674985 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.776903 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/442636ac-c001-4f18-8a37-d09c9b6e0dfe-config-data\") pod \"nova-scheduler-0\" (UID: \"442636ac-c001-4f18-8a37-d09c9b6e0dfe\") " pod="openstack/nova-scheduler-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.778529 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8gtn\" (UniqueName: \"kubernetes.io/projected/9081a403-c3ea-4613-a218-ab1ac1f1ed42-kube-api-access-x8gtn\") pod \"nova-cell1-novncproxy-0\" (UID: \"9081a403-c3ea-4613-a218-ab1ac1f1ed42\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.778674 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/442636ac-c001-4f18-8a37-d09c9b6e0dfe-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"442636ac-c001-4f18-8a37-d09c9b6e0dfe\") " pod="openstack/nova-scheduler-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.778763 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33b5ea73-fb76-4d0e-875c-c3c124364550-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-jh528\" (UID: \"33b5ea73-fb76-4d0e-875c-c3c124364550\") " pod="openstack/dnsmasq-dns-865f5d856f-jh528" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.778872 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bbm9\" (UniqueName: \"kubernetes.io/projected/33b5ea73-fb76-4d0e-875c-c3c124364550-kube-api-access-4bbm9\") pod \"dnsmasq-dns-865f5d856f-jh528\" (UID: \"33b5ea73-fb76-4d0e-875c-c3c124364550\") " pod="openstack/dnsmasq-dns-865f5d856f-jh528" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.778976 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9081a403-c3ea-4613-a218-ab1ac1f1ed42-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9081a403-c3ea-4613-a218-ab1ac1f1ed42\") " 
pod="openstack/nova-cell1-novncproxy-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.779148 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33b5ea73-fb76-4d0e-875c-c3c124364550-dns-svc\") pod \"dnsmasq-dns-865f5d856f-jh528\" (UID: \"33b5ea73-fb76-4d0e-875c-c3c124364550\") " pod="openstack/dnsmasq-dns-865f5d856f-jh528" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.779263 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33b5ea73-fb76-4d0e-875c-c3c124364550-config\") pod \"dnsmasq-dns-865f5d856f-jh528\" (UID: \"33b5ea73-fb76-4d0e-875c-c3c124364550\") " pod="openstack/dnsmasq-dns-865f5d856f-jh528" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.779359 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rfj8\" (UniqueName: \"kubernetes.io/projected/442636ac-c001-4f18-8a37-d09c9b6e0dfe-kube-api-access-6rfj8\") pod \"nova-scheduler-0\" (UID: \"442636ac-c001-4f18-8a37-d09c9b6e0dfe\") " pod="openstack/nova-scheduler-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.779660 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33b5ea73-fb76-4d0e-875c-c3c124364550-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-jh528\" (UID: \"33b5ea73-fb76-4d0e-875c-c3c124364550\") " pod="openstack/dnsmasq-dns-865f5d856f-jh528" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.779862 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33b5ea73-fb76-4d0e-875c-c3c124364550-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-jh528\" (UID: \"33b5ea73-fb76-4d0e-875c-c3c124364550\") " pod="openstack/dnsmasq-dns-865f5d856f-jh528" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.779996 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9081a403-c3ea-4613-a218-ab1ac1f1ed42-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9081a403-c3ea-4613-a218-ab1ac1f1ed42\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.780314 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33b5ea73-fb76-4d0e-875c-c3c124364550-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-jh528\" (UID: \"33b5ea73-fb76-4d0e-875c-c3c124364550\") " pod="openstack/dnsmasq-dns-865f5d856f-jh528" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.782626 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33b5ea73-fb76-4d0e-875c-c3c124364550-config\") pod \"dnsmasq-dns-865f5d856f-jh528\" (UID: \"33b5ea73-fb76-4d0e-875c-c3c124364550\") " pod="openstack/dnsmasq-dns-865f5d856f-jh528" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.783097 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33b5ea73-fb76-4d0e-875c-c3c124364550-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-jh528\" (UID: \"33b5ea73-fb76-4d0e-875c-c3c124364550\") " pod="openstack/dnsmasq-dns-865f5d856f-jh528" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.784334 4954 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33b5ea73-fb76-4d0e-875c-c3c124364550-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-jh528\" (UID: \"33b5ea73-fb76-4d0e-875c-c3c124364550\") " pod="openstack/dnsmasq-dns-865f5d856f-jh528" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.784889 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33b5ea73-fb76-4d0e-875c-c3c124364550-dns-svc\") pod \"dnsmasq-dns-865f5d856f-jh528\" (UID: \"33b5ea73-fb76-4d0e-875c-c3c124364550\") " pod="openstack/dnsmasq-dns-865f5d856f-jh528" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.795282 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9081a403-c3ea-4613-a218-ab1ac1f1ed42-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9081a403-c3ea-4613-a218-ab1ac1f1ed42\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.795299 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/442636ac-c001-4f18-8a37-d09c9b6e0dfe-config-data\") pod \"nova-scheduler-0\" (UID: \"442636ac-c001-4f18-8a37-d09c9b6e0dfe\") " pod="openstack/nova-scheduler-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.805063 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rfj8\" (UniqueName: \"kubernetes.io/projected/442636ac-c001-4f18-8a37-d09c9b6e0dfe-kube-api-access-6rfj8\") pod \"nova-scheduler-0\" (UID: \"442636ac-c001-4f18-8a37-d09c9b6e0dfe\") " pod="openstack/nova-scheduler-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.805562 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/442636ac-c001-4f18-8a37-d09c9b6e0dfe-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"442636ac-c001-4f18-8a37-d09c9b6e0dfe\") " pod="openstack/nova-scheduler-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.808791 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9081a403-c3ea-4613-a218-ab1ac1f1ed42-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9081a403-c3ea-4613-a218-ab1ac1f1ed42\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.808888 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bbm9\" (UniqueName: \"kubernetes.io/projected/33b5ea73-fb76-4d0e-875c-c3c124364550-kube-api-access-4bbm9\") pod \"dnsmasq-dns-865f5d856f-jh528\" (UID: \"33b5ea73-fb76-4d0e-875c-c3c124364550\") " pod="openstack/dnsmasq-dns-865f5d856f-jh528" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.820243 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8gtn\" (UniqueName: \"kubernetes.io/projected/9081a403-c3ea-4613-a218-ab1ac1f1ed42-kube-api-access-x8gtn\") pod \"nova-cell1-novncproxy-0\" (UID: \"9081a403-c3ea-4613-a218-ab1ac1f1ed42\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.851029 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-jh528" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.871036 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 27 17:01:51 crc kubenswrapper[4954]: I1127 17:01:51.905072 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 27 17:01:52 crc kubenswrapper[4954]: I1127 17:01:52.047837 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-tw6zj"] Nov 27 17:01:52 crc kubenswrapper[4954]: I1127 17:01:52.181936 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-tw6zj" event={"ID":"ec7940bb-124f-4c0f-b9fd-471a32e4c3ef","Type":"ContainerStarted","Data":"8f1051470ad58e3fd5a18a2b9a54b99334a792c8f03020fcb62fe437c6108093"} Nov 27 17:01:52 crc kubenswrapper[4954]: I1127 17:01:52.248929 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qxwbp"] Nov 27 17:01:52 crc kubenswrapper[4954]: I1127 17:01:52.260966 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qxwbp"] Nov 27 17:01:52 crc kubenswrapper[4954]: I1127 17:01:52.261050 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qxwbp" Nov 27 17:01:52 crc kubenswrapper[4954]: I1127 17:01:52.263548 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Nov 27 17:01:52 crc kubenswrapper[4954]: I1127 17:01:52.263734 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Nov 27 17:01:52 crc kubenswrapper[4954]: I1127 17:01:52.334698 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 27 17:01:52 crc kubenswrapper[4954]: I1127 17:01:52.404392 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/167e2351-bc28-488d-86be-a3d038476c57-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-qxwbp\" (UID: \"167e2351-bc28-488d-86be-a3d038476c57\") " pod="openstack/nova-cell1-conductor-db-sync-qxwbp" Nov 27 17:01:52 crc kubenswrapper[4954]: I1127 17:01:52.404508 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5s6h\" (UniqueName: \"kubernetes.io/projected/167e2351-bc28-488d-86be-a3d038476c57-kube-api-access-s5s6h\") pod \"nova-cell1-conductor-db-sync-qxwbp\" (UID: \"167e2351-bc28-488d-86be-a3d038476c57\") " pod="openstack/nova-cell1-conductor-db-sync-qxwbp" Nov 27 17:01:52 crc kubenswrapper[4954]: I1127 17:01:52.404540 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/167e2351-bc28-488d-86be-a3d038476c57-config-data\") pod \"nova-cell1-conductor-db-sync-qxwbp\" (UID: \"167e2351-bc28-488d-86be-a3d038476c57\") " pod="openstack/nova-cell1-conductor-db-sync-qxwbp" Nov 27 17:01:52 crc kubenswrapper[4954]: I1127 17:01:52.404612 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/167e2351-bc28-488d-86be-a3d038476c57-scripts\") pod \"nova-cell1-conductor-db-sync-qxwbp\" (UID: \"167e2351-bc28-488d-86be-a3d038476c57\") " 
pod="openstack/nova-cell1-conductor-db-sync-qxwbp" Nov 27 17:01:52 crc kubenswrapper[4954]: I1127 17:01:52.507776 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5s6h\" (UniqueName: \"kubernetes.io/projected/167e2351-bc28-488d-86be-a3d038476c57-kube-api-access-s5s6h\") pod \"nova-cell1-conductor-db-sync-qxwbp\" (UID: \"167e2351-bc28-488d-86be-a3d038476c57\") " pod="openstack/nova-cell1-conductor-db-sync-qxwbp" Nov 27 17:01:52 crc kubenswrapper[4954]: I1127 17:01:52.508186 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/167e2351-bc28-488d-86be-a3d038476c57-config-data\") pod \"nova-cell1-conductor-db-sync-qxwbp\" (UID: \"167e2351-bc28-488d-86be-a3d038476c57\") " pod="openstack/nova-cell1-conductor-db-sync-qxwbp" Nov 27 17:01:52 crc kubenswrapper[4954]: I1127 17:01:52.508272 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/167e2351-bc28-488d-86be-a3d038476c57-scripts\") pod \"nova-cell1-conductor-db-sync-qxwbp\" (UID: \"167e2351-bc28-488d-86be-a3d038476c57\") " pod="openstack/nova-cell1-conductor-db-sync-qxwbp" Nov 27 17:01:52 crc kubenswrapper[4954]: I1127 17:01:52.508324 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/167e2351-bc28-488d-86be-a3d038476c57-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-qxwbp\" (UID: \"167e2351-bc28-488d-86be-a3d038476c57\") " pod="openstack/nova-cell1-conductor-db-sync-qxwbp" Nov 27 17:01:52 crc kubenswrapper[4954]: I1127 17:01:52.528607 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/167e2351-bc28-488d-86be-a3d038476c57-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-qxwbp\" (UID: \"167e2351-bc28-488d-86be-a3d038476c57\") " pod="openstack/nova-cell1-conductor-db-sync-qxwbp" Nov 27 17:01:52 crc kubenswrapper[4954]: I1127 17:01:52.532260 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5s6h\" (UniqueName: \"kubernetes.io/projected/167e2351-bc28-488d-86be-a3d038476c57-kube-api-access-s5s6h\") pod \"nova-cell1-conductor-db-sync-qxwbp\" (UID: \"167e2351-bc28-488d-86be-a3d038476c57\") " pod="openstack/nova-cell1-conductor-db-sync-qxwbp" Nov 27 17:01:52 crc kubenswrapper[4954]: I1127 17:01:52.535317 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/167e2351-bc28-488d-86be-a3d038476c57-config-data\") pod \"nova-cell1-conductor-db-sync-qxwbp\" (UID: \"167e2351-bc28-488d-86be-a3d038476c57\") " pod="openstack/nova-cell1-conductor-db-sync-qxwbp" Nov 27 17:01:52 crc kubenswrapper[4954]: I1127 17:01:52.537478 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/167e2351-bc28-488d-86be-a3d038476c57-scripts\") pod \"nova-cell1-conductor-db-sync-qxwbp\" (UID: \"167e2351-bc28-488d-86be-a3d038476c57\") " pod="openstack/nova-cell1-conductor-db-sync-qxwbp" Nov 27 17:01:52 crc kubenswrapper[4954]: I1127 17:01:52.571519 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 17:01:52 crc kubenswrapper[4954]: I1127 17:01:52.740884 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qxwbp" Nov 27 17:01:52 crc kubenswrapper[4954]: I1127 17:01:52.806708 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 27 17:01:52 crc kubenswrapper[4954]: I1127 17:01:52.815375 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 27 17:01:52 crc kubenswrapper[4954]: W1127 17:01:52.815992 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33b5ea73_fb76_4d0e_875c_c3c124364550.slice/crio-d8c9f59ea413c59523457d916a2c6cc2f27042266f1cdab6707637254f8560a9 WatchSource:0}: Error finding container d8c9f59ea413c59523457d916a2c6cc2f27042266f1cdab6707637254f8560a9: Status 404 returned error can't find the container with id d8c9f59ea413c59523457d916a2c6cc2f27042266f1cdab6707637254f8560a9 Nov 27 17:01:52 crc kubenswrapper[4954]: I1127 17:01:52.823866 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-jh528"] Nov 27 17:01:53 crc kubenswrapper[4954]: I1127 17:01:53.202492 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-tw6zj" event={"ID":"ec7940bb-124f-4c0f-b9fd-471a32e4c3ef","Type":"ContainerStarted","Data":"ef3ce88a1727514fb33d40dec9dbd723fc71d9e5018cb2b7fd0e8aa3cc02eea1"} Nov 27 17:01:53 crc kubenswrapper[4954]: I1127 17:01:53.205414 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"442636ac-c001-4f18-8a37-d09c9b6e0dfe","Type":"ContainerStarted","Data":"a0c157ca2aedad59a22c8b6ed86eefb387172f445cbabd1a1913c09ec70bd0ab"} Nov 27 17:01:53 crc kubenswrapper[4954]: I1127 17:01:53.211117 4954 generic.go:334] "Generic (PLEG): container finished" podID="33b5ea73-fb76-4d0e-875c-c3c124364550" containerID="d76c790328511d79c936c35ae4ffaa123f08ec0d48d36904b7bdac6f557660f1" exitCode=0 Nov 27 17:01:53 crc kubenswrapper[4954]: I1127 17:01:53.211322 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-jh528" event={"ID":"33b5ea73-fb76-4d0e-875c-c3c124364550","Type":"ContainerDied","Data":"d76c790328511d79c936c35ae4ffaa123f08ec0d48d36904b7bdac6f557660f1"} Nov 27 17:01:53 crc kubenswrapper[4954]: I1127 17:01:53.211368 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-jh528" event={"ID":"33b5ea73-fb76-4d0e-875c-c3c124364550","Type":"ContainerStarted","Data":"d8c9f59ea413c59523457d916a2c6cc2f27042266f1cdab6707637254f8560a9"} Nov 27 17:01:53 crc kubenswrapper[4954]: I1127 17:01:53.215544 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"74990131-c517-4f20-ba14-3b31d7adfe60","Type":"ContainerStarted","Data":"25b5d0655d421c41d30bde4719fcebb8b5d733176db8177819ab8657535e9fb9"} Nov 27 17:01:53 crc kubenswrapper[4954]: I1127 17:01:53.222026 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1817b937-0d8c-4409-b368-bbeb9482446a","Type":"ContainerStarted","Data":"8e4a9cbf3411505c5e2b47393353f13c8782fad0a03b40922820e4b552bafc36"} Nov 27 17:01:53 crc kubenswrapper[4954]: I1127 17:01:53.229328 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9081a403-c3ea-4613-a218-ab1ac1f1ed42","Type":"ContainerStarted","Data":"ce369a5cba3f028d77b449feecd63d85c863767f51b4e49dbb8fbfb19717ef6d"} Nov 27 17:01:53 crc kubenswrapper[4954]: I1127 
17:01:53.241733 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-tw6zj" podStartSLOduration=2.241712758 podStartE2EDuration="2.241712758s" podCreationTimestamp="2025-11-27 17:01:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 17:01:53.226416417 +0000 UTC m=+1425.243856717" watchObservedRunningTime="2025-11-27 17:01:53.241712758 +0000 UTC m=+1425.259153058" Nov 27 17:01:53 crc kubenswrapper[4954]: W1127 17:01:53.264276 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod167e2351_bc28_488d_86be_a3d038476c57.slice/crio-a21063e91649df8b0210e1ca397346b98bb39bd7376c58fc62efc1f0285a296f WatchSource:0}: Error finding container a21063e91649df8b0210e1ca397346b98bb39bd7376c58fc62efc1f0285a296f: Status 404 returned error can't find the container with id a21063e91649df8b0210e1ca397346b98bb39bd7376c58fc62efc1f0285a296f Nov 27 17:01:53 crc kubenswrapper[4954]: I1127 17:01:53.283309 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qxwbp"] Nov 27 17:01:53 crc kubenswrapper[4954]: I1127 17:01:53.687423 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 17:01:53 crc kubenswrapper[4954]: I1127 17:01:53.687964 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 17:01:54 crc kubenswrapper[4954]: I1127 17:01:54.243155 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qxwbp" event={"ID":"167e2351-bc28-488d-86be-a3d038476c57","Type":"ContainerStarted","Data":"778690843a6a8a382fe3b79b4f2d8c36249677f4882933416f90c8b59bed81bc"} Nov 27 17:01:54 crc kubenswrapper[4954]: I1127 17:01:54.243200 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qxwbp" event={"ID":"167e2351-bc28-488d-86be-a3d038476c57","Type":"ContainerStarted","Data":"a21063e91649df8b0210e1ca397346b98bb39bd7376c58fc62efc1f0285a296f"} Nov 27 17:01:54 crc kubenswrapper[4954]: I1127 17:01:54.255199 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-jh528" event={"ID":"33b5ea73-fb76-4d0e-875c-c3c124364550","Type":"ContainerStarted","Data":"cb8c9f9e96ef42abc149b49188e811c2f100693550e38053908439bc537130ae"} Nov 27 17:01:54 crc kubenswrapper[4954]: I1127 17:01:54.268324 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-qxwbp" podStartSLOduration=2.268300885 podStartE2EDuration="2.268300885s" podCreationTimestamp="2025-11-27 17:01:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 17:01:54.267512137 +0000 UTC m=+1426.284952447" watchObservedRunningTime="2025-11-27 17:01:54.268300885 +0000 UTC m=+1426.285741175" Nov 27 17:01:54 crc 
kubenswrapper[4954]: I1127 17:01:54.346907 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-865f5d856f-jh528" podStartSLOduration=3.346879082 podStartE2EDuration="3.346879082s" podCreationTimestamp="2025-11-27 17:01:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 17:01:54.306513122 +0000 UTC m=+1426.323953422" watchObservedRunningTime="2025-11-27 17:01:54.346879082 +0000 UTC m=+1426.364319382" Nov 27 17:01:54 crc kubenswrapper[4954]: E1127 17:01:54.732557 4954 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0db2964c_faef_4154_b502_1231f6762e37.slice\": RecentStats: unable to find data in memory cache]" Nov 27 17:01:55 crc kubenswrapper[4954]: I1127 17:01:55.059213 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 17:01:55 crc kubenswrapper[4954]: I1127 17:01:55.074178 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 27 17:01:55 crc kubenswrapper[4954]: I1127 17:01:55.264134 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-865f5d856f-jh528" Nov 27 17:01:57 crc kubenswrapper[4954]: I1127 17:01:57.304776 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9081a403-c3ea-4613-a218-ab1ac1f1ed42","Type":"ContainerStarted","Data":"468aa50e8c00e4a930369314581826099902622a493f3b92719a130c21762294"} Nov 27 17:01:57 crc kubenswrapper[4954]: I1127 17:01:57.305063 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="9081a403-c3ea-4613-a218-ab1ac1f1ed42" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://468aa50e8c00e4a930369314581826099902622a493f3b92719a130c21762294" gracePeriod=30 Nov 27 17:01:57 crc kubenswrapper[4954]: I1127 17:01:57.311180 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"442636ac-c001-4f18-8a37-d09c9b6e0dfe","Type":"ContainerStarted","Data":"1ab874324ed7415f3e971d83dd2a9b2248d12e611124c9066216673dc8103f45"} Nov 27 17:01:57 crc kubenswrapper[4954]: I1127 17:01:57.329154 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.259099741 podStartE2EDuration="6.329132128s" podCreationTimestamp="2025-11-27 17:01:51 +0000 UTC" firstStartedPulling="2025-11-27 17:01:52.815151219 +0000 UTC m=+1424.832591519" lastFinishedPulling="2025-11-27 17:01:56.885183606 +0000 UTC m=+1428.902623906" observedRunningTime="2025-11-27 17:01:57.320652092 +0000 UTC m=+1429.338092412" watchObservedRunningTime="2025-11-27 17:01:57.329132128 +0000 UTC m=+1429.346572428" Nov 27 17:01:57 crc kubenswrapper[4954]: I1127 17:01:57.350548 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.275429255 podStartE2EDuration="6.350525796s" podCreationTimestamp="2025-11-27 17:01:51 +0000 UTC" firstStartedPulling="2025-11-27 17:01:52.810762752 +0000 UTC m=+1424.828203052" lastFinishedPulling="2025-11-27 17:01:56.885859303 +0000 UTC m=+1428.903299593" observedRunningTime="2025-11-27 17:01:57.340450012 +0000 UTC m=+1429.357890312" watchObservedRunningTime="2025-11-27 
17:01:57.350525796 +0000 UTC m=+1429.367966096" Nov 27 17:01:58 crc kubenswrapper[4954]: I1127 17:01:58.323152 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"74990131-c517-4f20-ba14-3b31d7adfe60","Type":"ContainerStarted","Data":"6f2b421e46aef5c5f45bacbe06d8d2c9961dea51544512219709d948fa6a437e"} Nov 27 17:01:58 crc kubenswrapper[4954]: I1127 17:01:58.323541 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"74990131-c517-4f20-ba14-3b31d7adfe60","Type":"ContainerStarted","Data":"aa8fdd7c5d738dfcc74218c676e23b2fe289e63e116c912dc603bb4fc981bcb2"} Nov 27 17:01:58 crc kubenswrapper[4954]: I1127 17:01:58.323309 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="74990131-c517-4f20-ba14-3b31d7adfe60" containerName="nova-metadata-log" containerID="cri-o://6f2b421e46aef5c5f45bacbe06d8d2c9961dea51544512219709d948fa6a437e" gracePeriod=30 Nov 27 17:01:58 crc kubenswrapper[4954]: I1127 17:01:58.323706 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="74990131-c517-4f20-ba14-3b31d7adfe60" containerName="nova-metadata-metadata" containerID="cri-o://aa8fdd7c5d738dfcc74218c676e23b2fe289e63e116c912dc603bb4fc981bcb2" gracePeriod=30 Nov 27 17:01:58 crc kubenswrapper[4954]: I1127 17:01:58.326896 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1817b937-0d8c-4409-b368-bbeb9482446a","Type":"ContainerStarted","Data":"6a9403d61bcd8b3dc18e2526fbb06088d65d4e29e63d9c5274f8626513548ed0"} Nov 27 17:01:58 crc kubenswrapper[4954]: I1127 17:01:58.326929 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1817b937-0d8c-4409-b368-bbeb9482446a","Type":"ContainerStarted","Data":"e8566ffd34f602562600b8c86c0a06660adc9ddf8997a5f3ac7b3cd41941d16e"} Nov 27 17:01:58 crc kubenswrapper[4954]: I1127 17:01:58.351499 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.04512036 podStartE2EDuration="7.351480422s" podCreationTimestamp="2025-11-27 17:01:51 +0000 UTC" firstStartedPulling="2025-11-27 17:01:52.58851426 +0000 UTC m=+1424.605954560" lastFinishedPulling="2025-11-27 17:01:56.894874322 +0000 UTC m=+1428.912314622" observedRunningTime="2025-11-27 17:01:58.344039712 +0000 UTC m=+1430.361480012" watchObservedRunningTime="2025-11-27 17:01:58.351480422 +0000 UTC m=+1430.368920722" Nov 27 17:01:58 crc kubenswrapper[4954]: I1127 17:01:58.372710 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.853788488 podStartE2EDuration="7.372694937s" podCreationTimestamp="2025-11-27 17:01:51 +0000 UTC" firstStartedPulling="2025-11-27 17:01:52.370112141 +0000 UTC m=+1424.387552441" lastFinishedPulling="2025-11-27 17:01:56.88901859 +0000 UTC m=+1428.906458890" observedRunningTime="2025-11-27 17:01:58.368201148 +0000 UTC m=+1430.385641438" watchObservedRunningTime="2025-11-27 17:01:58.372694937 +0000 UTC m=+1430.390135237" Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.042496 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.164599 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9pcf\" (UniqueName: \"kubernetes.io/projected/74990131-c517-4f20-ba14-3b31d7adfe60-kube-api-access-v9pcf\") pod \"74990131-c517-4f20-ba14-3b31d7adfe60\" (UID: \"74990131-c517-4f20-ba14-3b31d7adfe60\") " Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.165058 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74990131-c517-4f20-ba14-3b31d7adfe60-config-data\") pod \"74990131-c517-4f20-ba14-3b31d7adfe60\" (UID: \"74990131-c517-4f20-ba14-3b31d7adfe60\") " Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.165119 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74990131-c517-4f20-ba14-3b31d7adfe60-combined-ca-bundle\") pod \"74990131-c517-4f20-ba14-3b31d7adfe60\" (UID: \"74990131-c517-4f20-ba14-3b31d7adfe60\") " Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.165333 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74990131-c517-4f20-ba14-3b31d7adfe60-logs\") pod \"74990131-c517-4f20-ba14-3b31d7adfe60\" (UID: \"74990131-c517-4f20-ba14-3b31d7adfe60\") " Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.166010 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74990131-c517-4f20-ba14-3b31d7adfe60-logs" (OuterVolumeSpecName: "logs") pod "74990131-c517-4f20-ba14-3b31d7adfe60" (UID: "74990131-c517-4f20-ba14-3b31d7adfe60"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.174203 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74990131-c517-4f20-ba14-3b31d7adfe60-kube-api-access-v9pcf" (OuterVolumeSpecName: "kube-api-access-v9pcf") pod "74990131-c517-4f20-ba14-3b31d7adfe60" (UID: "74990131-c517-4f20-ba14-3b31d7adfe60"). InnerVolumeSpecName "kube-api-access-v9pcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.205801 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74990131-c517-4f20-ba14-3b31d7adfe60-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74990131-c517-4f20-ba14-3b31d7adfe60" (UID: "74990131-c517-4f20-ba14-3b31d7adfe60"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.221698 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74990131-c517-4f20-ba14-3b31d7adfe60-config-data" (OuterVolumeSpecName: "config-data") pod "74990131-c517-4f20-ba14-3b31d7adfe60" (UID: "74990131-c517-4f20-ba14-3b31d7adfe60"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.267690 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74990131-c517-4f20-ba14-3b31d7adfe60-logs\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.267736 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9pcf\" (UniqueName: \"kubernetes.io/projected/74990131-c517-4f20-ba14-3b31d7adfe60-kube-api-access-v9pcf\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.267758 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74990131-c517-4f20-ba14-3b31d7adfe60-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.267774 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74990131-c517-4f20-ba14-3b31d7adfe60-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.335506 4954 generic.go:334] "Generic (PLEG): container finished" podID="74990131-c517-4f20-ba14-3b31d7adfe60" containerID="aa8fdd7c5d738dfcc74218c676e23b2fe289e63e116c912dc603bb4fc981bcb2" exitCode=0 Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.335549 4954 generic.go:334] "Generic (PLEG): container finished" podID="74990131-c517-4f20-ba14-3b31d7adfe60" containerID="6f2b421e46aef5c5f45bacbe06d8d2c9961dea51544512219709d948fa6a437e" exitCode=143 Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.335558 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"74990131-c517-4f20-ba14-3b31d7adfe60","Type":"ContainerDied","Data":"aa8fdd7c5d738dfcc74218c676e23b2fe289e63e116c912dc603bb4fc981bcb2"} Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.335612 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.335657 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"74990131-c517-4f20-ba14-3b31d7adfe60","Type":"ContainerDied","Data":"6f2b421e46aef5c5f45bacbe06d8d2c9961dea51544512219709d948fa6a437e"} Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.335671 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"74990131-c517-4f20-ba14-3b31d7adfe60","Type":"ContainerDied","Data":"25b5d0655d421c41d30bde4719fcebb8b5d733176db8177819ab8657535e9fb9"} Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.335698 4954 scope.go:117] "RemoveContainer" containerID="aa8fdd7c5d738dfcc74218c676e23b2fe289e63e116c912dc603bb4fc981bcb2" Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.376693 4954 scope.go:117] "RemoveContainer" containerID="6f2b421e46aef5c5f45bacbe06d8d2c9961dea51544512219709d948fa6a437e" Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.383247 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.391312 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.416762 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.416948 4954 scope.go:117] "RemoveContainer" containerID="aa8fdd7c5d738dfcc74218c676e23b2fe289e63e116c912dc603bb4fc981bcb2" Nov 27 17:01:59 crc kubenswrapper[4954]: E1127 17:01:59.418523 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74990131-c517-4f20-ba14-3b31d7adfe60" containerName="nova-metadata-log" Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.418572 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="74990131-c517-4f20-ba14-3b31d7adfe60" containerName="nova-metadata-log" Nov 27 17:01:59 crc kubenswrapper[4954]: E1127 17:01:59.418610 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74990131-c517-4f20-ba14-3b31d7adfe60" containerName="nova-metadata-metadata" Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.418620 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="74990131-c517-4f20-ba14-3b31d7adfe60" containerName="nova-metadata-metadata" Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.418884 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="74990131-c517-4f20-ba14-3b31d7adfe60" containerName="nova-metadata-metadata" Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.418916 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="74990131-c517-4f20-ba14-3b31d7adfe60" containerName="nova-metadata-log" Nov 27 17:01:59 crc kubenswrapper[4954]: E1127 17:01:59.419861 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa8fdd7c5d738dfcc74218c676e23b2fe289e63e116c912dc603bb4fc981bcb2\": container with ID starting with aa8fdd7c5d738dfcc74218c676e23b2fe289e63e116c912dc603bb4fc981bcb2 not found: ID does not exist" containerID="aa8fdd7c5d738dfcc74218c676e23b2fe289e63e116c912dc603bb4fc981bcb2" Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.419897 4954 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"aa8fdd7c5d738dfcc74218c676e23b2fe289e63e116c912dc603bb4fc981bcb2"} err="failed to get container status \"aa8fdd7c5d738dfcc74218c676e23b2fe289e63e116c912dc603bb4fc981bcb2\": rpc error: code = NotFound desc = could not find container \"aa8fdd7c5d738dfcc74218c676e23b2fe289e63e116c912dc603bb4fc981bcb2\": container with ID starting with aa8fdd7c5d738dfcc74218c676e23b2fe289e63e116c912dc603bb4fc981bcb2 not found: ID does not exist" Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.419923 4954 scope.go:117] "RemoveContainer" containerID="6f2b421e46aef5c5f45bacbe06d8d2c9961dea51544512219709d948fa6a437e" Nov 27 17:01:59 crc kubenswrapper[4954]: E1127 17:01:59.423357 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f2b421e46aef5c5f45bacbe06d8d2c9961dea51544512219709d948fa6a437e\": container with ID starting with 6f2b421e46aef5c5f45bacbe06d8d2c9961dea51544512219709d948fa6a437e not found: ID does not exist" containerID="6f2b421e46aef5c5f45bacbe06d8d2c9961dea51544512219709d948fa6a437e" Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.423394 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f2b421e46aef5c5f45bacbe06d8d2c9961dea51544512219709d948fa6a437e"} err="failed to get container status \"6f2b421e46aef5c5f45bacbe06d8d2c9961dea51544512219709d948fa6a437e\": rpc error: code = NotFound desc = could not find container \"6f2b421e46aef5c5f45bacbe06d8d2c9961dea51544512219709d948fa6a437e\": container with ID starting with 6f2b421e46aef5c5f45bacbe06d8d2c9961dea51544512219709d948fa6a437e not found: ID does not exist" Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.423409 4954 scope.go:117] "RemoveContainer" containerID="aa8fdd7c5d738dfcc74218c676e23b2fe289e63e116c912dc603bb4fc981bcb2" Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.423843 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa8fdd7c5d738dfcc74218c676e23b2fe289e63e116c912dc603bb4fc981bcb2"} err="failed to get container status \"aa8fdd7c5d738dfcc74218c676e23b2fe289e63e116c912dc603bb4fc981bcb2\": rpc error: code = NotFound desc = could not find container \"aa8fdd7c5d738dfcc74218c676e23b2fe289e63e116c912dc603bb4fc981bcb2\": container with ID starting with aa8fdd7c5d738dfcc74218c676e23b2fe289e63e116c912dc603bb4fc981bcb2 not found: ID does not exist" Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.423858 4954 scope.go:117] "RemoveContainer" containerID="6f2b421e46aef5c5f45bacbe06d8d2c9961dea51544512219709d948fa6a437e" Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.424802 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.425618 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f2b421e46aef5c5f45bacbe06d8d2c9961dea51544512219709d948fa6a437e"} err="failed to get container status \"6f2b421e46aef5c5f45bacbe06d8d2c9961dea51544512219709d948fa6a437e\": rpc error: code = NotFound desc = could not find container \"6f2b421e46aef5c5f45bacbe06d8d2c9961dea51544512219709d948fa6a437e\": container with ID starting with 6f2b421e46aef5c5f45bacbe06d8d2c9961dea51544512219709d948fa6a437e not found: ID does not exist" Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.427626 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.427826 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.455168 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.471861 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4h7l\" (UniqueName: \"kubernetes.io/projected/04878fbe-4975-407d-a44a-66f5e3e80916-kube-api-access-l4h7l\") pod \"nova-metadata-0\" (UID: \"04878fbe-4975-407d-a44a-66f5e3e80916\") " pod="openstack/nova-metadata-0" Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.472125 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/04878fbe-4975-407d-a44a-66f5e3e80916-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"04878fbe-4975-407d-a44a-66f5e3e80916\") " pod="openstack/nova-metadata-0" Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.472271 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04878fbe-4975-407d-a44a-66f5e3e80916-logs\") pod \"nova-metadata-0\" (UID: \"04878fbe-4975-407d-a44a-66f5e3e80916\") " pod="openstack/nova-metadata-0" Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.472370 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04878fbe-4975-407d-a44a-66f5e3e80916-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"04878fbe-4975-407d-a44a-66f5e3e80916\") " pod="openstack/nova-metadata-0" Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.472769 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04878fbe-4975-407d-a44a-66f5e3e80916-config-data\") pod \"nova-metadata-0\" (UID: \"04878fbe-4975-407d-a44a-66f5e3e80916\") " pod="openstack/nova-metadata-0" Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.575721 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04878fbe-4975-407d-a44a-66f5e3e80916-config-data\") pod \"nova-metadata-0\" (UID: \"04878fbe-4975-407d-a44a-66f5e3e80916\") " pod="openstack/nova-metadata-0" Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.575835 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-l4h7l\" (UniqueName: \"kubernetes.io/projected/04878fbe-4975-407d-a44a-66f5e3e80916-kube-api-access-l4h7l\") pod \"nova-metadata-0\" (UID: \"04878fbe-4975-407d-a44a-66f5e3e80916\") " pod="openstack/nova-metadata-0" Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.575884 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/04878fbe-4975-407d-a44a-66f5e3e80916-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"04878fbe-4975-407d-a44a-66f5e3e80916\") " pod="openstack/nova-metadata-0" Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.575955 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04878fbe-4975-407d-a44a-66f5e3e80916-logs\") pod \"nova-metadata-0\" (UID: \"04878fbe-4975-407d-a44a-66f5e3e80916\") " pod="openstack/nova-metadata-0" Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.575998 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04878fbe-4975-407d-a44a-66f5e3e80916-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"04878fbe-4975-407d-a44a-66f5e3e80916\") " pod="openstack/nova-metadata-0" Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.577517 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04878fbe-4975-407d-a44a-66f5e3e80916-logs\") pod \"nova-metadata-0\" (UID: \"04878fbe-4975-407d-a44a-66f5e3e80916\") " pod="openstack/nova-metadata-0" Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.580364 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04878fbe-4975-407d-a44a-66f5e3e80916-config-data\") pod \"nova-metadata-0\" (UID: \"04878fbe-4975-407d-a44a-66f5e3e80916\") " pod="openstack/nova-metadata-0" Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.591137 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/04878fbe-4975-407d-a44a-66f5e3e80916-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"04878fbe-4975-407d-a44a-66f5e3e80916\") " pod="openstack/nova-metadata-0" Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.593536 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04878fbe-4975-407d-a44a-66f5e3e80916-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"04878fbe-4975-407d-a44a-66f5e3e80916\") " pod="openstack/nova-metadata-0" Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.595267 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4h7l\" (UniqueName: \"kubernetes.io/projected/04878fbe-4975-407d-a44a-66f5e3e80916-kube-api-access-l4h7l\") pod \"nova-metadata-0\" (UID: \"04878fbe-4975-407d-a44a-66f5e3e80916\") " pod="openstack/nova-metadata-0" Nov 27 17:01:59 crc kubenswrapper[4954]: I1127 17:01:59.742954 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 27 17:02:00 crc kubenswrapper[4954]: I1127 17:02:00.274966 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 17:02:00 crc kubenswrapper[4954]: W1127 17:02:00.285548 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04878fbe_4975_407d_a44a_66f5e3e80916.slice/crio-89e46598af4c8e9e64e15a502d47b426f4daecd4d515fe316b1e1cc075f9a82a WatchSource:0}: Error finding container 89e46598af4c8e9e64e15a502d47b426f4daecd4d515fe316b1e1cc075f9a82a: Status 404 returned error can't find the container with id 89e46598af4c8e9e64e15a502d47b426f4daecd4d515fe316b1e1cc075f9a82a Nov 27 17:02:00 crc kubenswrapper[4954]: I1127 17:02:00.353837 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"04878fbe-4975-407d-a44a-66f5e3e80916","Type":"ContainerStarted","Data":"89e46598af4c8e9e64e15a502d47b426f4daecd4d515fe316b1e1cc075f9a82a"} Nov 27 17:02:00 crc kubenswrapper[4954]: I1127 17:02:00.675313 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74990131-c517-4f20-ba14-3b31d7adfe60" path="/var/lib/kubelet/pods/74990131-c517-4f20-ba14-3b31d7adfe60/volumes" Nov 27 17:02:01 crc kubenswrapper[4954]: I1127 17:02:01.323258 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 27 17:02:01 crc kubenswrapper[4954]: I1127 17:02:01.377449 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"04878fbe-4975-407d-a44a-66f5e3e80916","Type":"ContainerStarted","Data":"9c266269e3cfa03be866bf4d238174e49c8b1c4747ba5be695cb15cf05fafad3"} Nov 27 17:02:01 crc kubenswrapper[4954]: I1127 17:02:01.378018 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"04878fbe-4975-407d-a44a-66f5e3e80916","Type":"ContainerStarted","Data":"2688ed95bd45521f2c1ce409bdd62c9ae7f04816eecd9f8f4d2da8ed9833ac77"} Nov 27 17:02:01 crc kubenswrapper[4954]: I1127 17:02:01.430512 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.430482155 podStartE2EDuration="2.430482155s" podCreationTimestamp="2025-11-27 17:01:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 17:02:01.416823004 +0000 UTC m=+1433.434263304" watchObservedRunningTime="2025-11-27 17:02:01.430482155 +0000 UTC m=+1433.447922455" Nov 27 17:02:01 crc kubenswrapper[4954]: I1127 17:02:01.551276 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 27 17:02:01 crc kubenswrapper[4954]: I1127 17:02:01.551322 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 27 17:02:01 crc kubenswrapper[4954]: I1127 17:02:01.853837 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-865f5d856f-jh528" Nov 27 17:02:01 crc kubenswrapper[4954]: I1127 17:02:01.871914 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 27 17:02:01 crc kubenswrapper[4954]: I1127 17:02:01.871970 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 27 17:02:01 crc kubenswrapper[4954]: I1127 17:02:01.903635 4954 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 27 17:02:01 crc kubenswrapper[4954]: I1127 17:02:01.907213 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Nov 27 17:02:01 crc kubenswrapper[4954]: I1127 17:02:01.929733 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-bp5cb"] Nov 27 17:02:01 crc kubenswrapper[4954]: I1127 17:02:01.929972 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bb4fc677f-bp5cb" podUID="6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622" containerName="dnsmasq-dns" containerID="cri-o://32788bed775a0ee391ceae4acaebc9d6f12eee80046f5dd9d126e2fbdb50616c" gracePeriod=10 Nov 27 17:02:02 crc kubenswrapper[4954]: I1127 17:02:02.398973 4954 generic.go:334] "Generic (PLEG): container finished" podID="ec7940bb-124f-4c0f-b9fd-471a32e4c3ef" containerID="ef3ce88a1727514fb33d40dec9dbd723fc71d9e5018cb2b7fd0e8aa3cc02eea1" exitCode=0 Nov 27 17:02:02 crc kubenswrapper[4954]: I1127 17:02:02.399087 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-tw6zj" event={"ID":"ec7940bb-124f-4c0f-b9fd-471a32e4c3ef","Type":"ContainerDied","Data":"ef3ce88a1727514fb33d40dec9dbd723fc71d9e5018cb2b7fd0e8aa3cc02eea1"} Nov 27 17:02:02 crc kubenswrapper[4954]: I1127 17:02:02.404771 4954 generic.go:334] "Generic (PLEG): container finished" podID="6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622" containerID="32788bed775a0ee391ceae4acaebc9d6f12eee80046f5dd9d126e2fbdb50616c" exitCode=0 Nov 27 17:02:02 crc kubenswrapper[4954]: I1127 17:02:02.404866 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-bp5cb" event={"ID":"6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622","Type":"ContainerDied","Data":"32788bed775a0ee391ceae4acaebc9d6f12eee80046f5dd9d126e2fbdb50616c"} Nov 27 17:02:02 crc kubenswrapper[4954]: I1127 17:02:02.453183 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 27 17:02:02 crc kubenswrapper[4954]: I1127 17:02:02.498155 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-bp5cb" Nov 27 17:02:02 crc kubenswrapper[4954]: I1127 17:02:02.539974 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622-dns-svc\") pod \"6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622\" (UID: \"6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622\") " Nov 27 17:02:02 crc kubenswrapper[4954]: I1127 17:02:02.540161 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622-dns-swift-storage-0\") pod \"6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622\" (UID: \"6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622\") " Nov 27 17:02:02 crc kubenswrapper[4954]: I1127 17:02:02.540343 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622-ovsdbserver-sb\") pod \"6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622\" (UID: \"6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622\") " Nov 27 17:02:02 crc kubenswrapper[4954]: I1127 17:02:02.540463 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622-ovsdbserver-nb\") pod \"6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622\" (UID: \"6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622\") " Nov 27 17:02:02 crc kubenswrapper[4954]: I1127 17:02:02.540562 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622-config\") pod \"6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622\" (UID: \"6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622\") " Nov 27 17:02:02 crc kubenswrapper[4954]: I1127 17:02:02.540698 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z58cj\" (UniqueName: \"kubernetes.io/projected/6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622-kube-api-access-z58cj\") pod \"6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622\" (UID: \"6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622\") " Nov 27 17:02:02 crc kubenswrapper[4954]: I1127 17:02:02.551212 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1817b937-0d8c-4409-b368-bbeb9482446a" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.185:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 27 17:02:02 crc kubenswrapper[4954]: I1127 17:02:02.551318 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1817b937-0d8c-4409-b368-bbeb9482446a" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.185:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 27 17:02:02 crc kubenswrapper[4954]: I1127 17:02:02.552485 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622-kube-api-access-z58cj" (OuterVolumeSpecName: "kube-api-access-z58cj") pod "6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622" (UID: "6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622"). InnerVolumeSpecName "kube-api-access-z58cj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:02:02 crc kubenswrapper[4954]: I1127 17:02:02.607285 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622" (UID: "6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:02:02 crc kubenswrapper[4954]: I1127 17:02:02.616530 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622" (UID: "6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:02:02 crc kubenswrapper[4954]: I1127 17:02:02.626897 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622-config" (OuterVolumeSpecName: "config") pod "6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622" (UID: "6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:02:02 crc kubenswrapper[4954]: I1127 17:02:02.642806 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:02 crc kubenswrapper[4954]: I1127 17:02:02.642844 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622-config\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:02 crc kubenswrapper[4954]: I1127 17:02:02.642858 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z58cj\" (UniqueName: \"kubernetes.io/projected/6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622-kube-api-access-z58cj\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:02 crc kubenswrapper[4954]: I1127 17:02:02.642869 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:02 crc kubenswrapper[4954]: I1127 17:02:02.644323 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622" (UID: "6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:02:02 crc kubenswrapper[4954]: I1127 17:02:02.702935 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622" (UID: "6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:02:02 crc kubenswrapper[4954]: I1127 17:02:02.745558 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:02 crc kubenswrapper[4954]: I1127 17:02:02.745607 4954 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:03 crc kubenswrapper[4954]: I1127 17:02:03.419079 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-bp5cb" Nov 27 17:02:03 crc kubenswrapper[4954]: I1127 17:02:03.419128 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-bp5cb" event={"ID":"6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622","Type":"ContainerDied","Data":"d9c8a63769220c1cfb06f6220d5194b5be77176c9ad09c8a9daad9f2aa3c7455"} Nov 27 17:02:03 crc kubenswrapper[4954]: I1127 17:02:03.419190 4954 scope.go:117] "RemoveContainer" containerID="32788bed775a0ee391ceae4acaebc9d6f12eee80046f5dd9d126e2fbdb50616c" Nov 27 17:02:03 crc kubenswrapper[4954]: I1127 17:02:03.466751 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-bp5cb"] Nov 27 17:02:03 crc kubenswrapper[4954]: I1127 17:02:03.474629 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-bp5cb"] Nov 27 17:02:03 crc kubenswrapper[4954]: I1127 17:02:03.476447 4954 scope.go:117] "RemoveContainer" containerID="9c743ef97060da9fb87c2fb358eb8560978969d54e6dedf0927a940fd489e3d9" Nov 27 17:02:03 crc kubenswrapper[4954]: I1127 17:02:03.885539 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-tw6zj" Nov 27 17:02:03 crc kubenswrapper[4954]: I1127 17:02:03.971654 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec7940bb-124f-4c0f-b9fd-471a32e4c3ef-combined-ca-bundle\") pod \"ec7940bb-124f-4c0f-b9fd-471a32e4c3ef\" (UID: \"ec7940bb-124f-4c0f-b9fd-471a32e4c3ef\") " Nov 27 17:02:03 crc kubenswrapper[4954]: I1127 17:02:03.971725 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrg6l\" (UniqueName: \"kubernetes.io/projected/ec7940bb-124f-4c0f-b9fd-471a32e4c3ef-kube-api-access-wrg6l\") pod \"ec7940bb-124f-4c0f-b9fd-471a32e4c3ef\" (UID: \"ec7940bb-124f-4c0f-b9fd-471a32e4c3ef\") " Nov 27 17:02:03 crc kubenswrapper[4954]: I1127 17:02:03.971894 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec7940bb-124f-4c0f-b9fd-471a32e4c3ef-scripts\") pod \"ec7940bb-124f-4c0f-b9fd-471a32e4c3ef\" (UID: \"ec7940bb-124f-4c0f-b9fd-471a32e4c3ef\") " Nov 27 17:02:03 crc kubenswrapper[4954]: I1127 17:02:03.971992 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec7940bb-124f-4c0f-b9fd-471a32e4c3ef-config-data\") pod \"ec7940bb-124f-4c0f-b9fd-471a32e4c3ef\" (UID: \"ec7940bb-124f-4c0f-b9fd-471a32e4c3ef\") " Nov 27 17:02:03 crc kubenswrapper[4954]: I1127 17:02:03.990313 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec7940bb-124f-4c0f-b9fd-471a32e4c3ef-kube-api-access-wrg6l" (OuterVolumeSpecName: "kube-api-access-wrg6l") pod "ec7940bb-124f-4c0f-b9fd-471a32e4c3ef" (UID: "ec7940bb-124f-4c0f-b9fd-471a32e4c3ef"). InnerVolumeSpecName "kube-api-access-wrg6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:02:04 crc kubenswrapper[4954]: I1127 17:02:04.007073 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec7940bb-124f-4c0f-b9fd-471a32e4c3ef-scripts" (OuterVolumeSpecName: "scripts") pod "ec7940bb-124f-4c0f-b9fd-471a32e4c3ef" (UID: "ec7940bb-124f-4c0f-b9fd-471a32e4c3ef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:02:04 crc kubenswrapper[4954]: I1127 17:02:04.013027 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec7940bb-124f-4c0f-b9fd-471a32e4c3ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec7940bb-124f-4c0f-b9fd-471a32e4c3ef" (UID: "ec7940bb-124f-4c0f-b9fd-471a32e4c3ef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:02:04 crc kubenswrapper[4954]: I1127 17:02:04.014673 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec7940bb-124f-4c0f-b9fd-471a32e4c3ef-config-data" (OuterVolumeSpecName: "config-data") pod "ec7940bb-124f-4c0f-b9fd-471a32e4c3ef" (UID: "ec7940bb-124f-4c0f-b9fd-471a32e4c3ef"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:02:04 crc kubenswrapper[4954]: I1127 17:02:04.074288 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec7940bb-124f-4c0f-b9fd-471a32e4c3ef-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:04 crc kubenswrapper[4954]: I1127 17:02:04.074327 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec7940bb-124f-4c0f-b9fd-471a32e4c3ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:04 crc kubenswrapper[4954]: I1127 17:02:04.074341 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrg6l\" (UniqueName: \"kubernetes.io/projected/ec7940bb-124f-4c0f-b9fd-471a32e4c3ef-kube-api-access-wrg6l\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:04 crc kubenswrapper[4954]: I1127 17:02:04.074349 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec7940bb-124f-4c0f-b9fd-471a32e4c3ef-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:04 crc kubenswrapper[4954]: I1127 17:02:04.428897 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-tw6zj" event={"ID":"ec7940bb-124f-4c0f-b9fd-471a32e4c3ef","Type":"ContainerDied","Data":"8f1051470ad58e3fd5a18a2b9a54b99334a792c8f03020fcb62fe437c6108093"} Nov 27 17:02:04 crc kubenswrapper[4954]: I1127 17:02:04.429818 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f1051470ad58e3fd5a18a2b9a54b99334a792c8f03020fcb62fe437c6108093" Nov 27 17:02:04 crc kubenswrapper[4954]: I1127 17:02:04.429179 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-tw6zj" Nov 27 17:02:04 crc kubenswrapper[4954]: I1127 17:02:04.432399 4954 generic.go:334] "Generic (PLEG): container finished" podID="167e2351-bc28-488d-86be-a3d038476c57" containerID="778690843a6a8a382fe3b79b4f2d8c36249677f4882933416f90c8b59bed81bc" exitCode=0 Nov 27 17:02:04 crc kubenswrapper[4954]: I1127 17:02:04.432434 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qxwbp" event={"ID":"167e2351-bc28-488d-86be-a3d038476c57","Type":"ContainerDied","Data":"778690843a6a8a382fe3b79b4f2d8c36249677f4882933416f90c8b59bed81bc"} Nov 27 17:02:04 crc kubenswrapper[4954]: I1127 17:02:04.562465 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 27 17:02:04 crc kubenswrapper[4954]: I1127 17:02:04.562750 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1817b937-0d8c-4409-b368-bbeb9482446a" containerName="nova-api-log" containerID="cri-o://6a9403d61bcd8b3dc18e2526fbb06088d65d4e29e63d9c5274f8626513548ed0" gracePeriod=30 Nov 27 17:02:04 crc kubenswrapper[4954]: I1127 17:02:04.563172 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1817b937-0d8c-4409-b368-bbeb9482446a" containerName="nova-api-api" containerID="cri-o://e8566ffd34f602562600b8c86c0a06660adc9ddf8997a5f3ac7b3cd41941d16e" gracePeriod=30 Nov 27 17:02:04 crc kubenswrapper[4954]: I1127 17:02:04.583218 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 27 17:02:04 crc kubenswrapper[4954]: I1127 17:02:04.583428 4954 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-scheduler-0" podUID="442636ac-c001-4f18-8a37-d09c9b6e0dfe" containerName="nova-scheduler-scheduler" containerID="cri-o://1ab874324ed7415f3e971d83dd2a9b2248d12e611124c9066216673dc8103f45" gracePeriod=30 Nov 27 17:02:04 crc kubenswrapper[4954]: I1127 17:02:04.613244 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 17:02:04 crc kubenswrapper[4954]: I1127 17:02:04.613558 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="04878fbe-4975-407d-a44a-66f5e3e80916" containerName="nova-metadata-log" containerID="cri-o://2688ed95bd45521f2c1ce409bdd62c9ae7f04816eecd9f8f4d2da8ed9833ac77" gracePeriod=30 Nov 27 17:02:04 crc kubenswrapper[4954]: I1127 17:02:04.613598 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="04878fbe-4975-407d-a44a-66f5e3e80916" containerName="nova-metadata-metadata" containerID="cri-o://9c266269e3cfa03be866bf4d238174e49c8b1c4747ba5be695cb15cf05fafad3" gracePeriod=30 Nov 27 17:02:04 crc kubenswrapper[4954]: I1127 17:02:04.673743 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622" path="/var/lib/kubelet/pods/6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622/volumes" Nov 27 17:02:04 crc kubenswrapper[4954]: I1127 17:02:04.743693 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 27 17:02:04 crc kubenswrapper[4954]: I1127 17:02:04.744024 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 27 17:02:05 crc kubenswrapper[4954]: E1127 17:02:05.011250 4954 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04878fbe_4975_407d_a44a_66f5e3e80916.slice/crio-conmon-9c266269e3cfa03be866bf4d238174e49c8b1c4747ba5be695cb15cf05fafad3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0db2964c_faef_4154_b502_1231f6762e37.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04878fbe_4975_407d_a44a_66f5e3e80916.slice/crio-9c266269e3cfa03be866bf4d238174e49c8b1c4747ba5be695cb15cf05fafad3.scope\": RecentStats: unable to find data in memory cache]" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.235161 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.304717 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/04878fbe-4975-407d-a44a-66f5e3e80916-nova-metadata-tls-certs\") pod \"04878fbe-4975-407d-a44a-66f5e3e80916\" (UID: \"04878fbe-4975-407d-a44a-66f5e3e80916\") " Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.304826 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4h7l\" (UniqueName: \"kubernetes.io/projected/04878fbe-4975-407d-a44a-66f5e3e80916-kube-api-access-l4h7l\") pod \"04878fbe-4975-407d-a44a-66f5e3e80916\" (UID: \"04878fbe-4975-407d-a44a-66f5e3e80916\") " Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.304856 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04878fbe-4975-407d-a44a-66f5e3e80916-combined-ca-bundle\") pod \"04878fbe-4975-407d-a44a-66f5e3e80916\" (UID: \"04878fbe-4975-407d-a44a-66f5e3e80916\") " Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.304993 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04878fbe-4975-407d-a44a-66f5e3e80916-config-data\") pod \"04878fbe-4975-407d-a44a-66f5e3e80916\" (UID: \"04878fbe-4975-407d-a44a-66f5e3e80916\") " Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.305031 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04878fbe-4975-407d-a44a-66f5e3e80916-logs\") pod \"04878fbe-4975-407d-a44a-66f5e3e80916\" (UID: \"04878fbe-4975-407d-a44a-66f5e3e80916\") " Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.305723 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04878fbe-4975-407d-a44a-66f5e3e80916-logs" (OuterVolumeSpecName: "logs") pod "04878fbe-4975-407d-a44a-66f5e3e80916" (UID: "04878fbe-4975-407d-a44a-66f5e3e80916"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.313725 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04878fbe-4975-407d-a44a-66f5e3e80916-kube-api-access-l4h7l" (OuterVolumeSpecName: "kube-api-access-l4h7l") pod "04878fbe-4975-407d-a44a-66f5e3e80916" (UID: "04878fbe-4975-407d-a44a-66f5e3e80916"). InnerVolumeSpecName "kube-api-access-l4h7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.355778 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04878fbe-4975-407d-a44a-66f5e3e80916-config-data" (OuterVolumeSpecName: "config-data") pod "04878fbe-4975-407d-a44a-66f5e3e80916" (UID: "04878fbe-4975-407d-a44a-66f5e3e80916"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.387217 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04878fbe-4975-407d-a44a-66f5e3e80916-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "04878fbe-4975-407d-a44a-66f5e3e80916" (UID: "04878fbe-4975-407d-a44a-66f5e3e80916"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.389791 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04878fbe-4975-407d-a44a-66f5e3e80916-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04878fbe-4975-407d-a44a-66f5e3e80916" (UID: "04878fbe-4975-407d-a44a-66f5e3e80916"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.407222 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04878fbe-4975-407d-a44a-66f5e3e80916-logs\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.407256 4954 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/04878fbe-4975-407d-a44a-66f5e3e80916-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.407267 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4h7l\" (UniqueName: \"kubernetes.io/projected/04878fbe-4975-407d-a44a-66f5e3e80916-kube-api-access-l4h7l\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.407278 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04878fbe-4975-407d-a44a-66f5e3e80916-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.407291 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04878fbe-4975-407d-a44a-66f5e3e80916-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.442817 4954 generic.go:334] "Generic (PLEG): container finished" podID="04878fbe-4975-407d-a44a-66f5e3e80916" containerID="9c266269e3cfa03be866bf4d238174e49c8b1c4747ba5be695cb15cf05fafad3" exitCode=0 Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.442860 4954 generic.go:334] "Generic (PLEG): container finished" podID="04878fbe-4975-407d-a44a-66f5e3e80916" containerID="2688ed95bd45521f2c1ce409bdd62c9ae7f04816eecd9f8f4d2da8ed9833ac77" exitCode=143 Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.442865 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.442922 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"04878fbe-4975-407d-a44a-66f5e3e80916","Type":"ContainerDied","Data":"9c266269e3cfa03be866bf4d238174e49c8b1c4747ba5be695cb15cf05fafad3"} Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.442987 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"04878fbe-4975-407d-a44a-66f5e3e80916","Type":"ContainerDied","Data":"2688ed95bd45521f2c1ce409bdd62c9ae7f04816eecd9f8f4d2da8ed9833ac77"} Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.442999 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"04878fbe-4975-407d-a44a-66f5e3e80916","Type":"ContainerDied","Data":"89e46598af4c8e9e64e15a502d47b426f4daecd4d515fe316b1e1cc075f9a82a"} Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.443055 4954 scope.go:117] "RemoveContainer" containerID="9c266269e3cfa03be866bf4d238174e49c8b1c4747ba5be695cb15cf05fafad3" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.445480 4954 generic.go:334] "Generic (PLEG): container finished" podID="1817b937-0d8c-4409-b368-bbeb9482446a" containerID="6a9403d61bcd8b3dc18e2526fbb06088d65d4e29e63d9c5274f8626513548ed0" exitCode=143 Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.445600 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1817b937-0d8c-4409-b368-bbeb9482446a","Type":"ContainerDied","Data":"6a9403d61bcd8b3dc18e2526fbb06088d65d4e29e63d9c5274f8626513548ed0"} Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.475515 4954 scope.go:117] "RemoveContainer" containerID="2688ed95bd45521f2c1ce409bdd62c9ae7f04816eecd9f8f4d2da8ed9833ac77" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.486318 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.509506 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.519047 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 27 17:02:05 crc kubenswrapper[4954]: E1127 17:02:05.519651 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04878fbe-4975-407d-a44a-66f5e3e80916" containerName="nova-metadata-metadata" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.519669 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="04878fbe-4975-407d-a44a-66f5e3e80916" containerName="nova-metadata-metadata" Nov 27 17:02:05 crc kubenswrapper[4954]: E1127 17:02:05.519699 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec7940bb-124f-4c0f-b9fd-471a32e4c3ef" containerName="nova-manage" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.519707 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec7940bb-124f-4c0f-b9fd-471a32e4c3ef" containerName="nova-manage" Nov 27 17:02:05 crc kubenswrapper[4954]: E1127 17:02:05.519725 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622" containerName="dnsmasq-dns" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.519732 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622" containerName="dnsmasq-dns" Nov 27 17:02:05 crc kubenswrapper[4954]: E1127 
17:02:05.519744 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622" containerName="init" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.519751 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622" containerName="init" Nov 27 17:02:05 crc kubenswrapper[4954]: E1127 17:02:05.519790 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04878fbe-4975-407d-a44a-66f5e3e80916" containerName="nova-metadata-log" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.519797 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="04878fbe-4975-407d-a44a-66f5e3e80916" containerName="nova-metadata-log" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.520023 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec7940bb-124f-4c0f-b9fd-471a32e4c3ef" containerName="nova-manage" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.520037 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="04878fbe-4975-407d-a44a-66f5e3e80916" containerName="nova-metadata-log" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.520049 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="04878fbe-4975-407d-a44a-66f5e3e80916" containerName="nova-metadata-metadata" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.520070 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622" containerName="dnsmasq-dns" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.521455 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.524512 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.524926 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.525126 4954 scope.go:117] "RemoveContainer" containerID="9c266269e3cfa03be866bf4d238174e49c8b1c4747ba5be695cb15cf05fafad3" Nov 27 17:02:05 crc kubenswrapper[4954]: E1127 17:02:05.532467 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c266269e3cfa03be866bf4d238174e49c8b1c4747ba5be695cb15cf05fafad3\": container with ID starting with 9c266269e3cfa03be866bf4d238174e49c8b1c4747ba5be695cb15cf05fafad3 not found: ID does not exist" containerID="9c266269e3cfa03be866bf4d238174e49c8b1c4747ba5be695cb15cf05fafad3" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.532510 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c266269e3cfa03be866bf4d238174e49c8b1c4747ba5be695cb15cf05fafad3"} err="failed to get container status \"9c266269e3cfa03be866bf4d238174e49c8b1c4747ba5be695cb15cf05fafad3\": rpc error: code = NotFound desc = could not find container \"9c266269e3cfa03be866bf4d238174e49c8b1c4747ba5be695cb15cf05fafad3\": container with ID starting with 9c266269e3cfa03be866bf4d238174e49c8b1c4747ba5be695cb15cf05fafad3 not found: ID does not exist" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.532536 4954 scope.go:117] "RemoveContainer" containerID="2688ed95bd45521f2c1ce409bdd62c9ae7f04816eecd9f8f4d2da8ed9833ac77" Nov 27 17:02:05 crc kubenswrapper[4954]: E1127 17:02:05.534483 4954 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2688ed95bd45521f2c1ce409bdd62c9ae7f04816eecd9f8f4d2da8ed9833ac77\": container with ID starting with 2688ed95bd45521f2c1ce409bdd62c9ae7f04816eecd9f8f4d2da8ed9833ac77 not found: ID does not exist" containerID="2688ed95bd45521f2c1ce409bdd62c9ae7f04816eecd9f8f4d2da8ed9833ac77" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.534530 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2688ed95bd45521f2c1ce409bdd62c9ae7f04816eecd9f8f4d2da8ed9833ac77"} err="failed to get container status \"2688ed95bd45521f2c1ce409bdd62c9ae7f04816eecd9f8f4d2da8ed9833ac77\": rpc error: code = NotFound desc = could not find container \"2688ed95bd45521f2c1ce409bdd62c9ae7f04816eecd9f8f4d2da8ed9833ac77\": container with ID starting with 2688ed95bd45521f2c1ce409bdd62c9ae7f04816eecd9f8f4d2da8ed9833ac77 not found: ID does not exist" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.534560 4954 scope.go:117] "RemoveContainer" containerID="9c266269e3cfa03be866bf4d238174e49c8b1c4747ba5be695cb15cf05fafad3" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.535298 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c266269e3cfa03be866bf4d238174e49c8b1c4747ba5be695cb15cf05fafad3"} err="failed to get container status \"9c266269e3cfa03be866bf4d238174e49c8b1c4747ba5be695cb15cf05fafad3\": rpc error: code = NotFound desc = could not find container \"9c266269e3cfa03be866bf4d238174e49c8b1c4747ba5be695cb15cf05fafad3\": container with ID starting with 9c266269e3cfa03be866bf4d238174e49c8b1c4747ba5be695cb15cf05fafad3 not found: ID does not exist" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.535321 4954 scope.go:117] "RemoveContainer" containerID="2688ed95bd45521f2c1ce409bdd62c9ae7f04816eecd9f8f4d2da8ed9833ac77" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.537558 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2688ed95bd45521f2c1ce409bdd62c9ae7f04816eecd9f8f4d2da8ed9833ac77"} err="failed to get container status \"2688ed95bd45521f2c1ce409bdd62c9ae7f04816eecd9f8f4d2da8ed9833ac77\": rpc error: code = NotFound desc = could not find container \"2688ed95bd45521f2c1ce409bdd62c9ae7f04816eecd9f8f4d2da8ed9833ac77\": container with ID starting with 2688ed95bd45521f2c1ce409bdd62c9ae7f04816eecd9f8f4d2da8ed9833ac77 not found: ID does not exist" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.539708 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.612544 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7n8m\" (UniqueName: \"kubernetes.io/projected/10a515f1-708a-4b0a-83ed-d28323eabe4a-kube-api-access-k7n8m\") pod \"nova-metadata-0\" (UID: \"10a515f1-708a-4b0a-83ed-d28323eabe4a\") " pod="openstack/nova-metadata-0" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.612651 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10a515f1-708a-4b0a-83ed-d28323eabe4a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"10a515f1-708a-4b0a-83ed-d28323eabe4a\") " pod="openstack/nova-metadata-0" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.612689 4954 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/10a515f1-708a-4b0a-83ed-d28323eabe4a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"10a515f1-708a-4b0a-83ed-d28323eabe4a\") " pod="openstack/nova-metadata-0" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.612865 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10a515f1-708a-4b0a-83ed-d28323eabe4a-logs\") pod \"nova-metadata-0\" (UID: \"10a515f1-708a-4b0a-83ed-d28323eabe4a\") " pod="openstack/nova-metadata-0" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.612902 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10a515f1-708a-4b0a-83ed-d28323eabe4a-config-data\") pod \"nova-metadata-0\" (UID: \"10a515f1-708a-4b0a-83ed-d28323eabe4a\") " pod="openstack/nova-metadata-0" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.717548 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10a515f1-708a-4b0a-83ed-d28323eabe4a-logs\") pod \"nova-metadata-0\" (UID: \"10a515f1-708a-4b0a-83ed-d28323eabe4a\") " pod="openstack/nova-metadata-0" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.717652 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10a515f1-708a-4b0a-83ed-d28323eabe4a-config-data\") pod \"nova-metadata-0\" (UID: \"10a515f1-708a-4b0a-83ed-d28323eabe4a\") " pod="openstack/nova-metadata-0" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.717985 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10a515f1-708a-4b0a-83ed-d28323eabe4a-logs\") pod \"nova-metadata-0\" (UID: \"10a515f1-708a-4b0a-83ed-d28323eabe4a\") " pod="openstack/nova-metadata-0" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.718742 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7n8m\" (UniqueName: \"kubernetes.io/projected/10a515f1-708a-4b0a-83ed-d28323eabe4a-kube-api-access-k7n8m\") pod \"nova-metadata-0\" (UID: \"10a515f1-708a-4b0a-83ed-d28323eabe4a\") " pod="openstack/nova-metadata-0" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.718861 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10a515f1-708a-4b0a-83ed-d28323eabe4a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"10a515f1-708a-4b0a-83ed-d28323eabe4a\") " pod="openstack/nova-metadata-0" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.718881 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/10a515f1-708a-4b0a-83ed-d28323eabe4a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"10a515f1-708a-4b0a-83ed-d28323eabe4a\") " pod="openstack/nova-metadata-0" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.724530 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/10a515f1-708a-4b0a-83ed-d28323eabe4a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"10a515f1-708a-4b0a-83ed-d28323eabe4a\") " 
pod="openstack/nova-metadata-0" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.725255 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10a515f1-708a-4b0a-83ed-d28323eabe4a-config-data\") pod \"nova-metadata-0\" (UID: \"10a515f1-708a-4b0a-83ed-d28323eabe4a\") " pod="openstack/nova-metadata-0" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.725917 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10a515f1-708a-4b0a-83ed-d28323eabe4a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"10a515f1-708a-4b0a-83ed-d28323eabe4a\") " pod="openstack/nova-metadata-0" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.754331 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7n8m\" (UniqueName: \"kubernetes.io/projected/10a515f1-708a-4b0a-83ed-d28323eabe4a-kube-api-access-k7n8m\") pod \"nova-metadata-0\" (UID: \"10a515f1-708a-4b0a-83ed-d28323eabe4a\") " pod="openstack/nova-metadata-0" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.770234 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qxwbp" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.820168 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/167e2351-bc28-488d-86be-a3d038476c57-scripts\") pod \"167e2351-bc28-488d-86be-a3d038476c57\" (UID: \"167e2351-bc28-488d-86be-a3d038476c57\") " Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.820220 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/167e2351-bc28-488d-86be-a3d038476c57-config-data\") pod \"167e2351-bc28-488d-86be-a3d038476c57\" (UID: \"167e2351-bc28-488d-86be-a3d038476c57\") " Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.820278 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/167e2351-bc28-488d-86be-a3d038476c57-combined-ca-bundle\") pod \"167e2351-bc28-488d-86be-a3d038476c57\" (UID: \"167e2351-bc28-488d-86be-a3d038476c57\") " Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.820351 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5s6h\" (UniqueName: \"kubernetes.io/projected/167e2351-bc28-488d-86be-a3d038476c57-kube-api-access-s5s6h\") pod \"167e2351-bc28-488d-86be-a3d038476c57\" (UID: \"167e2351-bc28-488d-86be-a3d038476c57\") " Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.828079 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/167e2351-bc28-488d-86be-a3d038476c57-kube-api-access-s5s6h" (OuterVolumeSpecName: "kube-api-access-s5s6h") pod "167e2351-bc28-488d-86be-a3d038476c57" (UID: "167e2351-bc28-488d-86be-a3d038476c57"). InnerVolumeSpecName "kube-api-access-s5s6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.830969 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/167e2351-bc28-488d-86be-a3d038476c57-scripts" (OuterVolumeSpecName: "scripts") pod "167e2351-bc28-488d-86be-a3d038476c57" (UID: "167e2351-bc28-488d-86be-a3d038476c57"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.847943 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.884198 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/167e2351-bc28-488d-86be-a3d038476c57-config-data" (OuterVolumeSpecName: "config-data") pod "167e2351-bc28-488d-86be-a3d038476c57" (UID: "167e2351-bc28-488d-86be-a3d038476c57"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.888029 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/167e2351-bc28-488d-86be-a3d038476c57-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "167e2351-bc28-488d-86be-a3d038476c57" (UID: "167e2351-bc28-488d-86be-a3d038476c57"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.923827 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/167e2351-bc28-488d-86be-a3d038476c57-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.924133 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5s6h\" (UniqueName: \"kubernetes.io/projected/167e2351-bc28-488d-86be-a3d038476c57-kube-api-access-s5s6h\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.924146 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/167e2351-bc28-488d-86be-a3d038476c57-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:05 crc kubenswrapper[4954]: I1127 17:02:05.924156 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/167e2351-bc28-488d-86be-a3d038476c57-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:06 crc kubenswrapper[4954]: I1127 17:02:06.302523 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 17:02:06 crc kubenswrapper[4954]: W1127 17:02:06.314118 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10a515f1_708a_4b0a_83ed_d28323eabe4a.slice/crio-7c0a7a6445a91e76099cc3ec605a6ae7927f60805cd58c359528630d6d559878 WatchSource:0}: Error finding container 7c0a7a6445a91e76099cc3ec605a6ae7927f60805cd58c359528630d6d559878: Status 404 returned error can't find the container with id 7c0a7a6445a91e76099cc3ec605a6ae7927f60805cd58c359528630d6d559878 Nov 27 17:02:06 crc kubenswrapper[4954]: I1127 17:02:06.463289 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"10a515f1-708a-4b0a-83ed-d28323eabe4a","Type":"ContainerStarted","Data":"7c0a7a6445a91e76099cc3ec605a6ae7927f60805cd58c359528630d6d559878"} Nov 27 17:02:06 crc kubenswrapper[4954]: I1127 17:02:06.466209 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qxwbp" event={"ID":"167e2351-bc28-488d-86be-a3d038476c57","Type":"ContainerDied","Data":"a21063e91649df8b0210e1ca397346b98bb39bd7376c58fc62efc1f0285a296f"} Nov 27 17:02:06 crc kubenswrapper[4954]: I1127 17:02:06.466244 4954 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a21063e91649df8b0210e1ca397346b98bb39bd7376c58fc62efc1f0285a296f" Nov 27 17:02:06 crc kubenswrapper[4954]: I1127 17:02:06.466313 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qxwbp" Nov 27 17:02:06 crc kubenswrapper[4954]: I1127 17:02:06.525569 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 27 17:02:06 crc kubenswrapper[4954]: I1127 17:02:06.525813 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="c6a94711-3e04-42e3-9ec3-6487f0dd3a3f" containerName="kube-state-metrics" containerID="cri-o://e2ad6e0434745a4a771fb2b29503ecf1330028f7e41696c028b955ac584e2a23" gracePeriod=30 Nov 27 17:02:06 crc kubenswrapper[4954]: I1127 17:02:06.578418 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 27 17:02:06 crc kubenswrapper[4954]: E1127 17:02:06.578864 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="167e2351-bc28-488d-86be-a3d038476c57" containerName="nova-cell1-conductor-db-sync" Nov 27 17:02:06 crc kubenswrapper[4954]: I1127 17:02:06.578884 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="167e2351-bc28-488d-86be-a3d038476c57" containerName="nova-cell1-conductor-db-sync" Nov 27 17:02:06 crc kubenswrapper[4954]: I1127 17:02:06.579097 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="167e2351-bc28-488d-86be-a3d038476c57" containerName="nova-cell1-conductor-db-sync" Nov 27 17:02:06 crc kubenswrapper[4954]: I1127 17:02:06.579873 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 27 17:02:06 crc kubenswrapper[4954]: I1127 17:02:06.591438 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Nov 27 17:02:06 crc kubenswrapper[4954]: I1127 17:02:06.613050 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 27 17:02:06 crc kubenswrapper[4954]: I1127 17:02:06.642020 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbz27\" (UniqueName: \"kubernetes.io/projected/025a86e8-034b-4eef-8f20-14141598f0b4-kube-api-access-sbz27\") pod \"nova-cell1-conductor-0\" (UID: \"025a86e8-034b-4eef-8f20-14141598f0b4\") " pod="openstack/nova-cell1-conductor-0" Nov 27 17:02:06 crc kubenswrapper[4954]: I1127 17:02:06.642180 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/025a86e8-034b-4eef-8f20-14141598f0b4-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"025a86e8-034b-4eef-8f20-14141598f0b4\") " pod="openstack/nova-cell1-conductor-0" Nov 27 17:02:06 crc kubenswrapper[4954]: I1127 17:02:06.642216 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/025a86e8-034b-4eef-8f20-14141598f0b4-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"025a86e8-034b-4eef-8f20-14141598f0b4\") " pod="openstack/nova-cell1-conductor-0" Nov 27 17:02:06 crc kubenswrapper[4954]: I1127 17:02:06.673201 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04878fbe-4975-407d-a44a-66f5e3e80916" 
path="/var/lib/kubelet/pods/04878fbe-4975-407d-a44a-66f5e3e80916/volumes" Nov 27 17:02:06 crc kubenswrapper[4954]: I1127 17:02:06.745557 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/025a86e8-034b-4eef-8f20-14141598f0b4-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"025a86e8-034b-4eef-8f20-14141598f0b4\") " pod="openstack/nova-cell1-conductor-0" Nov 27 17:02:06 crc kubenswrapper[4954]: I1127 17:02:06.745650 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/025a86e8-034b-4eef-8f20-14141598f0b4-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"025a86e8-034b-4eef-8f20-14141598f0b4\") " pod="openstack/nova-cell1-conductor-0" Nov 27 17:02:06 crc kubenswrapper[4954]: I1127 17:02:06.745720 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbz27\" (UniqueName: \"kubernetes.io/projected/025a86e8-034b-4eef-8f20-14141598f0b4-kube-api-access-sbz27\") pod \"nova-cell1-conductor-0\" (UID: \"025a86e8-034b-4eef-8f20-14141598f0b4\") " pod="openstack/nova-cell1-conductor-0" Nov 27 17:02:06 crc kubenswrapper[4954]: I1127 17:02:06.756785 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/025a86e8-034b-4eef-8f20-14141598f0b4-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"025a86e8-034b-4eef-8f20-14141598f0b4\") " pod="openstack/nova-cell1-conductor-0" Nov 27 17:02:06 crc kubenswrapper[4954]: I1127 17:02:06.763407 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/025a86e8-034b-4eef-8f20-14141598f0b4-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"025a86e8-034b-4eef-8f20-14141598f0b4\") " pod="openstack/nova-cell1-conductor-0" Nov 27 17:02:06 crc kubenswrapper[4954]: I1127 17:02:06.771078 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbz27\" (UniqueName: \"kubernetes.io/projected/025a86e8-034b-4eef-8f20-14141598f0b4-kube-api-access-sbz27\") pod \"nova-cell1-conductor-0\" (UID: \"025a86e8-034b-4eef-8f20-14141598f0b4\") " pod="openstack/nova-cell1-conductor-0" Nov 27 17:02:06 crc kubenswrapper[4954]: E1127 17:02:06.893096 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1ab874324ed7415f3e971d83dd2a9b2248d12e611124c9066216673dc8103f45" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 27 17:02:06 crc kubenswrapper[4954]: E1127 17:02:06.895093 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1ab874324ed7415f3e971d83dd2a9b2248d12e611124c9066216673dc8103f45" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 27 17:02:06 crc kubenswrapper[4954]: E1127 17:02:06.896844 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1ab874324ed7415f3e971d83dd2a9b2248d12e611124c9066216673dc8103f45" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 27 17:02:06 crc kubenswrapper[4954]: E1127 
17:02:06.896877 4954 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="442636ac-c001-4f18-8a37-d09c9b6e0dfe" containerName="nova-scheduler-scheduler" Nov 27 17:02:06 crc kubenswrapper[4954]: I1127 17:02:06.933698 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 27 17:02:07 crc kubenswrapper[4954]: I1127 17:02:07.022436 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 27 17:02:07 crc kubenswrapper[4954]: I1127 17:02:07.051891 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whxr8\" (UniqueName: \"kubernetes.io/projected/c6a94711-3e04-42e3-9ec3-6487f0dd3a3f-kube-api-access-whxr8\") pod \"c6a94711-3e04-42e3-9ec3-6487f0dd3a3f\" (UID: \"c6a94711-3e04-42e3-9ec3-6487f0dd3a3f\") " Nov 27 17:02:07 crc kubenswrapper[4954]: I1127 17:02:07.057057 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6a94711-3e04-42e3-9ec3-6487f0dd3a3f-kube-api-access-whxr8" (OuterVolumeSpecName: "kube-api-access-whxr8") pod "c6a94711-3e04-42e3-9ec3-6487f0dd3a3f" (UID: "c6a94711-3e04-42e3-9ec3-6487f0dd3a3f"). InnerVolumeSpecName "kube-api-access-whxr8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:02:07 crc kubenswrapper[4954]: I1127 17:02:07.154410 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whxr8\" (UniqueName: \"kubernetes.io/projected/c6a94711-3e04-42e3-9ec3-6487f0dd3a3f-kube-api-access-whxr8\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:07 crc kubenswrapper[4954]: I1127 17:02:07.239769 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6bb4fc677f-bp5cb" podUID="6cbaf412-0cf5-4f12-9c3e-ec1a6fe20622" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.161:5353: i/o timeout" Nov 27 17:02:07 crc kubenswrapper[4954]: I1127 17:02:07.476626 4954 generic.go:334] "Generic (PLEG): container finished" podID="c6a94711-3e04-42e3-9ec3-6487f0dd3a3f" containerID="e2ad6e0434745a4a771fb2b29503ecf1330028f7e41696c028b955ac584e2a23" exitCode=2 Nov 27 17:02:07 crc kubenswrapper[4954]: I1127 17:02:07.476731 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 27 17:02:07 crc kubenswrapper[4954]: I1127 17:02:07.477359 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c6a94711-3e04-42e3-9ec3-6487f0dd3a3f","Type":"ContainerDied","Data":"e2ad6e0434745a4a771fb2b29503ecf1330028f7e41696c028b955ac584e2a23"} Nov 27 17:02:07 crc kubenswrapper[4954]: I1127 17:02:07.477404 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c6a94711-3e04-42e3-9ec3-6487f0dd3a3f","Type":"ContainerDied","Data":"a4ba405ace0565e8e77cb3955b4a2802005675ec9645d80dee7c6897b58bd55e"} Nov 27 17:02:07 crc kubenswrapper[4954]: I1127 17:02:07.477421 4954 scope.go:117] "RemoveContainer" containerID="e2ad6e0434745a4a771fb2b29503ecf1330028f7e41696c028b955ac584e2a23" Nov 27 17:02:07 crc kubenswrapper[4954]: I1127 17:02:07.479306 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"10a515f1-708a-4b0a-83ed-d28323eabe4a","Type":"ContainerStarted","Data":"415c47aa6f4e49b61e0b32010bf7b9755a2091da540359b129b80c86d87cd84f"} Nov 27 17:02:07 crc kubenswrapper[4954]: I1127 17:02:07.479323 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"10a515f1-708a-4b0a-83ed-d28323eabe4a","Type":"ContainerStarted","Data":"73fd6abdee5c7cffa71a8fb7fd0ea78442c71ffdb8d8db8ce01475a93b31424c"} Nov 27 17:02:07 crc kubenswrapper[4954]: I1127 17:02:07.503432 4954 scope.go:117] "RemoveContainer" containerID="e2ad6e0434745a4a771fb2b29503ecf1330028f7e41696c028b955ac584e2a23" Nov 27 17:02:07 crc kubenswrapper[4954]: E1127 17:02:07.504530 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2ad6e0434745a4a771fb2b29503ecf1330028f7e41696c028b955ac584e2a23\": container with ID starting with e2ad6e0434745a4a771fb2b29503ecf1330028f7e41696c028b955ac584e2a23 not found: ID does not exist" containerID="e2ad6e0434745a4a771fb2b29503ecf1330028f7e41696c028b955ac584e2a23" Nov 27 17:02:07 crc kubenswrapper[4954]: I1127 17:02:07.504591 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2ad6e0434745a4a771fb2b29503ecf1330028f7e41696c028b955ac584e2a23"} err="failed to get container status \"e2ad6e0434745a4a771fb2b29503ecf1330028f7e41696c028b955ac584e2a23\": rpc error: code = NotFound desc = could not find container \"e2ad6e0434745a4a771fb2b29503ecf1330028f7e41696c028b955ac584e2a23\": container with ID starting with e2ad6e0434745a4a771fb2b29503ecf1330028f7e41696c028b955ac584e2a23 not found: ID does not exist" Nov 27 17:02:07 crc kubenswrapper[4954]: I1127 17:02:07.520196 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.520173785 podStartE2EDuration="2.520173785s" podCreationTimestamp="2025-11-27 17:02:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 17:02:07.511090584 +0000 UTC m=+1439.528530894" watchObservedRunningTime="2025-11-27 17:02:07.520173785 +0000 UTC m=+1439.537614095" Nov 27 17:02:07 crc kubenswrapper[4954]: W1127 17:02:07.542460 4954 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod025a86e8_034b_4eef_8f20_14141598f0b4.slice/crio-b60aadc7080fe94aeb4de6088013b9513a8031f8061a899797237eb64466aa90 WatchSource:0}: Error finding container b60aadc7080fe94aeb4de6088013b9513a8031f8061a899797237eb64466aa90: Status 404 returned error can't find the container with id b60aadc7080fe94aeb4de6088013b9513a8031f8061a899797237eb64466aa90 Nov 27 17:02:07 crc kubenswrapper[4954]: I1127 17:02:07.546671 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 27 17:02:07 crc kubenswrapper[4954]: I1127 17:02:07.558773 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 27 17:02:07 crc kubenswrapper[4954]: I1127 17:02:07.572542 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 27 17:02:07 crc kubenswrapper[4954]: I1127 17:02:07.585257 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Nov 27 17:02:07 crc kubenswrapper[4954]: E1127 17:02:07.585692 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6a94711-3e04-42e3-9ec3-6487f0dd3a3f" containerName="kube-state-metrics" Nov 27 17:02:07 crc kubenswrapper[4954]: I1127 17:02:07.585708 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6a94711-3e04-42e3-9ec3-6487f0dd3a3f" containerName="kube-state-metrics" Nov 27 17:02:07 crc kubenswrapper[4954]: I1127 17:02:07.585915 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6a94711-3e04-42e3-9ec3-6487f0dd3a3f" containerName="kube-state-metrics" Nov 27 17:02:07 crc kubenswrapper[4954]: I1127 17:02:07.586685 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 27 17:02:07 crc kubenswrapper[4954]: I1127 17:02:07.589014 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Nov 27 17:02:07 crc kubenswrapper[4954]: I1127 17:02:07.589649 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Nov 27 17:02:07 crc kubenswrapper[4954]: I1127 17:02:07.593214 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 27 17:02:07 crc kubenswrapper[4954]: I1127 17:02:07.663260 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ba0b816-c965-4474-b923-73f572cdc1ab-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1ba0b816-c965-4474-b923-73f572cdc1ab\") " pod="openstack/kube-state-metrics-0" Nov 27 17:02:07 crc kubenswrapper[4954]: I1127 17:02:07.663321 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqsll\" (UniqueName: \"kubernetes.io/projected/1ba0b816-c965-4474-b923-73f572cdc1ab-kube-api-access-hqsll\") pod \"kube-state-metrics-0\" (UID: \"1ba0b816-c965-4474-b923-73f572cdc1ab\") " pod="openstack/kube-state-metrics-0" Nov 27 17:02:07 crc kubenswrapper[4954]: I1127 17:02:07.663532 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ba0b816-c965-4474-b923-73f572cdc1ab-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1ba0b816-c965-4474-b923-73f572cdc1ab\") " pod="openstack/kube-state-metrics-0" Nov 27 
17:02:07 crc kubenswrapper[4954]: I1127 17:02:07.663792 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1ba0b816-c965-4474-b923-73f572cdc1ab-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1ba0b816-c965-4474-b923-73f572cdc1ab\") " pod="openstack/kube-state-metrics-0" Nov 27 17:02:07 crc kubenswrapper[4954]: I1127 17:02:07.766596 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ba0b816-c965-4474-b923-73f572cdc1ab-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1ba0b816-c965-4474-b923-73f572cdc1ab\") " pod="openstack/kube-state-metrics-0" Nov 27 17:02:07 crc kubenswrapper[4954]: I1127 17:02:07.766688 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1ba0b816-c965-4474-b923-73f572cdc1ab-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1ba0b816-c965-4474-b923-73f572cdc1ab\") " pod="openstack/kube-state-metrics-0" Nov 27 17:02:07 crc kubenswrapper[4954]: I1127 17:02:07.766755 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ba0b816-c965-4474-b923-73f572cdc1ab-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1ba0b816-c965-4474-b923-73f572cdc1ab\") " pod="openstack/kube-state-metrics-0" Nov 27 17:02:07 crc kubenswrapper[4954]: I1127 17:02:07.766782 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqsll\" (UniqueName: \"kubernetes.io/projected/1ba0b816-c965-4474-b923-73f572cdc1ab-kube-api-access-hqsll\") pod \"kube-state-metrics-0\" (UID: \"1ba0b816-c965-4474-b923-73f572cdc1ab\") " pod="openstack/kube-state-metrics-0" Nov 27 17:02:07 crc kubenswrapper[4954]: I1127 17:02:07.771851 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1ba0b816-c965-4474-b923-73f572cdc1ab-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1ba0b816-c965-4474-b923-73f572cdc1ab\") " pod="openstack/kube-state-metrics-0" Nov 27 17:02:07 crc kubenswrapper[4954]: I1127 17:02:07.772610 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ba0b816-c965-4474-b923-73f572cdc1ab-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1ba0b816-c965-4474-b923-73f572cdc1ab\") " pod="openstack/kube-state-metrics-0" Nov 27 17:02:07 crc kubenswrapper[4954]: I1127 17:02:07.781259 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ba0b816-c965-4474-b923-73f572cdc1ab-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1ba0b816-c965-4474-b923-73f572cdc1ab\") " pod="openstack/kube-state-metrics-0" Nov 27 17:02:07 crc kubenswrapper[4954]: I1127 17:02:07.790886 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqsll\" (UniqueName: \"kubernetes.io/projected/1ba0b816-c965-4474-b923-73f572cdc1ab-kube-api-access-hqsll\") pod \"kube-state-metrics-0\" (UID: \"1ba0b816-c965-4474-b923-73f572cdc1ab\") " pod="openstack/kube-state-metrics-0" Nov 27 17:02:08 crc kubenswrapper[4954]: I1127 
17:02:08.014874 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 27 17:02:08 crc kubenswrapper[4954]: I1127 17:02:08.491672 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"025a86e8-034b-4eef-8f20-14141598f0b4","Type":"ContainerStarted","Data":"540f03d0a47431f0faa9a770eb342e270a6cb98f3b8f4ccb7014507221b9650d"} Nov 27 17:02:08 crc kubenswrapper[4954]: I1127 17:02:08.492090 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"025a86e8-034b-4eef-8f20-14141598f0b4","Type":"ContainerStarted","Data":"b60aadc7080fe94aeb4de6088013b9513a8031f8061a899797237eb64466aa90"} Nov 27 17:02:08 crc kubenswrapper[4954]: I1127 17:02:08.492162 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Nov 27 17:02:08 crc kubenswrapper[4954]: I1127 17:02:08.493728 4954 generic.go:334] "Generic (PLEG): container finished" podID="1817b937-0d8c-4409-b368-bbeb9482446a" containerID="e8566ffd34f602562600b8c86c0a06660adc9ddf8997a5f3ac7b3cd41941d16e" exitCode=0 Nov 27 17:02:08 crc kubenswrapper[4954]: I1127 17:02:08.493812 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1817b937-0d8c-4409-b368-bbeb9482446a","Type":"ContainerDied","Data":"e8566ffd34f602562600b8c86c0a06660adc9ddf8997a5f3ac7b3cd41941d16e"} Nov 27 17:02:08 crc kubenswrapper[4954]: I1127 17:02:08.493856 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1817b937-0d8c-4409-b368-bbeb9482446a","Type":"ContainerDied","Data":"8e4a9cbf3411505c5e2b47393353f13c8782fad0a03b40922820e4b552bafc36"} Nov 27 17:02:08 crc kubenswrapper[4954]: I1127 17:02:08.493876 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e4a9cbf3411505c5e2b47393353f13c8782fad0a03b40922820e4b552bafc36" Nov 27 17:02:08 crc kubenswrapper[4954]: I1127 17:02:08.508903 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.508885463 podStartE2EDuration="2.508885463s" podCreationTimestamp="2025-11-27 17:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 17:02:08.507695744 +0000 UTC m=+1440.525136064" watchObservedRunningTime="2025-11-27 17:02:08.508885463 +0000 UTC m=+1440.526325763" Nov 27 17:02:08 crc kubenswrapper[4954]: W1127 17:02:08.545702 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ba0b816_c965_4474_b923_73f572cdc1ab.slice/crio-1a2909fc8a9cc177d9a306ca62b9657db2bb5ab2e2e68d855f673da89c6544ff WatchSource:0}: Error finding container 1a2909fc8a9cc177d9a306ca62b9657db2bb5ab2e2e68d855f673da89c6544ff: Status 404 returned error can't find the container with id 1a2909fc8a9cc177d9a306ca62b9657db2bb5ab2e2e68d855f673da89c6544ff Nov 27 17:02:08 crc kubenswrapper[4954]: I1127 17:02:08.549785 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 27 17:02:08 crc kubenswrapper[4954]: I1127 17:02:08.618926 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0"
Nov 27 17:02:08 crc kubenswrapper[4954]: I1127 17:02:08.689415 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzwtf\" (UniqueName: \"kubernetes.io/projected/1817b937-0d8c-4409-b368-bbeb9482446a-kube-api-access-qzwtf\") pod \"1817b937-0d8c-4409-b368-bbeb9482446a\" (UID: \"1817b937-0d8c-4409-b368-bbeb9482446a\") "
Nov 27 17:02:08 crc kubenswrapper[4954]: I1127 17:02:08.689969 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1817b937-0d8c-4409-b368-bbeb9482446a-combined-ca-bundle\") pod \"1817b937-0d8c-4409-b368-bbeb9482446a\" (UID: \"1817b937-0d8c-4409-b368-bbeb9482446a\") "
Nov 27 17:02:08 crc kubenswrapper[4954]: I1127 17:02:08.690210 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1817b937-0d8c-4409-b368-bbeb9482446a-config-data\") pod \"1817b937-0d8c-4409-b368-bbeb9482446a\" (UID: \"1817b937-0d8c-4409-b368-bbeb9482446a\") "
Nov 27 17:02:08 crc kubenswrapper[4954]: I1127 17:02:08.690291 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1817b937-0d8c-4409-b368-bbeb9482446a-logs\") pod \"1817b937-0d8c-4409-b368-bbeb9482446a\" (UID: \"1817b937-0d8c-4409-b368-bbeb9482446a\") "
Nov 27 17:02:08 crc kubenswrapper[4954]: I1127 17:02:08.692503 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1817b937-0d8c-4409-b368-bbeb9482446a-logs" (OuterVolumeSpecName: "logs") pod "1817b937-0d8c-4409-b368-bbeb9482446a" (UID: "1817b937-0d8c-4409-b368-bbeb9482446a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 27 17:02:08 crc kubenswrapper[4954]: I1127 17:02:08.693223 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6a94711-3e04-42e3-9ec3-6487f0dd3a3f" path="/var/lib/kubelet/pods/c6a94711-3e04-42e3-9ec3-6487f0dd3a3f/volumes"
Nov 27 17:02:08 crc kubenswrapper[4954]: I1127 17:02:08.705306 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1817b937-0d8c-4409-b368-bbeb9482446a-kube-api-access-qzwtf" (OuterVolumeSpecName: "kube-api-access-qzwtf") pod "1817b937-0d8c-4409-b368-bbeb9482446a" (UID: "1817b937-0d8c-4409-b368-bbeb9482446a"). InnerVolumeSpecName "kube-api-access-qzwtf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 27 17:02:08 crc kubenswrapper[4954]: I1127 17:02:08.731676 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1817b937-0d8c-4409-b368-bbeb9482446a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1817b937-0d8c-4409-b368-bbeb9482446a" (UID: "1817b937-0d8c-4409-b368-bbeb9482446a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 17:02:08 crc kubenswrapper[4954]: I1127 17:02:08.753886 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1817b937-0d8c-4409-b368-bbeb9482446a-config-data" (OuterVolumeSpecName: "config-data") pod "1817b937-0d8c-4409-b368-bbeb9482446a" (UID: "1817b937-0d8c-4409-b368-bbeb9482446a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 17:02:08 crc kubenswrapper[4954]: I1127 17:02:08.793964 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1817b937-0d8c-4409-b368-bbeb9482446a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 27 17:02:08 crc kubenswrapper[4954]: I1127 17:02:08.794017 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1817b937-0d8c-4409-b368-bbeb9482446a-config-data\") on node \"crc\" DevicePath \"\""
Nov 27 17:02:08 crc kubenswrapper[4954]: I1127 17:02:08.794032 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1817b937-0d8c-4409-b368-bbeb9482446a-logs\") on node \"crc\" DevicePath \"\""
Nov 27 17:02:08 crc kubenswrapper[4954]: I1127 17:02:08.794043 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzwtf\" (UniqueName: \"kubernetes.io/projected/1817b937-0d8c-4409-b368-bbeb9482446a-kube-api-access-qzwtf\") on node \"crc\" DevicePath \"\""
Nov 27 17:02:08 crc kubenswrapper[4954]: I1127 17:02:08.811615 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Nov 27 17:02:08 crc kubenswrapper[4954]: I1127 17:02:08.812032 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ad07ddc6-4615-4a60-a765-fc2313fb5d0b" containerName="sg-core" containerID="cri-o://1ab53fbb8e6c29dae625ebf7ba14e5864e381682f941b20ef2c0711095b0c431" gracePeriod=30
Nov 27 17:02:08 crc kubenswrapper[4954]: I1127 17:02:08.812194 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ad07ddc6-4615-4a60-a765-fc2313fb5d0b" containerName="proxy-httpd" containerID="cri-o://8836e5bedf8b1797dd44f63e5e33cea2d080d221a62a3a1fc13dcbaea04eb114" gracePeriod=30
Nov 27 17:02:08 crc kubenswrapper[4954]: I1127 17:02:08.812325 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ad07ddc6-4615-4a60-a765-fc2313fb5d0b" containerName="ceilometer-notification-agent" containerID="cri-o://be47cea0f1aca23af8d7fbe5530f58883bdd7b2ab14e08fcfff0a56ff9d0777d" gracePeriod=30
Nov 27 17:02:08 crc kubenswrapper[4954]: I1127 17:02:08.812459 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ad07ddc6-4615-4a60-a765-fc2313fb5d0b" containerName="ceilometer-central-agent" containerID="cri-o://dea43d3fc6e5b98beb77744cf290e06b1a578674499fd5e7457b8b341711d050" gracePeriod=30
Nov 27 17:02:09 crc kubenswrapper[4954]: I1127 17:02:09.518182 4954 generic.go:334] "Generic (PLEG): container finished" podID="ad07ddc6-4615-4a60-a765-fc2313fb5d0b" containerID="8836e5bedf8b1797dd44f63e5e33cea2d080d221a62a3a1fc13dcbaea04eb114" exitCode=0
Nov 27 17:02:09 crc kubenswrapper[4954]: I1127 17:02:09.518655 4954 generic.go:334] "Generic (PLEG): container finished" podID="ad07ddc6-4615-4a60-a765-fc2313fb5d0b" containerID="1ab53fbb8e6c29dae625ebf7ba14e5864e381682f941b20ef2c0711095b0c431" exitCode=2
Nov 27 17:02:09 crc kubenswrapper[4954]: I1127 17:02:09.518665 4954 generic.go:334] "Generic (PLEG): container finished" podID="ad07ddc6-4615-4a60-a765-fc2313fb5d0b" containerID="dea43d3fc6e5b98beb77744cf290e06b1a578674499fd5e7457b8b341711d050" exitCode=0
Nov 27 17:02:09 crc kubenswrapper[4954]: I1127 17:02:09.518274 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad07ddc6-4615-4a60-a765-fc2313fb5d0b","Type":"ContainerDied","Data":"8836e5bedf8b1797dd44f63e5e33cea2d080d221a62a3a1fc13dcbaea04eb114"}
Nov 27 17:02:09 crc kubenswrapper[4954]: I1127 17:02:09.518725 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad07ddc6-4615-4a60-a765-fc2313fb5d0b","Type":"ContainerDied","Data":"1ab53fbb8e6c29dae625ebf7ba14e5864e381682f941b20ef2c0711095b0c431"}
Nov 27 17:02:09 crc kubenswrapper[4954]: I1127 17:02:09.518740 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad07ddc6-4615-4a60-a765-fc2313fb5d0b","Type":"ContainerDied","Data":"dea43d3fc6e5b98beb77744cf290e06b1a578674499fd5e7457b8b341711d050"}
Nov 27 17:02:09 crc kubenswrapper[4954]: I1127 17:02:09.521118 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1ba0b816-c965-4474-b923-73f572cdc1ab","Type":"ContainerStarted","Data":"1a2909fc8a9cc177d9a306ca62b9657db2bb5ab2e2e68d855f673da89c6544ff"}
Nov 27 17:02:09 crc kubenswrapper[4954]: I1127 17:02:09.522419 4954 generic.go:334] "Generic (PLEG): container finished" podID="442636ac-c001-4f18-8a37-d09c9b6e0dfe" containerID="1ab874324ed7415f3e971d83dd2a9b2248d12e611124c9066216673dc8103f45" exitCode=0
Nov 27 17:02:09 crc kubenswrapper[4954]: I1127 17:02:09.522513 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 27 17:02:09 crc kubenswrapper[4954]: I1127 17:02:09.522501 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"442636ac-c001-4f18-8a37-d09c9b6e0dfe","Type":"ContainerDied","Data":"1ab874324ed7415f3e971d83dd2a9b2248d12e611124c9066216673dc8103f45"}
Nov 27 17:02:09 crc kubenswrapper[4954]: I1127 17:02:09.571237 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Nov 27 17:02:09 crc kubenswrapper[4954]: I1127 17:02:09.592400 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Nov 27 17:02:09 crc kubenswrapper[4954]: I1127 17:02:09.616811 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Nov 27 17:02:09 crc kubenswrapper[4954]: E1127 17:02:09.617236 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1817b937-0d8c-4409-b368-bbeb9482446a" containerName="nova-api-api"
Nov 27 17:02:09 crc kubenswrapper[4954]: I1127 17:02:09.617255 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="1817b937-0d8c-4409-b368-bbeb9482446a" containerName="nova-api-api"
Nov 27 17:02:09 crc kubenswrapper[4954]: E1127 17:02:09.617282 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1817b937-0d8c-4409-b368-bbeb9482446a" containerName="nova-api-log"
Nov 27 17:02:09 crc kubenswrapper[4954]: I1127 17:02:09.617289 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="1817b937-0d8c-4409-b368-bbeb9482446a" containerName="nova-api-log"
Nov 27 17:02:09 crc kubenswrapper[4954]: I1127 17:02:09.617486 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="1817b937-0d8c-4409-b368-bbeb9482446a" containerName="nova-api-log"
Nov 27 17:02:09 crc kubenswrapper[4954]: I1127 17:02:09.617506 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="1817b937-0d8c-4409-b368-bbeb9482446a" containerName="nova-api-api"
Nov 27 17:02:09 crc kubenswrapper[4954]: I1127 17:02:09.618452 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 27 17:02:09 crc kubenswrapper[4954]: I1127 17:02:09.620221 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Nov 27 17:02:09 crc kubenswrapper[4954]: I1127 17:02:09.631391 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Nov 27 17:02:09 crc kubenswrapper[4954]: I1127 17:02:09.693641 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Nov 27 17:02:09 crc kubenswrapper[4954]: I1127 17:02:09.712002 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cd9j\" (UniqueName: \"kubernetes.io/projected/70396bbd-83d0-430e-bb81-44ea354afb3e-kube-api-access-4cd9j\") pod \"nova-api-0\" (UID: \"70396bbd-83d0-430e-bb81-44ea354afb3e\") " pod="openstack/nova-api-0"
Nov 27 17:02:09 crc kubenswrapper[4954]: I1127 17:02:09.712102 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70396bbd-83d0-430e-bb81-44ea354afb3e-config-data\") pod \"nova-api-0\" (UID: \"70396bbd-83d0-430e-bb81-44ea354afb3e\") " pod="openstack/nova-api-0"
Nov 27 17:02:09 crc kubenswrapper[4954]: I1127 17:02:09.712131 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70396bbd-83d0-430e-bb81-44ea354afb3e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"70396bbd-83d0-430e-bb81-44ea354afb3e\") " pod="openstack/nova-api-0"
Nov 27 17:02:09 crc kubenswrapper[4954]: I1127 17:02:09.712255 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70396bbd-83d0-430e-bb81-44ea354afb3e-logs\") pod \"nova-api-0\" (UID: \"70396bbd-83d0-430e-bb81-44ea354afb3e\") " pod="openstack/nova-api-0"
Nov 27 17:02:09 crc kubenswrapper[4954]: I1127 17:02:09.813977 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rfj8\" (UniqueName: \"kubernetes.io/projected/442636ac-c001-4f18-8a37-d09c9b6e0dfe-kube-api-access-6rfj8\") pod \"442636ac-c001-4f18-8a37-d09c9b6e0dfe\" (UID: \"442636ac-c001-4f18-8a37-d09c9b6e0dfe\") "
Nov 27 17:02:09 crc kubenswrapper[4954]: I1127 17:02:09.814366 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/442636ac-c001-4f18-8a37-d09c9b6e0dfe-combined-ca-bundle\") pod \"442636ac-c001-4f18-8a37-d09c9b6e0dfe\" (UID: \"442636ac-c001-4f18-8a37-d09c9b6e0dfe\") "
Nov 27 17:02:09 crc kubenswrapper[4954]: I1127 17:02:09.814465 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/442636ac-c001-4f18-8a37-d09c9b6e0dfe-config-data\") pod \"442636ac-c001-4f18-8a37-d09c9b6e0dfe\" (UID: \"442636ac-c001-4f18-8a37-d09c9b6e0dfe\") "
Nov 27 17:02:09 crc kubenswrapper[4954]: I1127 17:02:09.814805 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70396bbd-83d0-430e-bb81-44ea354afb3e-config-data\") pod \"nova-api-0\" (UID: \"70396bbd-83d0-430e-bb81-44ea354afb3e\") " pod="openstack/nova-api-0"
Nov 27 17:02:09 crc kubenswrapper[4954]: I1127 17:02:09.814842 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70396bbd-83d0-430e-bb81-44ea354afb3e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"70396bbd-83d0-430e-bb81-44ea354afb3e\") " pod="openstack/nova-api-0"
Nov 27 17:02:09 crc kubenswrapper[4954]: I1127 17:02:09.814934 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70396bbd-83d0-430e-bb81-44ea354afb3e-logs\") pod \"nova-api-0\" (UID: \"70396bbd-83d0-430e-bb81-44ea354afb3e\") " pod="openstack/nova-api-0"
Nov 27 17:02:09 crc kubenswrapper[4954]: I1127 17:02:09.814971 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cd9j\" (UniqueName: \"kubernetes.io/projected/70396bbd-83d0-430e-bb81-44ea354afb3e-kube-api-access-4cd9j\") pod \"nova-api-0\" (UID: \"70396bbd-83d0-430e-bb81-44ea354afb3e\") " pod="openstack/nova-api-0"
Nov 27 17:02:09 crc kubenswrapper[4954]: I1127 17:02:09.817080 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70396bbd-83d0-430e-bb81-44ea354afb3e-logs\") pod \"nova-api-0\" (UID: \"70396bbd-83d0-430e-bb81-44ea354afb3e\") " pod="openstack/nova-api-0"
Nov 27 17:02:09 crc kubenswrapper[4954]: I1127 17:02:09.822284 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70396bbd-83d0-430e-bb81-44ea354afb3e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"70396bbd-83d0-430e-bb81-44ea354afb3e\") " pod="openstack/nova-api-0"
Nov 27 17:02:09 crc kubenswrapper[4954]: I1127 17:02:09.827665 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/442636ac-c001-4f18-8a37-d09c9b6e0dfe-kube-api-access-6rfj8" (OuterVolumeSpecName: "kube-api-access-6rfj8") pod "442636ac-c001-4f18-8a37-d09c9b6e0dfe" (UID: "442636ac-c001-4f18-8a37-d09c9b6e0dfe"). InnerVolumeSpecName "kube-api-access-6rfj8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 27 17:02:09 crc kubenswrapper[4954]: I1127 17:02:09.828471 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70396bbd-83d0-430e-bb81-44ea354afb3e-config-data\") pod \"nova-api-0\" (UID: \"70396bbd-83d0-430e-bb81-44ea354afb3e\") " pod="openstack/nova-api-0"
Nov 27 17:02:09 crc kubenswrapper[4954]: I1127 17:02:09.841478 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cd9j\" (UniqueName: \"kubernetes.io/projected/70396bbd-83d0-430e-bb81-44ea354afb3e-kube-api-access-4cd9j\") pod \"nova-api-0\" (UID: \"70396bbd-83d0-430e-bb81-44ea354afb3e\") " pod="openstack/nova-api-0"
Nov 27 17:02:09 crc kubenswrapper[4954]: I1127 17:02:09.851213 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/442636ac-c001-4f18-8a37-d09c9b6e0dfe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "442636ac-c001-4f18-8a37-d09c9b6e0dfe" (UID: "442636ac-c001-4f18-8a37-d09c9b6e0dfe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 17:02:09 crc kubenswrapper[4954]: I1127 17:02:09.851762 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/442636ac-c001-4f18-8a37-d09c9b6e0dfe-config-data" (OuterVolumeSpecName: "config-data") pod "442636ac-c001-4f18-8a37-d09c9b6e0dfe" (UID: "442636ac-c001-4f18-8a37-d09c9b6e0dfe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 17:02:09 crc kubenswrapper[4954]: I1127 17:02:09.920768 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/442636ac-c001-4f18-8a37-d09c9b6e0dfe-config-data\") on node \"crc\" DevicePath \"\""
Nov 27 17:02:09 crc kubenswrapper[4954]: I1127 17:02:09.920812 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rfj8\" (UniqueName: \"kubernetes.io/projected/442636ac-c001-4f18-8a37-d09c9b6e0dfe-kube-api-access-6rfj8\") on node \"crc\" DevicePath \"\""
Nov 27 17:02:09 crc kubenswrapper[4954]: I1127 17:02:09.920826 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/442636ac-c001-4f18-8a37-d09c9b6e0dfe-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.012104 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.205133 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.331338 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7vbg\" (UniqueName: \"kubernetes.io/projected/ad07ddc6-4615-4a60-a765-fc2313fb5d0b-kube-api-access-n7vbg\") pod \"ad07ddc6-4615-4a60-a765-fc2313fb5d0b\" (UID: \"ad07ddc6-4615-4a60-a765-fc2313fb5d0b\") "
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.331418 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad07ddc6-4615-4a60-a765-fc2313fb5d0b-combined-ca-bundle\") pod \"ad07ddc6-4615-4a60-a765-fc2313fb5d0b\" (UID: \"ad07ddc6-4615-4a60-a765-fc2313fb5d0b\") "
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.331508 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad07ddc6-4615-4a60-a765-fc2313fb5d0b-config-data\") pod \"ad07ddc6-4615-4a60-a765-fc2313fb5d0b\" (UID: \"ad07ddc6-4615-4a60-a765-fc2313fb5d0b\") "
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.331616 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad07ddc6-4615-4a60-a765-fc2313fb5d0b-run-httpd\") pod \"ad07ddc6-4615-4a60-a765-fc2313fb5d0b\" (UID: \"ad07ddc6-4615-4a60-a765-fc2313fb5d0b\") "
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.331647 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ad07ddc6-4615-4a60-a765-fc2313fb5d0b-sg-core-conf-yaml\") pod \"ad07ddc6-4615-4a60-a765-fc2313fb5d0b\" (UID: \"ad07ddc6-4615-4a60-a765-fc2313fb5d0b\") "
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.331675 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad07ddc6-4615-4a60-a765-fc2313fb5d0b-scripts\") pod \"ad07ddc6-4615-4a60-a765-fc2313fb5d0b\" (UID: \"ad07ddc6-4615-4a60-a765-fc2313fb5d0b\") "
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.331766 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad07ddc6-4615-4a60-a765-fc2313fb5d0b-log-httpd\") pod \"ad07ddc6-4615-4a60-a765-fc2313fb5d0b\" (UID: \"ad07ddc6-4615-4a60-a765-fc2313fb5d0b\") "
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.331988 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad07ddc6-4615-4a60-a765-fc2313fb5d0b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ad07ddc6-4615-4a60-a765-fc2313fb5d0b" (UID: "ad07ddc6-4615-4a60-a765-fc2313fb5d0b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.332360 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad07ddc6-4615-4a60-a765-fc2313fb5d0b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ad07ddc6-4615-4a60-a765-fc2313fb5d0b" (UID: "ad07ddc6-4615-4a60-a765-fc2313fb5d0b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.332411 4954 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad07ddc6-4615-4a60-a765-fc2313fb5d0b-run-httpd\") on node \"crc\" DevicePath \"\""
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.339560 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad07ddc6-4615-4a60-a765-fc2313fb5d0b-scripts" (OuterVolumeSpecName: "scripts") pod "ad07ddc6-4615-4a60-a765-fc2313fb5d0b" (UID: "ad07ddc6-4615-4a60-a765-fc2313fb5d0b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.346209 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad07ddc6-4615-4a60-a765-fc2313fb5d0b-kube-api-access-n7vbg" (OuterVolumeSpecName: "kube-api-access-n7vbg") pod "ad07ddc6-4615-4a60-a765-fc2313fb5d0b" (UID: "ad07ddc6-4615-4a60-a765-fc2313fb5d0b"). InnerVolumeSpecName "kube-api-access-n7vbg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.375755 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad07ddc6-4615-4a60-a765-fc2313fb5d0b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ad07ddc6-4615-4a60-a765-fc2313fb5d0b" (UID: "ad07ddc6-4615-4a60-a765-fc2313fb5d0b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.435909 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7vbg\" (UniqueName: \"kubernetes.io/projected/ad07ddc6-4615-4a60-a765-fc2313fb5d0b-kube-api-access-n7vbg\") on node \"crc\" DevicePath \"\""
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.435950 4954 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ad07ddc6-4615-4a60-a765-fc2313fb5d0b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.435963 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad07ddc6-4615-4a60-a765-fc2313fb5d0b-scripts\") on node \"crc\" DevicePath \"\""
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.435975 4954 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad07ddc6-4615-4a60-a765-fc2313fb5d0b-log-httpd\") on node \"crc\" DevicePath \"\""
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.441015 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad07ddc6-4615-4a60-a765-fc2313fb5d0b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad07ddc6-4615-4a60-a765-fc2313fb5d0b" (UID: "ad07ddc6-4615-4a60-a765-fc2313fb5d0b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.445497 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad07ddc6-4615-4a60-a765-fc2313fb5d0b-config-data" (OuterVolumeSpecName: "config-data") pod "ad07ddc6-4615-4a60-a765-fc2313fb5d0b" (UID: "ad07ddc6-4615-4a60-a765-fc2313fb5d0b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.532923 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"442636ac-c001-4f18-8a37-d09c9b6e0dfe","Type":"ContainerDied","Data":"a0c157ca2aedad59a22c8b6ed86eefb387172f445cbabd1a1913c09ec70bd0ab"}
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.532980 4954 scope.go:117] "RemoveContainer" containerID="1ab874324ed7415f3e971d83dd2a9b2248d12e611124c9066216673dc8103f45"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.533095 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.537199 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad07ddc6-4615-4a60-a765-fc2313fb5d0b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.537280 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad07ddc6-4615-4a60-a765-fc2313fb5d0b-config-data\") on node \"crc\" DevicePath \"\""
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.539781 4954 generic.go:334] "Generic (PLEG): container finished" podID="ad07ddc6-4615-4a60-a765-fc2313fb5d0b" containerID="be47cea0f1aca23af8d7fbe5530f58883bdd7b2ab14e08fcfff0a56ff9d0777d" exitCode=0
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.539822 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad07ddc6-4615-4a60-a765-fc2313fb5d0b","Type":"ContainerDied","Data":"be47cea0f1aca23af8d7fbe5530f58883bdd7b2ab14e08fcfff0a56ff9d0777d"}
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.539847 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad07ddc6-4615-4a60-a765-fc2313fb5d0b","Type":"ContainerDied","Data":"4d9b44a2767dc779324c4dc42262ebad032b9d13d3e6347127c86508aa41cf55"}
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.539904 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.557185 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.574351 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.586549 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.615606 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.627966 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.640719 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Nov 27 17:02:10 crc kubenswrapper[4954]: E1127 17:02:10.641664 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad07ddc6-4615-4a60-a765-fc2313fb5d0b" containerName="ceilometer-central-agent"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.641683 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad07ddc6-4615-4a60-a765-fc2313fb5d0b" containerName="ceilometer-central-agent"
Nov 27 17:02:10 crc kubenswrapper[4954]: E1127 17:02:10.641746 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad07ddc6-4615-4a60-a765-fc2313fb5d0b" containerName="sg-core"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.641756 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad07ddc6-4615-4a60-a765-fc2313fb5d0b" containerName="sg-core"
Nov 27 17:02:10 crc kubenswrapper[4954]: E1127 17:02:10.641788 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="442636ac-c001-4f18-8a37-d09c9b6e0dfe" containerName="nova-scheduler-scheduler"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.641796 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="442636ac-c001-4f18-8a37-d09c9b6e0dfe" containerName="nova-scheduler-scheduler"
Nov 27 17:02:10 crc kubenswrapper[4954]: E1127 17:02:10.641804 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad07ddc6-4615-4a60-a765-fc2313fb5d0b" containerName="proxy-httpd"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.641811 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad07ddc6-4615-4a60-a765-fc2313fb5d0b" containerName="proxy-httpd"
Nov 27 17:02:10 crc kubenswrapper[4954]: E1127 17:02:10.641830 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad07ddc6-4615-4a60-a765-fc2313fb5d0b" containerName="ceilometer-notification-agent"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.641836 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad07ddc6-4615-4a60-a765-fc2313fb5d0b" containerName="ceilometer-notification-agent"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.642199 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad07ddc6-4615-4a60-a765-fc2313fb5d0b" containerName="sg-core"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.642225 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad07ddc6-4615-4a60-a765-fc2313fb5d0b" containerName="ceilometer-central-agent"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.642237 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad07ddc6-4615-4a60-a765-fc2313fb5d0b" containerName="ceilometer-notification-agent"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.642246 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="442636ac-c001-4f18-8a37-d09c9b6e0dfe" containerName="nova-scheduler-scheduler"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.642255 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad07ddc6-4615-4a60-a765-fc2313fb5d0b" containerName="proxy-httpd"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.642683 4954 scope.go:117] "RemoveContainer" containerID="8836e5bedf8b1797dd44f63e5e33cea2d080d221a62a3a1fc13dcbaea04eb114"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.644003 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.646783 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.646969 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.647006 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.653456 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.655122 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.662936 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.709251 4954 scope.go:117] "RemoveContainer" containerID="1ab53fbb8e6c29dae625ebf7ba14e5864e381682f941b20ef2c0711095b0c431"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.715124 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1817b937-0d8c-4409-b368-bbeb9482446a" path="/var/lib/kubelet/pods/1817b937-0d8c-4409-b368-bbeb9482446a/volumes"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.716009 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="442636ac-c001-4f18-8a37-d09c9b6e0dfe" path="/var/lib/kubelet/pods/442636ac-c001-4f18-8a37-d09c9b6e0dfe/volumes"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.716997 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad07ddc6-4615-4a60-a765-fc2313fb5d0b" path="/var/lib/kubelet/pods/ad07ddc6-4615-4a60-a765-fc2313fb5d0b/volumes"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.718505 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.730015 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.739918 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9a31c550-c189-42b0-9b57-1ffa0d69b180-log-httpd\") pod \"ceilometer-0\" (UID: \"9a31c550-c189-42b0-9b57-1ffa0d69b180\") " pod="openstack/ceilometer-0"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.739976 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/413cbe5a-ca44-4d13-ac32-68ff849a4e41-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"413cbe5a-ca44-4d13-ac32-68ff849a4e41\") " pod="openstack/nova-scheduler-0"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.740008 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/413cbe5a-ca44-4d13-ac32-68ff849a4e41-config-data\") pod \"nova-scheduler-0\" (UID: \"413cbe5a-ca44-4d13-ac32-68ff849a4e41\") " pod="openstack/nova-scheduler-0"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.740207 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9a31c550-c189-42b0-9b57-1ffa0d69b180-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9a31c550-c189-42b0-9b57-1ffa0d69b180\") " pod="openstack/ceilometer-0"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.740312 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a31c550-c189-42b0-9b57-1ffa0d69b180-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9a31c550-c189-42b0-9b57-1ffa0d69b180\") " pod="openstack/ceilometer-0"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.740339 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a31c550-c189-42b0-9b57-1ffa0d69b180-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9a31c550-c189-42b0-9b57-1ffa0d69b180\") " pod="openstack/ceilometer-0"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.740389 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a31c550-c189-42b0-9b57-1ffa0d69b180-config-data\") pod \"ceilometer-0\" (UID: \"9a31c550-c189-42b0-9b57-1ffa0d69b180\") " pod="openstack/ceilometer-0"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.740472 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9a31c550-c189-42b0-9b57-1ffa0d69b180-run-httpd\") pod \"ceilometer-0\" (UID: \"9a31c550-c189-42b0-9b57-1ffa0d69b180\") " pod="openstack/ceilometer-0"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.740527 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxthm\" (UniqueName: \"kubernetes.io/projected/413cbe5a-ca44-4d13-ac32-68ff849a4e41-kube-api-access-vxthm\") pod \"nova-scheduler-0\" (UID: \"413cbe5a-ca44-4d13-ac32-68ff849a4e41\") " pod="openstack/nova-scheduler-0"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.740618 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpmrt\" (UniqueName: \"kubernetes.io/projected/9a31c550-c189-42b0-9b57-1ffa0d69b180-kube-api-access-mpmrt\") pod \"ceilometer-0\" (UID: \"9a31c550-c189-42b0-9b57-1ffa0d69b180\") " pod="openstack/ceilometer-0"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.740669 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a31c550-c189-42b0-9b57-1ffa0d69b180-scripts\") pod \"ceilometer-0\" (UID: \"9a31c550-c189-42b0-9b57-1ffa0d69b180\") " pod="openstack/ceilometer-0"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.749974 4954 scope.go:117] "RemoveContainer" containerID="be47cea0f1aca23af8d7fbe5530f58883bdd7b2ab14e08fcfff0a56ff9d0777d"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.782797 4954 scope.go:117] "RemoveContainer" containerID="dea43d3fc6e5b98beb77744cf290e06b1a578674499fd5e7457b8b341711d050"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.825430 4954 scope.go:117] "RemoveContainer" containerID="8836e5bedf8b1797dd44f63e5e33cea2d080d221a62a3a1fc13dcbaea04eb114"
Nov 27 17:02:10 crc kubenswrapper[4954]: E1127 17:02:10.832246 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8836e5bedf8b1797dd44f63e5e33cea2d080d221a62a3a1fc13dcbaea04eb114\": container with ID starting with 8836e5bedf8b1797dd44f63e5e33cea2d080d221a62a3a1fc13dcbaea04eb114 not found: ID does not exist" containerID="8836e5bedf8b1797dd44f63e5e33cea2d080d221a62a3a1fc13dcbaea04eb114"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.832294 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8836e5bedf8b1797dd44f63e5e33cea2d080d221a62a3a1fc13dcbaea04eb114"} err="failed to get container status \"8836e5bedf8b1797dd44f63e5e33cea2d080d221a62a3a1fc13dcbaea04eb114\": rpc error: code = NotFound desc = could not find container \"8836e5bedf8b1797dd44f63e5e33cea2d080d221a62a3a1fc13dcbaea04eb114\": container with ID starting with 8836e5bedf8b1797dd44f63e5e33cea2d080d221a62a3a1fc13dcbaea04eb114 not found: ID does not exist"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.832323 4954 scope.go:117] "RemoveContainer" containerID="1ab53fbb8e6c29dae625ebf7ba14e5864e381682f941b20ef2c0711095b0c431"
Nov 27 17:02:10 crc kubenswrapper[4954]: E1127 17:02:10.833084 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ab53fbb8e6c29dae625ebf7ba14e5864e381682f941b20ef2c0711095b0c431\": container with ID starting with 1ab53fbb8e6c29dae625ebf7ba14e5864e381682f941b20ef2c0711095b0c431 not found: ID does not exist" containerID="1ab53fbb8e6c29dae625ebf7ba14e5864e381682f941b20ef2c0711095b0c431"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.833125 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ab53fbb8e6c29dae625ebf7ba14e5864e381682f941b20ef2c0711095b0c431"} err="failed to get container status \"1ab53fbb8e6c29dae625ebf7ba14e5864e381682f941b20ef2c0711095b0c431\": rpc error: code = NotFound desc = could not find container \"1ab53fbb8e6c29dae625ebf7ba14e5864e381682f941b20ef2c0711095b0c431\": container with ID starting with 1ab53fbb8e6c29dae625ebf7ba14e5864e381682f941b20ef2c0711095b0c431 not found: ID does not exist"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.833154 4954 scope.go:117] "RemoveContainer" containerID="be47cea0f1aca23af8d7fbe5530f58883bdd7b2ab14e08fcfff0a56ff9d0777d"
Nov 27 17:02:10 crc kubenswrapper[4954]: E1127 17:02:10.835079 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be47cea0f1aca23af8d7fbe5530f58883bdd7b2ab14e08fcfff0a56ff9d0777d\": container with ID starting with be47cea0f1aca23af8d7fbe5530f58883bdd7b2ab14e08fcfff0a56ff9d0777d not found: ID does not exist" containerID="be47cea0f1aca23af8d7fbe5530f58883bdd7b2ab14e08fcfff0a56ff9d0777d"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.835110 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be47cea0f1aca23af8d7fbe5530f58883bdd7b2ab14e08fcfff0a56ff9d0777d"} err="failed to get container status \"be47cea0f1aca23af8d7fbe5530f58883bdd7b2ab14e08fcfff0a56ff9d0777d\": rpc error: code = NotFound desc = could not find container \"be47cea0f1aca23af8d7fbe5530f58883bdd7b2ab14e08fcfff0a56ff9d0777d\": container with ID starting with be47cea0f1aca23af8d7fbe5530f58883bdd7b2ab14e08fcfff0a56ff9d0777d not found: ID does not exist"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.835140 4954 scope.go:117] "RemoveContainer" containerID="dea43d3fc6e5b98beb77744cf290e06b1a578674499fd5e7457b8b341711d050"
Nov 27 17:02:10 crc kubenswrapper[4954]: E1127 17:02:10.835491 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dea43d3fc6e5b98beb77744cf290e06b1a578674499fd5e7457b8b341711d050\": container with ID starting with dea43d3fc6e5b98beb77744cf290e06b1a578674499fd5e7457b8b341711d050 not found: ID does not exist" containerID="dea43d3fc6e5b98beb77744cf290e06b1a578674499fd5e7457b8b341711d050"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.835513 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dea43d3fc6e5b98beb77744cf290e06b1a578674499fd5e7457b8b341711d050"} err="failed to get container status \"dea43d3fc6e5b98beb77744cf290e06b1a578674499fd5e7457b8b341711d050\": rpc error: code = NotFound desc = could not find container \"dea43d3fc6e5b98beb77744cf290e06b1a578674499fd5e7457b8b341711d050\": container with ID starting with dea43d3fc6e5b98beb77744cf290e06b1a578674499fd5e7457b8b341711d050 not found: ID does not exist"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.842378 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a31c550-c189-42b0-9b57-1ffa0d69b180-config-data\") pod \"ceilometer-0\" (UID: \"9a31c550-c189-42b0-9b57-1ffa0d69b180\") " pod="openstack/ceilometer-0"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.842439 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9a31c550-c189-42b0-9b57-1ffa0d69b180-run-httpd\") pod \"ceilometer-0\" (UID: \"9a31c550-c189-42b0-9b57-1ffa0d69b180\") " pod="openstack/ceilometer-0"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.842469 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxthm\" (UniqueName: \"kubernetes.io/projected/413cbe5a-ca44-4d13-ac32-68ff849a4e41-kube-api-access-vxthm\") pod \"nova-scheduler-0\" (UID: \"413cbe5a-ca44-4d13-ac32-68ff849a4e41\") " pod="openstack/nova-scheduler-0"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.842504 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpmrt\" (UniqueName: \"kubernetes.io/projected/9a31c550-c189-42b0-9b57-1ffa0d69b180-kube-api-access-mpmrt\") pod \"ceilometer-0\" (UID: \"9a31c550-c189-42b0-9b57-1ffa0d69b180\") " pod="openstack/ceilometer-0"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.842529 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a31c550-c189-42b0-9b57-1ffa0d69b180-scripts\") pod \"ceilometer-0\" (UID: \"9a31c550-c189-42b0-9b57-1ffa0d69b180\") " pod="openstack/ceilometer-0"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.842571 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9a31c550-c189-42b0-9b57-1ffa0d69b180-log-httpd\") pod \"ceilometer-0\" (UID: \"9a31c550-c189-42b0-9b57-1ffa0d69b180\") " pod="openstack/ceilometer-0"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.842676 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/413cbe5a-ca44-4d13-ac32-68ff849a4e41-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"413cbe5a-ca44-4d13-ac32-68ff849a4e41\") " pod="openstack/nova-scheduler-0"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.842701 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/413cbe5a-ca44-4d13-ac32-68ff849a4e41-config-data\") pod \"nova-scheduler-0\" (UID: \"413cbe5a-ca44-4d13-ac32-68ff849a4e41\") " pod="openstack/nova-scheduler-0"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.842728 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9a31c550-c189-42b0-9b57-1ffa0d69b180-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9a31c550-c189-42b0-9b57-1ffa0d69b180\") " pod="openstack/ceilometer-0"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.842759 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a31c550-c189-42b0-9b57-1ffa0d69b180-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9a31c550-c189-42b0-9b57-1ffa0d69b180\") " pod="openstack/ceilometer-0"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.842777 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a31c550-c189-42b0-9b57-1ffa0d69b180-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9a31c550-c189-42b0-9b57-1ffa0d69b180\") " pod="openstack/ceilometer-0"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.844052 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9a31c550-c189-42b0-9b57-1ffa0d69b180-run-httpd\") pod \"ceilometer-0\" (UID: \"9a31c550-c189-42b0-9b57-1ffa0d69b180\") " pod="openstack/ceilometer-0"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.845424 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9a31c550-c189-42b0-9b57-1ffa0d69b180-log-httpd\") pod \"ceilometer-0\" (UID: \"9a31c550-c189-42b0-9b57-1ffa0d69b180\") " pod="openstack/ceilometer-0"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.847490 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a31c550-c189-42b0-9b57-1ffa0d69b180-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9a31c550-c189-42b0-9b57-1ffa0d69b180\") " pod="openstack/ceilometer-0"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.849184 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a31c550-c189-42b0-9b57-1ffa0d69b180-config-data\") pod \"ceilometer-0\" (UID: \"9a31c550-c189-42b0-9b57-1ffa0d69b180\") " pod="openstack/ceilometer-0"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.849252 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.849293 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a31c550-c189-42b0-9b57-1ffa0d69b180-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9a31c550-c189-42b0-9b57-1ffa0d69b180\") " pod="openstack/ceilometer-0"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.850136 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.850315 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9a31c550-c189-42b0-9b57-1ffa0d69b180-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9a31c550-c189-42b0-9b57-1ffa0d69b180\") " pod="openstack/ceilometer-0"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.852755 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/413cbe5a-ca44-4d13-ac32-68ff849a4e41-config-data\") pod \"nova-scheduler-0\" (UID: \"413cbe5a-ca44-4d13-ac32-68ff849a4e41\") " pod="openstack/nova-scheduler-0"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.852908 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/413cbe5a-ca44-4d13-ac32-68ff849a4e41-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"413cbe5a-ca44-4d13-ac32-68ff849a4e41\") " pod="openstack/nova-scheduler-0"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.856009 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a31c550-c189-42b0-9b57-1ffa0d69b180-scripts\") pod \"ceilometer-0\" (UID: \"9a31c550-c189-42b0-9b57-1ffa0d69b180\") " pod="openstack/ceilometer-0"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.861386 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxthm\" (UniqueName: \"kubernetes.io/projected/413cbe5a-ca44-4d13-ac32-68ff849a4e41-kube-api-access-vxthm\") pod \"nova-scheduler-0\" (UID: \"413cbe5a-ca44-4d13-ac32-68ff849a4e41\") " pod="openstack/nova-scheduler-0"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.863049 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpmrt\" (UniqueName: \"kubernetes.io/projected/9a31c550-c189-42b0-9b57-1ffa0d69b180-kube-api-access-mpmrt\") pod \"ceilometer-0\" (UID: \"9a31c550-c189-42b0-9b57-1ffa0d69b180\") " pod="openstack/ceilometer-0"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.991925 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 27 17:02:10 crc kubenswrapper[4954]: I1127 17:02:10.999445 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Nov 27 17:02:11 crc kubenswrapper[4954]: I1127 17:02:11.502640 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Nov 27 17:02:11 crc kubenswrapper[4954]: W1127 17:02:11.503237 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a31c550_c189_42b0_9b57_1ffa0d69b180.slice/crio-096896ac879177cf8cf4b554a8a465fad42d4f9d5a3c85b368e5bffa9c1f96aa WatchSource:0}: Error finding container 096896ac879177cf8cf4b554a8a465fad42d4f9d5a3c85b368e5bffa9c1f96aa: Status 404 returned error can't find the container with id 096896ac879177cf8cf4b554a8a465fad42d4f9d5a3c85b368e5bffa9c1f96aa
Nov 27 17:02:11 crc kubenswrapper[4954]: I1127 17:02:11.552855 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"70396bbd-83d0-430e-bb81-44ea354afb3e","Type":"ContainerStarted","Data":"967467b2d7bca82679180667438cb5863f2e996cf3d95ef7428668eb29a6654f"}
Nov 27 17:02:11 crc kubenswrapper[4954]: I1127 17:02:11.552892 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"70396bbd-83d0-430e-bb81-44ea354afb3e","Type":"ContainerStarted","Data":"a09a1c3e663a59c896981659b45a0ab06fe09b10212c541ddc7c1da034322405"}
Nov 27 17:02:11 crc kubenswrapper[4954]: I1127 17:02:11.552902 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"70396bbd-83d0-430e-bb81-44ea354afb3e","Type":"ContainerStarted","Data":"06117361d3a4d6c40c3bc7fb5428c1f3eda038f8e09e4d73e574bae55bf08dfc"}
Nov 27 17:02:11 crc kubenswrapper[4954]: I1127 17:02:11.555364 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1ba0b816-c965-4474-b923-73f572cdc1ab","Type":"ContainerStarted","Data":"c62b1eadd28c0811d73ae3671f379cc3966076ace9d174cf326a86d43e377c39"}
Nov 27 17:02:11 crc kubenswrapper[4954]: I1127 17:02:11.555972 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Nov 27 17:02:11 crc kubenswrapper[4954]: I1127 17:02:11.557474 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9a31c550-c189-42b0-9b57-1ffa0d69b180","Type":"ContainerStarted","Data":"096896ac879177cf8cf4b554a8a465fad42d4f9d5a3c85b368e5bffa9c1f96aa"}
Nov 27 17:02:11 crc kubenswrapper[4954]: I1127 17:02:11.572706 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.572689377 podStartE2EDuration="2.572689377s" podCreationTimestamp="2025-11-27 17:02:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 17:02:11.570526315 +0000 UTC m=+1443.587966605" watchObservedRunningTime="2025-11-27 17:02:11.572689377 +0000 UTC m=+1443.590129677"
Nov 27 17:02:11 crc kubenswrapper[4954]: I1127 17:02:11.591446 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.551412137 podStartE2EDuration="4.591426122s" podCreationTimestamp="2025-11-27 17:02:07 +0000 UTC" firstStartedPulling="2025-11-27 17:02:08.548016622 +0000 UTC m=+1440.565456922" lastFinishedPulling="2025-11-27 17:02:10.588030607 +0000 UTC m=+1442.605470907" observedRunningTime="2025-11-27 17:02:11.587763543 +0000 UTC m=+1443.605203833" watchObservedRunningTime="2025-11-27 17:02:11.591426122 +0000 UTC m=+1443.608866422"
Nov 27 17:02:11 crc kubenswrapper[4954]: W1127 17:02:11.607529 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod413cbe5a_ca44_4d13_ac32_68ff849a4e41.slice/crio-af6c7b087f85971987b0cd7c2b2ef69d1669d38c7ea619f42f6db2000e371bb0 WatchSource:0}: Error finding container af6c7b087f85971987b0cd7c2b2ef69d1669d38c7ea619f42f6db2000e371bb0: Status 404 returned error can't find the container with id af6c7b087f85971987b0cd7c2b2ef69d1669d38c7ea619f42f6db2000e371bb0
Nov 27 17:02:11 crc kubenswrapper[4954]: I1127 17:02:11.609385 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 27 17:02:12 crc kubenswrapper[4954]: I1127 17:02:12.141814 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Nov 27 17:02:12 crc kubenswrapper[4954]: I1127 17:02:12.569153 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9a31c550-c189-42b0-9b57-1ffa0d69b180","Type":"ContainerStarted","Data":"b7434fc61404163fce7eeff0f752c65353a9cb203e6d49623258ac0b38d162c8"}
Nov 27 17:02:12 crc kubenswrapper[4954]: I1127 17:02:12.572682 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"413cbe5a-ca44-4d13-ac32-68ff849a4e41","Type":"ContainerStarted","Data":"df582c2fcef321e036169d52ad6ef70bd82d1f11b8920e2a548cf2f782a49b3c"}
Nov 27 17:02:12 crc kubenswrapper[4954]: I1127 17:02:12.572711 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"413cbe5a-ca44-4d13-ac32-68ff849a4e41","Type":"ContainerStarted","Data":"af6c7b087f85971987b0cd7c2b2ef69d1669d38c7ea619f42f6db2000e371bb0"}
Nov 27 17:02:12 crc kubenswrapper[4954]: I1127 17:02:12.594023 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.594000067 podStartE2EDuration="2.594000067s" podCreationTimestamp="2025-11-27 17:02:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 17:02:12.590127783 +0000 UTC m=+1444.607568083" watchObservedRunningTime="2025-11-27 17:02:12.594000067 +0000 UTC m=+1444.611440377"
Nov 27 17:02:14 crc kubenswrapper[4954]: I1127 17:02:14.593901 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9a31c550-c189-42b0-9b57-1ffa0d69b180","Type":"ContainerStarted","Data":"54044b1877e09ff186089b7d3d45b6a8af05c37b117389c34d43a1899893c0d5"}
Nov 27 17:02:15 crc kubenswrapper[4954]: I1127 17:02:15.604191 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9a31c550-c189-42b0-9b57-1ffa0d69b180","Type":"ContainerStarted","Data":"67adc7278a60297cd96e209646877a1cf4c2602d7783c4b6b4926f79ebb30f22"}
Nov 27 17:02:15 crc kubenswrapper[4954]: I1127 17:02:15.850038 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Nov 27 17:02:15 crc kubenswrapper[4954]: I1127 17:02:15.851410 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Nov 27 17:02:15 crc kubenswrapper[4954]: I1127 17:02:15.999798 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Nov 27 17:02:16 crc kubenswrapper[4954]: I1127 17:02:16.889922 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="10a515f1-708a-4b0a-83ed-d28323eabe4a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Nov 27 17:02:16 crc kubenswrapper[4954]: I1127 17:02:16.890112 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="10a515f1-708a-4b0a-83ed-d28323eabe4a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Nov 27 17:02:17 crc kubenswrapper[4954]: I1127 17:02:17.650231 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9a31c550-c189-42b0-9b57-1ffa0d69b180","Type":"ContainerStarted","Data":"c8db0f80cc8e084c62ef2269a947511d008a0b24f0b041f05e7868bcba22bb72"}
Nov 27 17:02:17 crc kubenswrapper[4954]: I1127 17:02:17.650485 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Nov 27 17:02:17 crc kubenswrapper[4954]: I1127 17:02:17.684737 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.7163206349999998 podStartE2EDuration="7.684700939s" podCreationTimestamp="2025-11-27 17:02:10 +0000 UTC" firstStartedPulling="2025-11-27 17:02:11.505450446 +0000 UTC m=+1443.522890756" lastFinishedPulling="2025-11-27 17:02:16.47383076 +0000 UTC m=+1448.491271060" observedRunningTime="2025-11-27 17:02:17.673504367 +0000 UTC m=+1449.690944677" watchObservedRunningTime="2025-11-27 17:02:17.684700939 +0000 UTC m=+1449.702141259"
Nov 27 17:02:18 crc kubenswrapper[4954]: I1127 17:02:18.043614 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Nov 27 17:02:20 crc kubenswrapper[4954]: I1127 17:02:20.012252 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Nov 27 17:02:20 crc kubenswrapper[4954]: I1127 17:02:20.012679 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Nov 27 17:02:21 crc kubenswrapper[4954]: I1127 17:02:21.000039 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Nov 27 17:02:21 crc kubenswrapper[4954]: I1127 17:02:21.054021 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Nov 27 17:02:21 crc kubenswrapper[4954]: I1127 17:02:21.096791 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="70396bbd-83d0-430e-bb81-44ea354afb3e" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Nov 27 17:02:21 crc kubenswrapper[4954]: I1127 17:02:21.096971 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="70396bbd-83d0-430e-bb81-44ea354afb3e" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Nov 27 17:02:21 crc kubenswrapper[4954]: I1127 17:02:21.729561 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Nov 27 17:02:23 crc kubenswrapper[4954]: I1127 17:02:23.687223 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 27 17:02:23 crc kubenswrapper[4954]: I1127 17:02:23.687476 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 27 17:02:23 crc kubenswrapper[4954]: I1127 17:02:23.687520 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-699qq"
Nov 27 17:02:23 crc kubenswrapper[4954]: I1127 17:02:23.688177 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"98580182e2338285c15b00e549725c7d4113004bcbddaa6d1d4c9e028f47ac7f"} pod="openshift-machine-config-operator/machine-config-daemon-699qq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 27 17:02:23 crc kubenswrapper[4954]: I1127 17:02:23.688239 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" containerID="cri-o://98580182e2338285c15b00e549725c7d4113004bcbddaa6d1d4c9e028f47ac7f" gracePeriod=600
Nov 27 17:02:24 crc kubenswrapper[4954]: I1127 17:02:24.726092 4954 generic.go:334] "Generic (PLEG): container finished" podID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerID="98580182e2338285c15b00e549725c7d4113004bcbddaa6d1d4c9e028f47ac7f" exitCode=0
Nov 27 17:02:24 crc kubenswrapper[4954]: I1127 17:02:24.726640 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" event={"ID":"33a80574-7c60-4f19-985b-3ee313cb7bcd","Type":"ContainerDied","Data":"98580182e2338285c15b00e549725c7d4113004bcbddaa6d1d4c9e028f47ac7f"}
Nov 27 17:02:24 crc kubenswrapper[4954]: I1127 17:02:24.726714 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" event={"ID":"33a80574-7c60-4f19-985b-3ee313cb7bcd","Type":"ContainerStarted","Data":"c634fc970f090ade11e9bb4461f26ec0209fb2640ae3e49bf1ab5c91d77dcc8f"}
Nov 27 17:02:24 crc kubenswrapper[4954]: I1127 17:02:24.726751 4954 scope.go:117] "RemoveContainer" containerID="9612382de1b535d3c643f2ac5d6cc1b599dc89b245b1720c9d36c1ba8e2a8513"
Nov 27 17:02:25 crc kubenswrapper[4954]: I1127 17:02:25.855762 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Nov 27 17:02:25 crc kubenswrapper[4954]: I1127 17:02:25.857525 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Nov 27 17:02:25 crc kubenswrapper[4954]: I1127 17:02:25.862185 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Nov 27 17:02:26 crc kubenswrapper[4954]: I1127 17:02:26.757268 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Nov 27 17:02:27 crc kubenswrapper[4954]: I1127 17:02:27.742965 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Nov 27 17:02:27 crc kubenswrapper[4954]: I1127 17:02:27.758469 4954 generic.go:334] "Generic (PLEG): container finished" podID="9081a403-c3ea-4613-a218-ab1ac1f1ed42" containerID="468aa50e8c00e4a930369314581826099902622a493f3b92719a130c21762294" exitCode=137
Nov 27 17:02:27 crc kubenswrapper[4954]: I1127 17:02:27.758530 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Nov 27 17:02:27 crc kubenswrapper[4954]: I1127 17:02:27.758590 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9081a403-c3ea-4613-a218-ab1ac1f1ed42","Type":"ContainerDied","Data":"468aa50e8c00e4a930369314581826099902622a493f3b92719a130c21762294"}
Nov 27 17:02:27 crc kubenswrapper[4954]: I1127 17:02:27.758657 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9081a403-c3ea-4613-a218-ab1ac1f1ed42","Type":"ContainerDied","Data":"ce369a5cba3f028d77b449feecd63d85c863767f51b4e49dbb8fbfb19717ef6d"}
Nov 27 17:02:27 crc kubenswrapper[4954]: I1127 17:02:27.758682 4954 scope.go:117] "RemoveContainer" containerID="468aa50e8c00e4a930369314581826099902622a493f3b92719a130c21762294"
Nov 27 17:02:27 crc kubenswrapper[4954]: I1127 17:02:27.781168 4954 scope.go:117] "RemoveContainer" containerID="468aa50e8c00e4a930369314581826099902622a493f3b92719a130c21762294"
Nov 27 17:02:27 crc kubenswrapper[4954]: E1127 17:02:27.781611 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"468aa50e8c00e4a930369314581826099902622a493f3b92719a130c21762294\": container with ID starting with 468aa50e8c00e4a930369314581826099902622a493f3b92719a130c21762294 not found: ID does not exist" containerID="468aa50e8c00e4a930369314581826099902622a493f3b92719a130c21762294"
Nov 27 17:02:27 crc kubenswrapper[4954]: I1127 17:02:27.781738 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"468aa50e8c00e4a930369314581826099902622a493f3b92719a130c21762294"} err="failed to get container status \"468aa50e8c00e4a930369314581826099902622a493f3b92719a130c21762294\": rpc error: code = NotFound desc = could not find container \"468aa50e8c00e4a930369314581826099902622a493f3b92719a130c21762294\": container with ID starting with 468aa50e8c00e4a930369314581826099902622a493f3b92719a130c21762294 not found: ID does not exist"
Nov 27 17:02:27 crc kubenswrapper[4954]: I1127 17:02:27.927772 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9081a403-c3ea-4613-a218-ab1ac1f1ed42-combined-ca-bundle\") pod \"9081a403-c3ea-4613-a218-ab1ac1f1ed42\" (UID: \"9081a403-c3ea-4613-a218-ab1ac1f1ed42\") "
Nov 27 17:02:27 crc kubenswrapper[4954]: I1127 17:02:27.928135 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8gtn\" (UniqueName: \"kubernetes.io/projected/9081a403-c3ea-4613-a218-ab1ac1f1ed42-kube-api-access-x8gtn\") pod \"9081a403-c3ea-4613-a218-ab1ac1f1ed42\" (UID: \"9081a403-c3ea-4613-a218-ab1ac1f1ed42\") "
Nov 27 17:02:27 crc kubenswrapper[4954]: I1127 17:02:27.928179 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9081a403-c3ea-4613-a218-ab1ac1f1ed42-config-data\") pod \"9081a403-c3ea-4613-a218-ab1ac1f1ed42\" (UID: \"9081a403-c3ea-4613-a218-ab1ac1f1ed42\") "
Nov 27 17:02:27 crc kubenswrapper[4954]: I1127 17:02:27.933429 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9081a403-c3ea-4613-a218-ab1ac1f1ed42-kube-api-access-x8gtn" (OuterVolumeSpecName: "kube-api-access-x8gtn") pod "9081a403-c3ea-4613-a218-ab1ac1f1ed42" (UID: "9081a403-c3ea-4613-a218-ab1ac1f1ed42").
InnerVolumeSpecName "kube-api-access-x8gtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:02:27 crc kubenswrapper[4954]: I1127 17:02:27.960812 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9081a403-c3ea-4613-a218-ab1ac1f1ed42-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9081a403-c3ea-4613-a218-ab1ac1f1ed42" (UID: "9081a403-c3ea-4613-a218-ab1ac1f1ed42"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:02:27 crc kubenswrapper[4954]: I1127 17:02:27.963460 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9081a403-c3ea-4613-a218-ab1ac1f1ed42-config-data" (OuterVolumeSpecName: "config-data") pod "9081a403-c3ea-4613-a218-ab1ac1f1ed42" (UID: "9081a403-c3ea-4613-a218-ab1ac1f1ed42"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:02:28 crc kubenswrapper[4954]: I1127 17:02:28.030746 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9081a403-c3ea-4613-a218-ab1ac1f1ed42-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:28 crc kubenswrapper[4954]: I1127 17:02:28.030822 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8gtn\" (UniqueName: \"kubernetes.io/projected/9081a403-c3ea-4613-a218-ab1ac1f1ed42-kube-api-access-x8gtn\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:28 crc kubenswrapper[4954]: I1127 17:02:28.030840 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9081a403-c3ea-4613-a218-ab1ac1f1ed42-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:28 crc kubenswrapper[4954]: I1127 17:02:28.094997 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 27 17:02:28 crc kubenswrapper[4954]: I1127 17:02:28.104313 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 27 17:02:28 crc kubenswrapper[4954]: I1127 17:02:28.115367 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 27 17:02:28 crc kubenswrapper[4954]: E1127 17:02:28.115794 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9081a403-c3ea-4613-a218-ab1ac1f1ed42" containerName="nova-cell1-novncproxy-novncproxy" Nov 27 17:02:28 crc kubenswrapper[4954]: I1127 17:02:28.115816 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="9081a403-c3ea-4613-a218-ab1ac1f1ed42" containerName="nova-cell1-novncproxy-novncproxy" Nov 27 17:02:28 crc kubenswrapper[4954]: I1127 17:02:28.116042 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="9081a403-c3ea-4613-a218-ab1ac1f1ed42" containerName="nova-cell1-novncproxy-novncproxy" Nov 27 17:02:28 crc kubenswrapper[4954]: I1127 17:02:28.116731 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 27 17:02:28 crc kubenswrapper[4954]: I1127 17:02:28.119307 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Nov 27 17:02:28 crc kubenswrapper[4954]: I1127 17:02:28.119783 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Nov 27 17:02:28 crc kubenswrapper[4954]: I1127 17:02:28.120226 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Nov 27 17:02:28 crc kubenswrapper[4954]: I1127 17:02:28.132659 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbd8v\" (UniqueName: \"kubernetes.io/projected/a8c31305-69a0-477a-958f-d91daa9fe501-kube-api-access-pbd8v\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8c31305-69a0-477a-958f-d91daa9fe501\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 17:02:28 crc kubenswrapper[4954]: I1127 17:02:28.132707 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8c31305-69a0-477a-958f-d91daa9fe501-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8c31305-69a0-477a-958f-d91daa9fe501\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 17:02:28 crc kubenswrapper[4954]: I1127 17:02:28.132729 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8c31305-69a0-477a-958f-d91daa9fe501-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8c31305-69a0-477a-958f-d91daa9fe501\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 17:02:28 crc kubenswrapper[4954]: I1127 17:02:28.132749 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8c31305-69a0-477a-958f-d91daa9fe501-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8c31305-69a0-477a-958f-d91daa9fe501\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 17:02:28 crc kubenswrapper[4954]: I1127 17:02:28.132998 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8c31305-69a0-477a-958f-d91daa9fe501-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8c31305-69a0-477a-958f-d91daa9fe501\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 17:02:28 crc kubenswrapper[4954]: I1127 17:02:28.135486 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 27 17:02:28 crc kubenswrapper[4954]: I1127 17:02:28.236502 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8c31305-69a0-477a-958f-d91daa9fe501-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8c31305-69a0-477a-958f-d91daa9fe501\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 17:02:28 crc kubenswrapper[4954]: I1127 17:02:28.236722 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbd8v\" (UniqueName: \"kubernetes.io/projected/a8c31305-69a0-477a-958f-d91daa9fe501-kube-api-access-pbd8v\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8c31305-69a0-477a-958f-d91daa9fe501\") " 
pod="openstack/nova-cell1-novncproxy-0" Nov 27 17:02:28 crc kubenswrapper[4954]: I1127 17:02:28.236762 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8c31305-69a0-477a-958f-d91daa9fe501-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8c31305-69a0-477a-958f-d91daa9fe501\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 17:02:28 crc kubenswrapper[4954]: I1127 17:02:28.236792 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8c31305-69a0-477a-958f-d91daa9fe501-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8c31305-69a0-477a-958f-d91daa9fe501\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 17:02:28 crc kubenswrapper[4954]: I1127 17:02:28.236835 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8c31305-69a0-477a-958f-d91daa9fe501-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8c31305-69a0-477a-958f-d91daa9fe501\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 17:02:28 crc kubenswrapper[4954]: I1127 17:02:28.241607 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8c31305-69a0-477a-958f-d91daa9fe501-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8c31305-69a0-477a-958f-d91daa9fe501\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 17:02:28 crc kubenswrapper[4954]: I1127 17:02:28.241710 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8c31305-69a0-477a-958f-d91daa9fe501-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8c31305-69a0-477a-958f-d91daa9fe501\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 17:02:28 crc kubenswrapper[4954]: I1127 17:02:28.242509 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8c31305-69a0-477a-958f-d91daa9fe501-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8c31305-69a0-477a-958f-d91daa9fe501\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 17:02:28 crc kubenswrapper[4954]: I1127 17:02:28.243512 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8c31305-69a0-477a-958f-d91daa9fe501-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8c31305-69a0-477a-958f-d91daa9fe501\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 17:02:28 crc kubenswrapper[4954]: I1127 17:02:28.258498 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbd8v\" (UniqueName: \"kubernetes.io/projected/a8c31305-69a0-477a-958f-d91daa9fe501-kube-api-access-pbd8v\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8c31305-69a0-477a-958f-d91daa9fe501\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 17:02:28 crc kubenswrapper[4954]: I1127 17:02:28.438855 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 27 17:02:28 crc kubenswrapper[4954]: I1127 17:02:28.676089 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9081a403-c3ea-4613-a218-ab1ac1f1ed42" path="/var/lib/kubelet/pods/9081a403-c3ea-4613-a218-ab1ac1f1ed42/volumes" Nov 27 17:02:28 crc kubenswrapper[4954]: I1127 17:02:28.912051 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 27 17:02:29 crc kubenswrapper[4954]: I1127 17:02:29.779502 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a8c31305-69a0-477a-958f-d91daa9fe501","Type":"ContainerStarted","Data":"4f96e036e60e1a22832a6f4ad274d86a3e2efa2ac4b507bc8169228fb7cdaaab"} Nov 27 17:02:29 crc kubenswrapper[4954]: I1127 17:02:29.780114 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a8c31305-69a0-477a-958f-d91daa9fe501","Type":"ContainerStarted","Data":"165ed023e187b0eadda4cd2d5270456069bbf8d25e62c5e4b687fb5395964c11"} Nov 27 17:02:29 crc kubenswrapper[4954]: I1127 17:02:29.804780 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.804757379 podStartE2EDuration="1.804757379s" podCreationTimestamp="2025-11-27 17:02:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 17:02:29.799341517 +0000 UTC m=+1461.816781817" watchObservedRunningTime="2025-11-27 17:02:29.804757379 +0000 UTC m=+1461.822197689" Nov 27 17:02:30 crc kubenswrapper[4954]: I1127 17:02:30.016964 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 27 17:02:30 crc kubenswrapper[4954]: I1127 17:02:30.018694 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 27 17:02:30 crc kubenswrapper[4954]: I1127 17:02:30.018730 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 27 17:02:30 crc kubenswrapper[4954]: I1127 17:02:30.030115 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 27 17:02:30 crc kubenswrapper[4954]: I1127 17:02:30.787837 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 27 17:02:30 crc kubenswrapper[4954]: I1127 17:02:30.791861 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 27 17:02:31 crc kubenswrapper[4954]: I1127 17:02:31.056823 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-gzs8b"] Nov 27 17:02:31 crc kubenswrapper[4954]: I1127 17:02:31.060138 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-gzs8b" Nov 27 17:02:31 crc kubenswrapper[4954]: I1127 17:02:31.098503 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-gzs8b"] Nov 27 17:02:31 crc kubenswrapper[4954]: I1127 17:02:31.113075 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e37726ba-6010-4e19-a3ad-df091a9cc21e-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-gzs8b\" (UID: \"e37726ba-6010-4e19-a3ad-df091a9cc21e\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-gzs8b" Nov 27 17:02:31 crc kubenswrapper[4954]: I1127 17:02:31.113367 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e37726ba-6010-4e19-a3ad-df091a9cc21e-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-gzs8b\" (UID: \"e37726ba-6010-4e19-a3ad-df091a9cc21e\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-gzs8b" Nov 27 17:02:31 crc kubenswrapper[4954]: I1127 17:02:31.113493 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e37726ba-6010-4e19-a3ad-df091a9cc21e-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-gzs8b\" (UID: \"e37726ba-6010-4e19-a3ad-df091a9cc21e\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-gzs8b" Nov 27 17:02:31 crc kubenswrapper[4954]: I1127 17:02:31.113530 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpl65\" (UniqueName: \"kubernetes.io/projected/e37726ba-6010-4e19-a3ad-df091a9cc21e-kube-api-access-rpl65\") pod \"dnsmasq-dns-5c7b6c5df9-gzs8b\" (UID: \"e37726ba-6010-4e19-a3ad-df091a9cc21e\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-gzs8b" Nov 27 17:02:31 crc kubenswrapper[4954]: I1127 17:02:31.113559 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e37726ba-6010-4e19-a3ad-df091a9cc21e-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-gzs8b\" (UID: \"e37726ba-6010-4e19-a3ad-df091a9cc21e\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-gzs8b" Nov 27 17:02:31 crc kubenswrapper[4954]: I1127 17:02:31.115332 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e37726ba-6010-4e19-a3ad-df091a9cc21e-config\") pod \"dnsmasq-dns-5c7b6c5df9-gzs8b\" (UID: \"e37726ba-6010-4e19-a3ad-df091a9cc21e\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-gzs8b" Nov 27 17:02:31 crc kubenswrapper[4954]: I1127 17:02:31.219121 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e37726ba-6010-4e19-a3ad-df091a9cc21e-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-gzs8b\" (UID: \"e37726ba-6010-4e19-a3ad-df091a9cc21e\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-gzs8b" Nov 27 17:02:31 crc kubenswrapper[4954]: I1127 17:02:31.219176 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e37726ba-6010-4e19-a3ad-df091a9cc21e-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-gzs8b\" (UID: \"e37726ba-6010-4e19-a3ad-df091a9cc21e\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-gzs8b" Nov 27 17:02:31 crc kubenswrapper[4954]: I1127 17:02:31.219225 4954 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e37726ba-6010-4e19-a3ad-df091a9cc21e-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-gzs8b\" (UID: \"e37726ba-6010-4e19-a3ad-df091a9cc21e\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-gzs8b" Nov 27 17:02:31 crc kubenswrapper[4954]: I1127 17:02:31.219246 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpl65\" (UniqueName: \"kubernetes.io/projected/e37726ba-6010-4e19-a3ad-df091a9cc21e-kube-api-access-rpl65\") pod \"dnsmasq-dns-5c7b6c5df9-gzs8b\" (UID: \"e37726ba-6010-4e19-a3ad-df091a9cc21e\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-gzs8b" Nov 27 17:02:31 crc kubenswrapper[4954]: I1127 17:02:31.219265 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e37726ba-6010-4e19-a3ad-df091a9cc21e-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-gzs8b\" (UID: \"e37726ba-6010-4e19-a3ad-df091a9cc21e\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-gzs8b" Nov 27 17:02:31 crc kubenswrapper[4954]: I1127 17:02:31.219323 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e37726ba-6010-4e19-a3ad-df091a9cc21e-config\") pod \"dnsmasq-dns-5c7b6c5df9-gzs8b\" (UID: \"e37726ba-6010-4e19-a3ad-df091a9cc21e\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-gzs8b" Nov 27 17:02:31 crc kubenswrapper[4954]: I1127 17:02:31.220412 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e37726ba-6010-4e19-a3ad-df091a9cc21e-config\") pod \"dnsmasq-dns-5c7b6c5df9-gzs8b\" (UID: \"e37726ba-6010-4e19-a3ad-df091a9cc21e\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-gzs8b" Nov 27 17:02:31 crc kubenswrapper[4954]: I1127 17:02:31.220978 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e37726ba-6010-4e19-a3ad-df091a9cc21e-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-gzs8b\" (UID: \"e37726ba-6010-4e19-a3ad-df091a9cc21e\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-gzs8b" Nov 27 17:02:31 crc kubenswrapper[4954]: I1127 17:02:31.221561 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e37726ba-6010-4e19-a3ad-df091a9cc21e-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-gzs8b\" (UID: \"e37726ba-6010-4e19-a3ad-df091a9cc21e\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-gzs8b" Nov 27 17:02:31 crc kubenswrapper[4954]: I1127 17:02:31.222125 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e37726ba-6010-4e19-a3ad-df091a9cc21e-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-gzs8b\" (UID: \"e37726ba-6010-4e19-a3ad-df091a9cc21e\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-gzs8b" Nov 27 17:02:31 crc kubenswrapper[4954]: I1127 17:02:31.227815 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e37726ba-6010-4e19-a3ad-df091a9cc21e-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-gzs8b\" (UID: \"e37726ba-6010-4e19-a3ad-df091a9cc21e\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-gzs8b" Nov 27 17:02:31 crc kubenswrapper[4954]: I1127 17:02:31.242885 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpl65\" (UniqueName: 
\"kubernetes.io/projected/e37726ba-6010-4e19-a3ad-df091a9cc21e-kube-api-access-rpl65\") pod \"dnsmasq-dns-5c7b6c5df9-gzs8b\" (UID: \"e37726ba-6010-4e19-a3ad-df091a9cc21e\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-gzs8b" Nov 27 17:02:31 crc kubenswrapper[4954]: I1127 17:02:31.390030 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-gzs8b" Nov 27 17:02:31 crc kubenswrapper[4954]: W1127 17:02:31.942248 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode37726ba_6010_4e19_a3ad_df091a9cc21e.slice/crio-aac5be05547a599353270241eb7c2f0ba7487c7deb5ce18363735590ada3bd4f WatchSource:0}: Error finding container aac5be05547a599353270241eb7c2f0ba7487c7deb5ce18363735590ada3bd4f: Status 404 returned error can't find the container with id aac5be05547a599353270241eb7c2f0ba7487c7deb5ce18363735590ada3bd4f Nov 27 17:02:31 crc kubenswrapper[4954]: I1127 17:02:31.949794 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-gzs8b"] Nov 27 17:02:32 crc kubenswrapper[4954]: I1127 17:02:32.812525 4954 generic.go:334] "Generic (PLEG): container finished" podID="e37726ba-6010-4e19-a3ad-df091a9cc21e" containerID="21d6af5bd61055f61f54e926524246adc4bf2f3be12548a77a141fc43ce85a61" exitCode=0 Nov 27 17:02:32 crc kubenswrapper[4954]: I1127 17:02:32.812625 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-gzs8b" event={"ID":"e37726ba-6010-4e19-a3ad-df091a9cc21e","Type":"ContainerDied","Data":"21d6af5bd61055f61f54e926524246adc4bf2f3be12548a77a141fc43ce85a61"} Nov 27 17:02:32 crc kubenswrapper[4954]: I1127 17:02:32.813477 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-gzs8b" event={"ID":"e37726ba-6010-4e19-a3ad-df091a9cc21e","Type":"ContainerStarted","Data":"aac5be05547a599353270241eb7c2f0ba7487c7deb5ce18363735590ada3bd4f"} Nov 27 17:02:33 crc kubenswrapper[4954]: I1127 17:02:33.233897 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 27 17:02:33 crc kubenswrapper[4954]: I1127 17:02:33.234173 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9a31c550-c189-42b0-9b57-1ffa0d69b180" containerName="ceilometer-central-agent" containerID="cri-o://b7434fc61404163fce7eeff0f752c65353a9cb203e6d49623258ac0b38d162c8" gracePeriod=30 Nov 27 17:02:33 crc kubenswrapper[4954]: I1127 17:02:33.234333 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9a31c550-c189-42b0-9b57-1ffa0d69b180" containerName="proxy-httpd" containerID="cri-o://c8db0f80cc8e084c62ef2269a947511d008a0b24f0b041f05e7868bcba22bb72" gracePeriod=30 Nov 27 17:02:33 crc kubenswrapper[4954]: I1127 17:02:33.234389 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9a31c550-c189-42b0-9b57-1ffa0d69b180" containerName="sg-core" containerID="cri-o://67adc7278a60297cd96e209646877a1cf4c2602d7783c4b6b4926f79ebb30f22" gracePeriod=30 Nov 27 17:02:33 crc kubenswrapper[4954]: I1127 17:02:33.234391 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9a31c550-c189-42b0-9b57-1ffa0d69b180" containerName="ceilometer-notification-agent" containerID="cri-o://54044b1877e09ff186089b7d3d45b6a8af05c37b117389c34d43a1899893c0d5" gracePeriod=30 Nov 27 17:02:33 
crc kubenswrapper[4954]: I1127 17:02:33.245116 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="9a31c550-c189-42b0-9b57-1ffa0d69b180" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Nov 27 17:02:33 crc kubenswrapper[4954]: I1127 17:02:33.439470 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Nov 27 17:02:33 crc kubenswrapper[4954]: I1127 17:02:33.674353 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 27 17:02:33 crc kubenswrapper[4954]: I1127 17:02:33.844625 4954 generic.go:334] "Generic (PLEG): container finished" podID="9a31c550-c189-42b0-9b57-1ffa0d69b180" containerID="c8db0f80cc8e084c62ef2269a947511d008a0b24f0b041f05e7868bcba22bb72" exitCode=0 Nov 27 17:02:33 crc kubenswrapper[4954]: I1127 17:02:33.844664 4954 generic.go:334] "Generic (PLEG): container finished" podID="9a31c550-c189-42b0-9b57-1ffa0d69b180" containerID="67adc7278a60297cd96e209646877a1cf4c2602d7783c4b6b4926f79ebb30f22" exitCode=2 Nov 27 17:02:33 crc kubenswrapper[4954]: I1127 17:02:33.844675 4954 generic.go:334] "Generic (PLEG): container finished" podID="9a31c550-c189-42b0-9b57-1ffa0d69b180" containerID="b7434fc61404163fce7eeff0f752c65353a9cb203e6d49623258ac0b38d162c8" exitCode=0 Nov 27 17:02:33 crc kubenswrapper[4954]: I1127 17:02:33.844717 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9a31c550-c189-42b0-9b57-1ffa0d69b180","Type":"ContainerDied","Data":"c8db0f80cc8e084c62ef2269a947511d008a0b24f0b041f05e7868bcba22bb72"} Nov 27 17:02:33 crc kubenswrapper[4954]: I1127 17:02:33.844746 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9a31c550-c189-42b0-9b57-1ffa0d69b180","Type":"ContainerDied","Data":"67adc7278a60297cd96e209646877a1cf4c2602d7783c4b6b4926f79ebb30f22"} Nov 27 17:02:33 crc kubenswrapper[4954]: I1127 17:02:33.844760 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9a31c550-c189-42b0-9b57-1ffa0d69b180","Type":"ContainerDied","Data":"b7434fc61404163fce7eeff0f752c65353a9cb203e6d49623258ac0b38d162c8"} Nov 27 17:02:33 crc kubenswrapper[4954]: I1127 17:02:33.846707 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-gzs8b" event={"ID":"e37726ba-6010-4e19-a3ad-df091a9cc21e","Type":"ContainerStarted","Data":"9e388b9dde084498139eb5f06557f4a4326a986ffca26a22961b5038c2e0f777"} Nov 27 17:02:33 crc kubenswrapper[4954]: I1127 17:02:33.846781 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="70396bbd-83d0-430e-bb81-44ea354afb3e" containerName="nova-api-log" containerID="cri-o://a09a1c3e663a59c896981659b45a0ab06fe09b10212c541ddc7c1da034322405" gracePeriod=30 Nov 27 17:02:33 crc kubenswrapper[4954]: I1127 17:02:33.846828 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="70396bbd-83d0-430e-bb81-44ea354afb3e" containerName="nova-api-api" containerID="cri-o://967467b2d7bca82679180667438cb5863f2e996cf3d95ef7428668eb29a6654f" gracePeriod=30 Nov 27 17:02:33 crc kubenswrapper[4954]: I1127 17:02:33.846954 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c7b6c5df9-gzs8b" Nov 27 17:02:33 crc kubenswrapper[4954]: I1127 17:02:33.880672 4954 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/dnsmasq-dns-5c7b6c5df9-gzs8b" podStartSLOduration=3.880649699 podStartE2EDuration="3.880649699s" podCreationTimestamp="2025-11-27 17:02:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 17:02:33.87040733 +0000 UTC m=+1465.887847630" watchObservedRunningTime="2025-11-27 17:02:33.880649699 +0000 UTC m=+1465.898090009" Nov 27 17:02:34 crc kubenswrapper[4954]: I1127 17:02:34.869218 4954 generic.go:334] "Generic (PLEG): container finished" podID="70396bbd-83d0-430e-bb81-44ea354afb3e" containerID="a09a1c3e663a59c896981659b45a0ab06fe09b10212c541ddc7c1da034322405" exitCode=143 Nov 27 17:02:34 crc kubenswrapper[4954]: I1127 17:02:34.869546 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"70396bbd-83d0-430e-bb81-44ea354afb3e","Type":"ContainerDied","Data":"a09a1c3e663a59c896981659b45a0ab06fe09b10212c541ddc7c1da034322405"} Nov 27 17:02:35 crc kubenswrapper[4954]: I1127 17:02:35.882521 4954 generic.go:334] "Generic (PLEG): container finished" podID="9a31c550-c189-42b0-9b57-1ffa0d69b180" containerID="54044b1877e09ff186089b7d3d45b6a8af05c37b117389c34d43a1899893c0d5" exitCode=0 Nov 27 17:02:35 crc kubenswrapper[4954]: I1127 17:02:35.882588 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9a31c550-c189-42b0-9b57-1ffa0d69b180","Type":"ContainerDied","Data":"54044b1877e09ff186089b7d3d45b6a8af05c37b117389c34d43a1899893c0d5"} Nov 27 17:02:35 crc kubenswrapper[4954]: I1127 17:02:35.882898 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9a31c550-c189-42b0-9b57-1ffa0d69b180","Type":"ContainerDied","Data":"096896ac879177cf8cf4b554a8a465fad42d4f9d5a3c85b368e5bffa9c1f96aa"} Nov 27 17:02:35 crc kubenswrapper[4954]: I1127 17:02:35.882910 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="096896ac879177cf8cf4b554a8a465fad42d4f9d5a3c85b368e5bffa9c1f96aa" Nov 27 17:02:35 crc kubenswrapper[4954]: I1127 17:02:35.913138 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 27 17:02:36 crc kubenswrapper[4954]: I1127 17:02:36.027065 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a31c550-c189-42b0-9b57-1ffa0d69b180-config-data\") pod \"9a31c550-c189-42b0-9b57-1ffa0d69b180\" (UID: \"9a31c550-c189-42b0-9b57-1ffa0d69b180\") " Nov 27 17:02:36 crc kubenswrapper[4954]: I1127 17:02:36.027199 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a31c550-c189-42b0-9b57-1ffa0d69b180-scripts\") pod \"9a31c550-c189-42b0-9b57-1ffa0d69b180\" (UID: \"9a31c550-c189-42b0-9b57-1ffa0d69b180\") " Nov 27 17:02:36 crc kubenswrapper[4954]: I1127 17:02:36.027229 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9a31c550-c189-42b0-9b57-1ffa0d69b180-log-httpd\") pod \"9a31c550-c189-42b0-9b57-1ffa0d69b180\" (UID: \"9a31c550-c189-42b0-9b57-1ffa0d69b180\") " Nov 27 17:02:36 crc kubenswrapper[4954]: I1127 17:02:36.027257 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpmrt\" (UniqueName: \"kubernetes.io/projected/9a31c550-c189-42b0-9b57-1ffa0d69b180-kube-api-access-mpmrt\") pod \"9a31c550-c189-42b0-9b57-1ffa0d69b180\" (UID: \"9a31c550-c189-42b0-9b57-1ffa0d69b180\") " Nov 27 17:02:36 crc kubenswrapper[4954]: I1127 17:02:36.027283 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a31c550-c189-42b0-9b57-1ffa0d69b180-combined-ca-bundle\") pod \"9a31c550-c189-42b0-9b57-1ffa0d69b180\" (UID: \"9a31c550-c189-42b0-9b57-1ffa0d69b180\") " Nov 27 17:02:36 crc kubenswrapper[4954]: I1127 17:02:36.027339 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9a31c550-c189-42b0-9b57-1ffa0d69b180-run-httpd\") pod \"9a31c550-c189-42b0-9b57-1ffa0d69b180\" (UID: \"9a31c550-c189-42b0-9b57-1ffa0d69b180\") " Nov 27 17:02:36 crc kubenswrapper[4954]: I1127 17:02:36.027380 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a31c550-c189-42b0-9b57-1ffa0d69b180-ceilometer-tls-certs\") pod \"9a31c550-c189-42b0-9b57-1ffa0d69b180\" (UID: \"9a31c550-c189-42b0-9b57-1ffa0d69b180\") " Nov 27 17:02:36 crc kubenswrapper[4954]: I1127 17:02:36.027414 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9a31c550-c189-42b0-9b57-1ffa0d69b180-sg-core-conf-yaml\") pod \"9a31c550-c189-42b0-9b57-1ffa0d69b180\" (UID: \"9a31c550-c189-42b0-9b57-1ffa0d69b180\") " Nov 27 17:02:36 crc kubenswrapper[4954]: I1127 17:02:36.028957 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a31c550-c189-42b0-9b57-1ffa0d69b180-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9a31c550-c189-42b0-9b57-1ffa0d69b180" (UID: "9a31c550-c189-42b0-9b57-1ffa0d69b180"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:02:36 crc kubenswrapper[4954]: I1127 17:02:36.029238 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a31c550-c189-42b0-9b57-1ffa0d69b180-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9a31c550-c189-42b0-9b57-1ffa0d69b180" (UID: "9a31c550-c189-42b0-9b57-1ffa0d69b180"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:02:36 crc kubenswrapper[4954]: I1127 17:02:36.040947 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a31c550-c189-42b0-9b57-1ffa0d69b180-scripts" (OuterVolumeSpecName: "scripts") pod "9a31c550-c189-42b0-9b57-1ffa0d69b180" (UID: "9a31c550-c189-42b0-9b57-1ffa0d69b180"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:02:36 crc kubenswrapper[4954]: I1127 17:02:36.041171 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a31c550-c189-42b0-9b57-1ffa0d69b180-kube-api-access-mpmrt" (OuterVolumeSpecName: "kube-api-access-mpmrt") pod "9a31c550-c189-42b0-9b57-1ffa0d69b180" (UID: "9a31c550-c189-42b0-9b57-1ffa0d69b180"). InnerVolumeSpecName "kube-api-access-mpmrt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:02:36 crc kubenswrapper[4954]: I1127 17:02:36.070048 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a31c550-c189-42b0-9b57-1ffa0d69b180-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9a31c550-c189-42b0-9b57-1ffa0d69b180" (UID: "9a31c550-c189-42b0-9b57-1ffa0d69b180"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:02:36 crc kubenswrapper[4954]: I1127 17:02:36.100127 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a31c550-c189-42b0-9b57-1ffa0d69b180-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "9a31c550-c189-42b0-9b57-1ffa0d69b180" (UID: "9a31c550-c189-42b0-9b57-1ffa0d69b180"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:02:36 crc kubenswrapper[4954]: I1127 17:02:36.121734 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a31c550-c189-42b0-9b57-1ffa0d69b180-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9a31c550-c189-42b0-9b57-1ffa0d69b180" (UID: "9a31c550-c189-42b0-9b57-1ffa0d69b180"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:02:36 crc kubenswrapper[4954]: I1127 17:02:36.130019 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a31c550-c189-42b0-9b57-1ffa0d69b180-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:36 crc kubenswrapper[4954]: I1127 17:02:36.130055 4954 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9a31c550-c189-42b0-9b57-1ffa0d69b180-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:36 crc kubenswrapper[4954]: I1127 17:02:36.130069 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpmrt\" (UniqueName: \"kubernetes.io/projected/9a31c550-c189-42b0-9b57-1ffa0d69b180-kube-api-access-mpmrt\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:36 crc kubenswrapper[4954]: I1127 17:02:36.130081 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a31c550-c189-42b0-9b57-1ffa0d69b180-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:36 crc kubenswrapper[4954]: I1127 17:02:36.130092 4954 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9a31c550-c189-42b0-9b57-1ffa0d69b180-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:36 crc kubenswrapper[4954]: I1127 17:02:36.130102 4954 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a31c550-c189-42b0-9b57-1ffa0d69b180-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:36 crc kubenswrapper[4954]: I1127 17:02:36.130113 4954 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9a31c550-c189-42b0-9b57-1ffa0d69b180-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:36 crc kubenswrapper[4954]: I1127 17:02:36.137984 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a31c550-c189-42b0-9b57-1ffa0d69b180-config-data" (OuterVolumeSpecName: "config-data") pod "9a31c550-c189-42b0-9b57-1ffa0d69b180" (UID: "9a31c550-c189-42b0-9b57-1ffa0d69b180"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:02:36 crc kubenswrapper[4954]: I1127 17:02:36.235564 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a31c550-c189-42b0-9b57-1ffa0d69b180-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:36 crc kubenswrapper[4954]: I1127 17:02:36.891781 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 27 17:02:36 crc kubenswrapper[4954]: I1127 17:02:36.916512 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 27 17:02:36 crc kubenswrapper[4954]: I1127 17:02:36.931889 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 27 17:02:36 crc kubenswrapper[4954]: I1127 17:02:36.943925 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 27 17:02:36 crc kubenswrapper[4954]: E1127 17:02:36.944393 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a31c550-c189-42b0-9b57-1ffa0d69b180" containerName="ceilometer-central-agent" Nov 27 17:02:36 crc kubenswrapper[4954]: I1127 17:02:36.944417 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a31c550-c189-42b0-9b57-1ffa0d69b180" containerName="ceilometer-central-agent" Nov 27 17:02:36 crc kubenswrapper[4954]: E1127 17:02:36.944448 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a31c550-c189-42b0-9b57-1ffa0d69b180" containerName="ceilometer-notification-agent" Nov 27 17:02:36 crc kubenswrapper[4954]: I1127 17:02:36.944455 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a31c550-c189-42b0-9b57-1ffa0d69b180" containerName="ceilometer-notification-agent" Nov 27 17:02:36 crc kubenswrapper[4954]: E1127 17:02:36.944473 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a31c550-c189-42b0-9b57-1ffa0d69b180" containerName="sg-core" Nov 27 17:02:36 crc kubenswrapper[4954]: I1127 17:02:36.944479 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a31c550-c189-42b0-9b57-1ffa0d69b180" containerName="sg-core" Nov 27 17:02:36 crc kubenswrapper[4954]: E1127 17:02:36.944491 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a31c550-c189-42b0-9b57-1ffa0d69b180" containerName="proxy-httpd" Nov 27 17:02:36 crc kubenswrapper[4954]: I1127 17:02:36.944497 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a31c550-c189-42b0-9b57-1ffa0d69b180" containerName="proxy-httpd" Nov 27 17:02:36 crc kubenswrapper[4954]: I1127 17:02:36.944703 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a31c550-c189-42b0-9b57-1ffa0d69b180" containerName="ceilometer-notification-agent" Nov 27 17:02:36 crc kubenswrapper[4954]: I1127 17:02:36.944723 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a31c550-c189-42b0-9b57-1ffa0d69b180" containerName="sg-core" Nov 27 17:02:36 crc kubenswrapper[4954]: I1127 17:02:36.944741 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a31c550-c189-42b0-9b57-1ffa0d69b180" containerName="proxy-httpd" Nov 27 17:02:36 crc kubenswrapper[4954]: I1127 17:02:36.944752 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a31c550-c189-42b0-9b57-1ffa0d69b180" containerName="ceilometer-central-agent" Nov 27 17:02:36 crc kubenswrapper[4954]: I1127 17:02:36.948122 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 27 17:02:36 crc kubenswrapper[4954]: I1127 17:02:36.951121 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Nov 27 17:02:36 crc kubenswrapper[4954]: I1127 17:02:36.951242 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 27 17:02:36 crc kubenswrapper[4954]: I1127 17:02:36.951522 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 27 17:02:36 crc kubenswrapper[4954]: I1127 17:02:36.956996 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.049410 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61397bb6-588c-4c10-bd06-c7010f737605-scripts\") pod \"ceilometer-0\" (UID: \"61397bb6-588c-4c10-bd06-c7010f737605\") " pod="openstack/ceilometer-0" Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.049456 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61397bb6-588c-4c10-bd06-c7010f737605-log-httpd\") pod \"ceilometer-0\" (UID: \"61397bb6-588c-4c10-bd06-c7010f737605\") " pod="openstack/ceilometer-0" Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.049498 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61397bb6-588c-4c10-bd06-c7010f737605-run-httpd\") pod \"ceilometer-0\" (UID: \"61397bb6-588c-4c10-bd06-c7010f737605\") " pod="openstack/ceilometer-0" Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.049537 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/61397bb6-588c-4c10-bd06-c7010f737605-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"61397bb6-588c-4c10-bd06-c7010f737605\") " pod="openstack/ceilometer-0" Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.049571 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61397bb6-588c-4c10-bd06-c7010f737605-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"61397bb6-588c-4c10-bd06-c7010f737605\") " pod="openstack/ceilometer-0" Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.049643 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5h25\" (UniqueName: \"kubernetes.io/projected/61397bb6-588c-4c10-bd06-c7010f737605-kube-api-access-p5h25\") pod \"ceilometer-0\" (UID: \"61397bb6-588c-4c10-bd06-c7010f737605\") " pod="openstack/ceilometer-0" Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.049671 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61397bb6-588c-4c10-bd06-c7010f737605-config-data\") pod \"ceilometer-0\" (UID: \"61397bb6-588c-4c10-bd06-c7010f737605\") " pod="openstack/ceilometer-0" Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.049690 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/61397bb6-588c-4c10-bd06-c7010f737605-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"61397bb6-588c-4c10-bd06-c7010f737605\") " pod="openstack/ceilometer-0" Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.151753 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61397bb6-588c-4c10-bd06-c7010f737605-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"61397bb6-588c-4c10-bd06-c7010f737605\") " pod="openstack/ceilometer-0" Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.151822 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5h25\" (UniqueName: \"kubernetes.io/projected/61397bb6-588c-4c10-bd06-c7010f737605-kube-api-access-p5h25\") pod \"ceilometer-0\" (UID: \"61397bb6-588c-4c10-bd06-c7010f737605\") " pod="openstack/ceilometer-0" Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.151886 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61397bb6-588c-4c10-bd06-c7010f737605-config-data\") pod \"ceilometer-0\" (UID: \"61397bb6-588c-4c10-bd06-c7010f737605\") " pod="openstack/ceilometer-0" Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.151908 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/61397bb6-588c-4c10-bd06-c7010f737605-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"61397bb6-588c-4c10-bd06-c7010f737605\") " pod="openstack/ceilometer-0" Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.156898 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61397bb6-588c-4c10-bd06-c7010f737605-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"61397bb6-588c-4c10-bd06-c7010f737605\") " pod="openstack/ceilometer-0" Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.159531 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61397bb6-588c-4c10-bd06-c7010f737605-scripts\") pod \"ceilometer-0\" (UID: \"61397bb6-588c-4c10-bd06-c7010f737605\") " pod="openstack/ceilometer-0" Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.159595 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61397bb6-588c-4c10-bd06-c7010f737605-log-httpd\") pod \"ceilometer-0\" (UID: \"61397bb6-588c-4c10-bd06-c7010f737605\") " pod="openstack/ceilometer-0" Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.159654 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61397bb6-588c-4c10-bd06-c7010f737605-run-httpd\") pod \"ceilometer-0\" (UID: \"61397bb6-588c-4c10-bd06-c7010f737605\") " pod="openstack/ceilometer-0" Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.160006 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61397bb6-588c-4c10-bd06-c7010f737605-log-httpd\") pod \"ceilometer-0\" (UID: \"61397bb6-588c-4c10-bd06-c7010f737605\") " pod="openstack/ceilometer-0" Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.160147 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/61397bb6-588c-4c10-bd06-c7010f737605-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"61397bb6-588c-4c10-bd06-c7010f737605\") " pod="openstack/ceilometer-0" Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.160159 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61397bb6-588c-4c10-bd06-c7010f737605-run-httpd\") pod \"ceilometer-0\" (UID: \"61397bb6-588c-4c10-bd06-c7010f737605\") " pod="openstack/ceilometer-0" Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.164298 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/61397bb6-588c-4c10-bd06-c7010f737605-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"61397bb6-588c-4c10-bd06-c7010f737605\") " pod="openstack/ceilometer-0" Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.164319 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/61397bb6-588c-4c10-bd06-c7010f737605-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"61397bb6-588c-4c10-bd06-c7010f737605\") " pod="openstack/ceilometer-0" Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.165728 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61397bb6-588c-4c10-bd06-c7010f737605-config-data\") pod \"ceilometer-0\" (UID: \"61397bb6-588c-4c10-bd06-c7010f737605\") " pod="openstack/ceilometer-0" Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.166493 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61397bb6-588c-4c10-bd06-c7010f737605-scripts\") pod \"ceilometer-0\" (UID: \"61397bb6-588c-4c10-bd06-c7010f737605\") " pod="openstack/ceilometer-0" Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.180113 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5h25\" (UniqueName: \"kubernetes.io/projected/61397bb6-588c-4c10-bd06-c7010f737605-kube-api-access-p5h25\") pod \"ceilometer-0\" (UID: \"61397bb6-588c-4c10-bd06-c7010f737605\") " pod="openstack/ceilometer-0" Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.278935 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.400084 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.469400 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70396bbd-83d0-430e-bb81-44ea354afb3e-config-data\") pod \"70396bbd-83d0-430e-bb81-44ea354afb3e\" (UID: \"70396bbd-83d0-430e-bb81-44ea354afb3e\") " Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.469476 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70396bbd-83d0-430e-bb81-44ea354afb3e-combined-ca-bundle\") pod \"70396bbd-83d0-430e-bb81-44ea354afb3e\" (UID: \"70396bbd-83d0-430e-bb81-44ea354afb3e\") " Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.469576 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70396bbd-83d0-430e-bb81-44ea354afb3e-logs\") pod \"70396bbd-83d0-430e-bb81-44ea354afb3e\" (UID: \"70396bbd-83d0-430e-bb81-44ea354afb3e\") " Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.469684 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cd9j\" (UniqueName: \"kubernetes.io/projected/70396bbd-83d0-430e-bb81-44ea354afb3e-kube-api-access-4cd9j\") pod \"70396bbd-83d0-430e-bb81-44ea354afb3e\" (UID: \"70396bbd-83d0-430e-bb81-44ea354afb3e\") " Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.475707 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70396bbd-83d0-430e-bb81-44ea354afb3e-logs" (OuterVolumeSpecName: "logs") pod "70396bbd-83d0-430e-bb81-44ea354afb3e" (UID: "70396bbd-83d0-430e-bb81-44ea354afb3e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.476237 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70396bbd-83d0-430e-bb81-44ea354afb3e-kube-api-access-4cd9j" (OuterVolumeSpecName: "kube-api-access-4cd9j") pod "70396bbd-83d0-430e-bb81-44ea354afb3e" (UID: "70396bbd-83d0-430e-bb81-44ea354afb3e"). InnerVolumeSpecName "kube-api-access-4cd9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.508813 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70396bbd-83d0-430e-bb81-44ea354afb3e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "70396bbd-83d0-430e-bb81-44ea354afb3e" (UID: "70396bbd-83d0-430e-bb81-44ea354afb3e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.530169 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70396bbd-83d0-430e-bb81-44ea354afb3e-config-data" (OuterVolumeSpecName: "config-data") pod "70396bbd-83d0-430e-bb81-44ea354afb3e" (UID: "70396bbd-83d0-430e-bb81-44ea354afb3e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.573618 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70396bbd-83d0-430e-bb81-44ea354afb3e-logs\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.573667 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cd9j\" (UniqueName: \"kubernetes.io/projected/70396bbd-83d0-430e-bb81-44ea354afb3e-kube-api-access-4cd9j\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.573681 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70396bbd-83d0-430e-bb81-44ea354afb3e-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.573695 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70396bbd-83d0-430e-bb81-44ea354afb3e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.762249 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 27 17:02:37 crc kubenswrapper[4954]: W1127 17:02:37.763194 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61397bb6_588c_4c10_bd06_c7010f737605.slice/crio-a5a40948988dd91d26d6ccc161a5a8c809f3a02ec894892a2666e65701223c61 WatchSource:0}: Error finding container a5a40948988dd91d26d6ccc161a5a8c809f3a02ec894892a2666e65701223c61: Status 404 returned error can't find the container with id a5a40948988dd91d26d6ccc161a5a8c809f3a02ec894892a2666e65701223c61 Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.766689 4954 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.914370 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61397bb6-588c-4c10-bd06-c7010f737605","Type":"ContainerStarted","Data":"a5a40948988dd91d26d6ccc161a5a8c809f3a02ec894892a2666e65701223c61"} Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.916303 4954 generic.go:334] "Generic (PLEG): container finished" podID="70396bbd-83d0-430e-bb81-44ea354afb3e" containerID="967467b2d7bca82679180667438cb5863f2e996cf3d95ef7428668eb29a6654f" exitCode=0 Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.916371 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"70396bbd-83d0-430e-bb81-44ea354afb3e","Type":"ContainerDied","Data":"967467b2d7bca82679180667438cb5863f2e996cf3d95ef7428668eb29a6654f"} Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.916404 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"70396bbd-83d0-430e-bb81-44ea354afb3e","Type":"ContainerDied","Data":"06117361d3a4d6c40c3bc7fb5428c1f3eda038f8e09e4d73e574bae55bf08dfc"} Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.916427 4954 scope.go:117] "RemoveContainer" containerID="967467b2d7bca82679180667438cb5863f2e996cf3d95ef7428668eb29a6654f" Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.916821 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.950161 4954 scope.go:117] "RemoveContainer" containerID="a09a1c3e663a59c896981659b45a0ab06fe09b10212c541ddc7c1da034322405" Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.978751 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.986075 4954 scope.go:117] "RemoveContainer" containerID="967467b2d7bca82679180667438cb5863f2e996cf3d95ef7428668eb29a6654f" Nov 27 17:02:37 crc kubenswrapper[4954]: E1127 17:02:37.986530 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"967467b2d7bca82679180667438cb5863f2e996cf3d95ef7428668eb29a6654f\": container with ID starting with 967467b2d7bca82679180667438cb5863f2e996cf3d95ef7428668eb29a6654f not found: ID does not exist" containerID="967467b2d7bca82679180667438cb5863f2e996cf3d95ef7428668eb29a6654f" Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.986560 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"967467b2d7bca82679180667438cb5863f2e996cf3d95ef7428668eb29a6654f"} err="failed to get container status \"967467b2d7bca82679180667438cb5863f2e996cf3d95ef7428668eb29a6654f\": rpc error: code = NotFound desc = could not find container \"967467b2d7bca82679180667438cb5863f2e996cf3d95ef7428668eb29a6654f\": container with ID starting with 967467b2d7bca82679180667438cb5863f2e996cf3d95ef7428668eb29a6654f not found: ID does not exist" Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.986597 4954 scope.go:117] "RemoveContainer" containerID="a09a1c3e663a59c896981659b45a0ab06fe09b10212c541ddc7c1da034322405" Nov 27 17:02:37 crc kubenswrapper[4954]: E1127 17:02:37.986968 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a09a1c3e663a59c896981659b45a0ab06fe09b10212c541ddc7c1da034322405\": container with ID starting with a09a1c3e663a59c896981659b45a0ab06fe09b10212c541ddc7c1da034322405 not found: ID does not exist" containerID="a09a1c3e663a59c896981659b45a0ab06fe09b10212c541ddc7c1da034322405" Nov 27 17:02:37 crc kubenswrapper[4954]: I1127 17:02:37.986990 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a09a1c3e663a59c896981659b45a0ab06fe09b10212c541ddc7c1da034322405"} err="failed to get container status \"a09a1c3e663a59c896981659b45a0ab06fe09b10212c541ddc7c1da034322405\": rpc error: code = NotFound desc = could not find container \"a09a1c3e663a59c896981659b45a0ab06fe09b10212c541ddc7c1da034322405\": container with ID starting with a09a1c3e663a59c896981659b45a0ab06fe09b10212c541ddc7c1da034322405 not found: ID does not exist" Nov 27 17:02:38 crc kubenswrapper[4954]: I1127 17:02:38.004056 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 27 17:02:38 crc kubenswrapper[4954]: I1127 17:02:38.032167 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 27 17:02:38 crc kubenswrapper[4954]: E1127 17:02:38.032598 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70396bbd-83d0-430e-bb81-44ea354afb3e" containerName="nova-api-api" Nov 27 17:02:38 crc kubenswrapper[4954]: I1127 17:02:38.032615 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="70396bbd-83d0-430e-bb81-44ea354afb3e" containerName="nova-api-api" Nov 27 17:02:38 crc 
kubenswrapper[4954]: E1127 17:02:38.032635 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70396bbd-83d0-430e-bb81-44ea354afb3e" containerName="nova-api-log" Nov 27 17:02:38 crc kubenswrapper[4954]: I1127 17:02:38.032642 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="70396bbd-83d0-430e-bb81-44ea354afb3e" containerName="nova-api-log" Nov 27 17:02:38 crc kubenswrapper[4954]: I1127 17:02:38.032847 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="70396bbd-83d0-430e-bb81-44ea354afb3e" containerName="nova-api-api" Nov 27 17:02:38 crc kubenswrapper[4954]: I1127 17:02:38.032867 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="70396bbd-83d0-430e-bb81-44ea354afb3e" containerName="nova-api-log" Nov 27 17:02:38 crc kubenswrapper[4954]: I1127 17:02:38.033915 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 27 17:02:38 crc kubenswrapper[4954]: I1127 17:02:38.035609 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 27 17:02:38 crc kubenswrapper[4954]: I1127 17:02:38.036009 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Nov 27 17:02:38 crc kubenswrapper[4954]: I1127 17:02:38.036195 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Nov 27 17:02:38 crc kubenswrapper[4954]: I1127 17:02:38.046095 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 27 17:02:38 crc kubenswrapper[4954]: I1127 17:02:38.086442 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7djw\" (UniqueName: \"kubernetes.io/projected/cf9644a4-85c3-4116-a097-ab29123d3841-kube-api-access-f7djw\") pod \"nova-api-0\" (UID: \"cf9644a4-85c3-4116-a097-ab29123d3841\") " pod="openstack/nova-api-0" Nov 27 17:02:38 crc kubenswrapper[4954]: I1127 17:02:38.086544 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf9644a4-85c3-4116-a097-ab29123d3841-logs\") pod \"nova-api-0\" (UID: \"cf9644a4-85c3-4116-a097-ab29123d3841\") " pod="openstack/nova-api-0" Nov 27 17:02:38 crc kubenswrapper[4954]: I1127 17:02:38.086566 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf9644a4-85c3-4116-a097-ab29123d3841-internal-tls-certs\") pod \"nova-api-0\" (UID: \"cf9644a4-85c3-4116-a097-ab29123d3841\") " pod="openstack/nova-api-0" Nov 27 17:02:38 crc kubenswrapper[4954]: I1127 17:02:38.086699 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf9644a4-85c3-4116-a097-ab29123d3841-config-data\") pod \"nova-api-0\" (UID: \"cf9644a4-85c3-4116-a097-ab29123d3841\") " pod="openstack/nova-api-0" Nov 27 17:02:38 crc kubenswrapper[4954]: I1127 17:02:38.086853 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf9644a4-85c3-4116-a097-ab29123d3841-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cf9644a4-85c3-4116-a097-ab29123d3841\") " pod="openstack/nova-api-0" Nov 27 17:02:38 crc kubenswrapper[4954]: I1127 17:02:38.087019 4954 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf9644a4-85c3-4116-a097-ab29123d3841-public-tls-certs\") pod \"nova-api-0\" (UID: \"cf9644a4-85c3-4116-a097-ab29123d3841\") " pod="openstack/nova-api-0" Nov 27 17:02:38 crc kubenswrapper[4954]: I1127 17:02:38.188237 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf9644a4-85c3-4116-a097-ab29123d3841-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cf9644a4-85c3-4116-a097-ab29123d3841\") " pod="openstack/nova-api-0" Nov 27 17:02:38 crc kubenswrapper[4954]: I1127 17:02:38.188316 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf9644a4-85c3-4116-a097-ab29123d3841-public-tls-certs\") pod \"nova-api-0\" (UID: \"cf9644a4-85c3-4116-a097-ab29123d3841\") " pod="openstack/nova-api-0" Nov 27 17:02:38 crc kubenswrapper[4954]: I1127 17:02:38.188358 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7djw\" (UniqueName: \"kubernetes.io/projected/cf9644a4-85c3-4116-a097-ab29123d3841-kube-api-access-f7djw\") pod \"nova-api-0\" (UID: \"cf9644a4-85c3-4116-a097-ab29123d3841\") " pod="openstack/nova-api-0" Nov 27 17:02:38 crc kubenswrapper[4954]: I1127 17:02:38.188431 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf9644a4-85c3-4116-a097-ab29123d3841-logs\") pod \"nova-api-0\" (UID: \"cf9644a4-85c3-4116-a097-ab29123d3841\") " pod="openstack/nova-api-0" Nov 27 17:02:38 crc kubenswrapper[4954]: I1127 17:02:38.188447 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf9644a4-85c3-4116-a097-ab29123d3841-internal-tls-certs\") pod \"nova-api-0\" (UID: \"cf9644a4-85c3-4116-a097-ab29123d3841\") " pod="openstack/nova-api-0" Nov 27 17:02:38 crc kubenswrapper[4954]: I1127 17:02:38.188464 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf9644a4-85c3-4116-a097-ab29123d3841-config-data\") pod \"nova-api-0\" (UID: \"cf9644a4-85c3-4116-a097-ab29123d3841\") " pod="openstack/nova-api-0" Nov 27 17:02:38 crc kubenswrapper[4954]: I1127 17:02:38.189877 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf9644a4-85c3-4116-a097-ab29123d3841-logs\") pod \"nova-api-0\" (UID: \"cf9644a4-85c3-4116-a097-ab29123d3841\") " pod="openstack/nova-api-0" Nov 27 17:02:38 crc kubenswrapper[4954]: I1127 17:02:38.195638 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf9644a4-85c3-4116-a097-ab29123d3841-config-data\") pod \"nova-api-0\" (UID: \"cf9644a4-85c3-4116-a097-ab29123d3841\") " pod="openstack/nova-api-0" Nov 27 17:02:38 crc kubenswrapper[4954]: I1127 17:02:38.196206 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf9644a4-85c3-4116-a097-ab29123d3841-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cf9644a4-85c3-4116-a097-ab29123d3841\") " pod="openstack/nova-api-0" Nov 27 17:02:38 crc kubenswrapper[4954]: I1127 17:02:38.196649 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf9644a4-85c3-4116-a097-ab29123d3841-public-tls-certs\") pod \"nova-api-0\" (UID: \"cf9644a4-85c3-4116-a097-ab29123d3841\") " pod="openstack/nova-api-0" Nov 27 17:02:38 crc kubenswrapper[4954]: I1127 17:02:38.198063 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf9644a4-85c3-4116-a097-ab29123d3841-internal-tls-certs\") pod \"nova-api-0\" (UID: \"cf9644a4-85c3-4116-a097-ab29123d3841\") " pod="openstack/nova-api-0" Nov 27 17:02:38 crc kubenswrapper[4954]: I1127 17:02:38.232183 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7djw\" (UniqueName: \"kubernetes.io/projected/cf9644a4-85c3-4116-a097-ab29123d3841-kube-api-access-f7djw\") pod \"nova-api-0\" (UID: \"cf9644a4-85c3-4116-a097-ab29123d3841\") " pod="openstack/nova-api-0" Nov 27 17:02:38 crc kubenswrapper[4954]: I1127 17:02:38.356697 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 27 17:02:38 crc kubenswrapper[4954]: I1127 17:02:38.439032 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Nov 27 17:02:38 crc kubenswrapper[4954]: I1127 17:02:38.460795 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Nov 27 17:02:38 crc kubenswrapper[4954]: I1127 17:02:38.679776 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70396bbd-83d0-430e-bb81-44ea354afb3e" path="/var/lib/kubelet/pods/70396bbd-83d0-430e-bb81-44ea354afb3e/volumes" Nov 27 17:02:38 crc kubenswrapper[4954]: I1127 17:02:38.680981 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a31c550-c189-42b0-9b57-1ffa0d69b180" path="/var/lib/kubelet/pods/9a31c550-c189-42b0-9b57-1ffa0d69b180/volumes" Nov 27 17:02:38 crc kubenswrapper[4954]: I1127 17:02:38.858264 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 27 17:02:38 crc kubenswrapper[4954]: W1127 17:02:38.861035 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf9644a4_85c3_4116_a097_ab29123d3841.slice/crio-e44a1b7b7723b3f8c9e3390273e542af71524cfbab2f9e33fcabe9c08aead564 WatchSource:0}: Error finding container e44a1b7b7723b3f8c9e3390273e542af71524cfbab2f9e33fcabe9c08aead564: Status 404 returned error can't find the container with id e44a1b7b7723b3f8c9e3390273e542af71524cfbab2f9e33fcabe9c08aead564 Nov 27 17:02:38 crc kubenswrapper[4954]: I1127 17:02:38.927157 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cf9644a4-85c3-4116-a097-ab29123d3841","Type":"ContainerStarted","Data":"e44a1b7b7723b3f8c9e3390273e542af71524cfbab2f9e33fcabe9c08aead564"} Nov 27 17:02:38 crc kubenswrapper[4954]: I1127 17:02:38.931812 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61397bb6-588c-4c10-bd06-c7010f737605","Type":"ContainerStarted","Data":"5679e8efc33e69fc81eb5bb1e36d64ca92179dfaf075040949f6ed345ca75e6e"} Nov 27 17:02:38 crc kubenswrapper[4954]: I1127 17:02:38.947931 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Nov 27 17:02:39 crc kubenswrapper[4954]: I1127 17:02:39.173834 4954 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell1-cell-mapping-5bh4g"] Nov 27 17:02:39 crc kubenswrapper[4954]: I1127 17:02:39.175249 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5bh4g" Nov 27 17:02:39 crc kubenswrapper[4954]: I1127 17:02:39.177794 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Nov 27 17:02:39 crc kubenswrapper[4954]: I1127 17:02:39.177795 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Nov 27 17:02:39 crc kubenswrapper[4954]: I1127 17:02:39.215822 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-5bh4g"] Nov 27 17:02:39 crc kubenswrapper[4954]: I1127 17:02:39.317703 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/396b3047-b624-43f0-9dc1-6c8ba6ffaf7b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-5bh4g\" (UID: \"396b3047-b624-43f0-9dc1-6c8ba6ffaf7b\") " pod="openstack/nova-cell1-cell-mapping-5bh4g" Nov 27 17:02:39 crc kubenswrapper[4954]: I1127 17:02:39.318046 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/396b3047-b624-43f0-9dc1-6c8ba6ffaf7b-config-data\") pod \"nova-cell1-cell-mapping-5bh4g\" (UID: \"396b3047-b624-43f0-9dc1-6c8ba6ffaf7b\") " pod="openstack/nova-cell1-cell-mapping-5bh4g" Nov 27 17:02:39 crc kubenswrapper[4954]: I1127 17:02:39.318119 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgh5h\" (UniqueName: \"kubernetes.io/projected/396b3047-b624-43f0-9dc1-6c8ba6ffaf7b-kube-api-access-jgh5h\") pod \"nova-cell1-cell-mapping-5bh4g\" (UID: \"396b3047-b624-43f0-9dc1-6c8ba6ffaf7b\") " pod="openstack/nova-cell1-cell-mapping-5bh4g" Nov 27 17:02:39 crc kubenswrapper[4954]: I1127 17:02:39.318174 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/396b3047-b624-43f0-9dc1-6c8ba6ffaf7b-scripts\") pod \"nova-cell1-cell-mapping-5bh4g\" (UID: \"396b3047-b624-43f0-9dc1-6c8ba6ffaf7b\") " pod="openstack/nova-cell1-cell-mapping-5bh4g" Nov 27 17:02:39 crc kubenswrapper[4954]: I1127 17:02:39.420184 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgh5h\" (UniqueName: \"kubernetes.io/projected/396b3047-b624-43f0-9dc1-6c8ba6ffaf7b-kube-api-access-jgh5h\") pod \"nova-cell1-cell-mapping-5bh4g\" (UID: \"396b3047-b624-43f0-9dc1-6c8ba6ffaf7b\") " pod="openstack/nova-cell1-cell-mapping-5bh4g" Nov 27 17:02:39 crc kubenswrapper[4954]: I1127 17:02:39.420260 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/396b3047-b624-43f0-9dc1-6c8ba6ffaf7b-scripts\") pod \"nova-cell1-cell-mapping-5bh4g\" (UID: \"396b3047-b624-43f0-9dc1-6c8ba6ffaf7b\") " pod="openstack/nova-cell1-cell-mapping-5bh4g" Nov 27 17:02:39 crc kubenswrapper[4954]: I1127 17:02:39.420368 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/396b3047-b624-43f0-9dc1-6c8ba6ffaf7b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-5bh4g\" (UID: \"396b3047-b624-43f0-9dc1-6c8ba6ffaf7b\") " pod="openstack/nova-cell1-cell-mapping-5bh4g" Nov 
27 17:02:39 crc kubenswrapper[4954]: I1127 17:02:39.420392 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/396b3047-b624-43f0-9dc1-6c8ba6ffaf7b-config-data\") pod \"nova-cell1-cell-mapping-5bh4g\" (UID: \"396b3047-b624-43f0-9dc1-6c8ba6ffaf7b\") " pod="openstack/nova-cell1-cell-mapping-5bh4g" Nov 27 17:02:39 crc kubenswrapper[4954]: I1127 17:02:39.424814 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/396b3047-b624-43f0-9dc1-6c8ba6ffaf7b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-5bh4g\" (UID: \"396b3047-b624-43f0-9dc1-6c8ba6ffaf7b\") " pod="openstack/nova-cell1-cell-mapping-5bh4g" Nov 27 17:02:39 crc kubenswrapper[4954]: I1127 17:02:39.425601 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/396b3047-b624-43f0-9dc1-6c8ba6ffaf7b-config-data\") pod \"nova-cell1-cell-mapping-5bh4g\" (UID: \"396b3047-b624-43f0-9dc1-6c8ba6ffaf7b\") " pod="openstack/nova-cell1-cell-mapping-5bh4g" Nov 27 17:02:39 crc kubenswrapper[4954]: I1127 17:02:39.428234 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/396b3047-b624-43f0-9dc1-6c8ba6ffaf7b-scripts\") pod \"nova-cell1-cell-mapping-5bh4g\" (UID: \"396b3047-b624-43f0-9dc1-6c8ba6ffaf7b\") " pod="openstack/nova-cell1-cell-mapping-5bh4g" Nov 27 17:02:39 crc kubenswrapper[4954]: I1127 17:02:39.442245 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgh5h\" (UniqueName: \"kubernetes.io/projected/396b3047-b624-43f0-9dc1-6c8ba6ffaf7b-kube-api-access-jgh5h\") pod \"nova-cell1-cell-mapping-5bh4g\" (UID: \"396b3047-b624-43f0-9dc1-6c8ba6ffaf7b\") " pod="openstack/nova-cell1-cell-mapping-5bh4g" Nov 27 17:02:39 crc kubenswrapper[4954]: I1127 17:02:39.562965 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5bh4g" Nov 27 17:02:39 crc kubenswrapper[4954]: I1127 17:02:39.940376 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cf9644a4-85c3-4116-a097-ab29123d3841","Type":"ContainerStarted","Data":"e2cc49aefc1f6b85bb2c9814979ce6395e5178047dc3ac7e712a606ef40dff7c"} Nov 27 17:02:39 crc kubenswrapper[4954]: I1127 17:02:39.940794 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cf9644a4-85c3-4116-a097-ab29123d3841","Type":"ContainerStarted","Data":"e8d4d5f4f757a1cdf66415f1ab0f30408656c5ab58e68a72e40ca253bf246779"} Nov 27 17:02:39 crc kubenswrapper[4954]: I1127 17:02:39.944701 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61397bb6-588c-4c10-bd06-c7010f737605","Type":"ContainerStarted","Data":"88f42088ebbc6a9c0ab0de0460969a44a801ccb46c94c89983a39a2a59e47e27"} Nov 27 17:02:39 crc kubenswrapper[4954]: I1127 17:02:39.963132 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.963112992 podStartE2EDuration="2.963112992s" podCreationTimestamp="2025-11-27 17:02:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 17:02:39.95723469 +0000 UTC m=+1471.974674990" watchObservedRunningTime="2025-11-27 17:02:39.963112992 +0000 UTC m=+1471.980553292" Nov 27 17:02:40 crc kubenswrapper[4954]: I1127 17:02:40.026419 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-5bh4g"] Nov 27 17:02:40 crc kubenswrapper[4954]: I1127 17:02:40.955301 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61397bb6-588c-4c10-bd06-c7010f737605","Type":"ContainerStarted","Data":"17b2dbe0665044849d5c4c676f306c33a583229d7d49dd52578f4a2a916cc613"} Nov 27 17:02:40 crc kubenswrapper[4954]: I1127 17:02:40.959256 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5bh4g" event={"ID":"396b3047-b624-43f0-9dc1-6c8ba6ffaf7b","Type":"ContainerStarted","Data":"f32f179ed7c02cdbc2c6dbb8c6b36316cf799405042c286785ca98bf2a4c25ba"} Nov 27 17:02:40 crc kubenswrapper[4954]: I1127 17:02:40.959467 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5bh4g" event={"ID":"396b3047-b624-43f0-9dc1-6c8ba6ffaf7b","Type":"ContainerStarted","Data":"ced18df953101ce3354ceb63c7b2db61dcccbb03b8dc03c7037796233b965e91"} Nov 27 17:02:40 crc kubenswrapper[4954]: I1127 17:02:40.978530 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-5bh4g" podStartSLOduration=1.978515118 podStartE2EDuration="1.978515118s" podCreationTimestamp="2025-11-27 17:02:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 17:02:40.972026031 +0000 UTC m=+1472.989466351" watchObservedRunningTime="2025-11-27 17:02:40.978515118 +0000 UTC m=+1472.995955418" Nov 27 17:02:41 crc kubenswrapper[4954]: I1127 17:02:41.391143 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c7b6c5df9-gzs8b" Nov 27 17:02:41 crc kubenswrapper[4954]: I1127 17:02:41.449916 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-jh528"] Nov 27 17:02:41 crc 
kubenswrapper[4954]: I1127 17:02:41.450127 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-865f5d856f-jh528" podUID="33b5ea73-fb76-4d0e-875c-c3c124364550" containerName="dnsmasq-dns" containerID="cri-o://cb8c9f9e96ef42abc149b49188e811c2f100693550e38053908439bc537130ae" gracePeriod=10 Nov 27 17:02:42 crc kubenswrapper[4954]: I1127 17:02:42.014922 4954 generic.go:334] "Generic (PLEG): container finished" podID="33b5ea73-fb76-4d0e-875c-c3c124364550" containerID="cb8c9f9e96ef42abc149b49188e811c2f100693550e38053908439bc537130ae" exitCode=0 Nov 27 17:02:42 crc kubenswrapper[4954]: I1127 17:02:42.015644 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-jh528" event={"ID":"33b5ea73-fb76-4d0e-875c-c3c124364550","Type":"ContainerDied","Data":"cb8c9f9e96ef42abc149b49188e811c2f100693550e38053908439bc537130ae"} Nov 27 17:02:42 crc kubenswrapper[4954]: I1127 17:02:42.176657 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-jh528" Nov 27 17:02:42 crc kubenswrapper[4954]: I1127 17:02:42.192586 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33b5ea73-fb76-4d0e-875c-c3c124364550-ovsdbserver-sb\") pod \"33b5ea73-fb76-4d0e-875c-c3c124364550\" (UID: \"33b5ea73-fb76-4d0e-875c-c3c124364550\") " Nov 27 17:02:42 crc kubenswrapper[4954]: I1127 17:02:42.192662 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33b5ea73-fb76-4d0e-875c-c3c124364550-ovsdbserver-nb\") pod \"33b5ea73-fb76-4d0e-875c-c3c124364550\" (UID: \"33b5ea73-fb76-4d0e-875c-c3c124364550\") " Nov 27 17:02:42 crc kubenswrapper[4954]: I1127 17:02:42.192698 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bbm9\" (UniqueName: \"kubernetes.io/projected/33b5ea73-fb76-4d0e-875c-c3c124364550-kube-api-access-4bbm9\") pod \"33b5ea73-fb76-4d0e-875c-c3c124364550\" (UID: \"33b5ea73-fb76-4d0e-875c-c3c124364550\") " Nov 27 17:02:42 crc kubenswrapper[4954]: I1127 17:02:42.192804 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33b5ea73-fb76-4d0e-875c-c3c124364550-dns-svc\") pod \"33b5ea73-fb76-4d0e-875c-c3c124364550\" (UID: \"33b5ea73-fb76-4d0e-875c-c3c124364550\") " Nov 27 17:02:42 crc kubenswrapper[4954]: I1127 17:02:42.192877 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33b5ea73-fb76-4d0e-875c-c3c124364550-config\") pod \"33b5ea73-fb76-4d0e-875c-c3c124364550\" (UID: \"33b5ea73-fb76-4d0e-875c-c3c124364550\") " Nov 27 17:02:42 crc kubenswrapper[4954]: I1127 17:02:42.192925 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33b5ea73-fb76-4d0e-875c-c3c124364550-dns-swift-storage-0\") pod \"33b5ea73-fb76-4d0e-875c-c3c124364550\" (UID: \"33b5ea73-fb76-4d0e-875c-c3c124364550\") " Nov 27 17:02:42 crc kubenswrapper[4954]: I1127 17:02:42.241759 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33b5ea73-fb76-4d0e-875c-c3c124364550-kube-api-access-4bbm9" (OuterVolumeSpecName: "kube-api-access-4bbm9") pod "33b5ea73-fb76-4d0e-875c-c3c124364550" (UID: 
"33b5ea73-fb76-4d0e-875c-c3c124364550"). InnerVolumeSpecName "kube-api-access-4bbm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:02:42 crc kubenswrapper[4954]: I1127 17:02:42.296268 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bbm9\" (UniqueName: \"kubernetes.io/projected/33b5ea73-fb76-4d0e-875c-c3c124364550-kube-api-access-4bbm9\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:42 crc kubenswrapper[4954]: I1127 17:02:42.319220 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33b5ea73-fb76-4d0e-875c-c3c124364550-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "33b5ea73-fb76-4d0e-875c-c3c124364550" (UID: "33b5ea73-fb76-4d0e-875c-c3c124364550"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:02:42 crc kubenswrapper[4954]: I1127 17:02:42.322144 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33b5ea73-fb76-4d0e-875c-c3c124364550-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "33b5ea73-fb76-4d0e-875c-c3c124364550" (UID: "33b5ea73-fb76-4d0e-875c-c3c124364550"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:02:42 crc kubenswrapper[4954]: I1127 17:02:42.327250 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33b5ea73-fb76-4d0e-875c-c3c124364550-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "33b5ea73-fb76-4d0e-875c-c3c124364550" (UID: "33b5ea73-fb76-4d0e-875c-c3c124364550"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:02:42 crc kubenswrapper[4954]: I1127 17:02:42.329609 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33b5ea73-fb76-4d0e-875c-c3c124364550-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "33b5ea73-fb76-4d0e-875c-c3c124364550" (UID: "33b5ea73-fb76-4d0e-875c-c3c124364550"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:02:42 crc kubenswrapper[4954]: I1127 17:02:42.341664 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33b5ea73-fb76-4d0e-875c-c3c124364550-config" (OuterVolumeSpecName: "config") pod "33b5ea73-fb76-4d0e-875c-c3c124364550" (UID: "33b5ea73-fb76-4d0e-875c-c3c124364550"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:02:42 crc kubenswrapper[4954]: I1127 17:02:42.397855 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33b5ea73-fb76-4d0e-875c-c3c124364550-config\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:42 crc kubenswrapper[4954]: I1127 17:02:42.398082 4954 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33b5ea73-fb76-4d0e-875c-c3c124364550-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:42 crc kubenswrapper[4954]: I1127 17:02:42.398193 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33b5ea73-fb76-4d0e-875c-c3c124364550-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:42 crc kubenswrapper[4954]: I1127 17:02:42.398275 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33b5ea73-fb76-4d0e-875c-c3c124364550-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:42 crc kubenswrapper[4954]: I1127 17:02:42.398338 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33b5ea73-fb76-4d0e-875c-c3c124364550-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:43 crc kubenswrapper[4954]: I1127 17:02:43.028879 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61397bb6-588c-4c10-bd06-c7010f737605","Type":"ContainerStarted","Data":"eeb2deffbb2db8e70bbf68a2490f2c5e5bfdbc0332bd242c5ffa4ed354bcaea6"} Nov 27 17:02:43 crc kubenswrapper[4954]: I1127 17:02:43.030705 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 27 17:02:43 crc kubenswrapper[4954]: I1127 17:02:43.033001 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-jh528" event={"ID":"33b5ea73-fb76-4d0e-875c-c3c124364550","Type":"ContainerDied","Data":"d8c9f59ea413c59523457d916a2c6cc2f27042266f1cdab6707637254f8560a9"} Nov 27 17:02:43 crc kubenswrapper[4954]: I1127 17:02:43.033049 4954 scope.go:117] "RemoveContainer" containerID="cb8c9f9e96ef42abc149b49188e811c2f100693550e38053908439bc537130ae" Nov 27 17:02:43 crc kubenswrapper[4954]: I1127 17:02:43.033184 4954 util.go:48] "No ready sandbox for pod can be found. 
Nov 27 17:02:43 crc kubenswrapper[4954]: I1127 17:02:43.033184 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-jh528"
Nov 27 17:02:43 crc kubenswrapper[4954]: I1127 17:02:43.056744 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.927382242 podStartE2EDuration="7.056725609s" podCreationTimestamp="2025-11-27 17:02:36 +0000 UTC" firstStartedPulling="2025-11-27 17:02:37.766454436 +0000 UTC m=+1469.783894736" lastFinishedPulling="2025-11-27 17:02:41.895797803 +0000 UTC m=+1473.913238103" observedRunningTime="2025-11-27 17:02:43.055498211 +0000 UTC m=+1475.072938531" watchObservedRunningTime="2025-11-27 17:02:43.056725609 +0000 UTC m=+1475.074165909"
Nov 27 17:02:43 crc kubenswrapper[4954]: I1127 17:02:43.061954 4954 scope.go:117] "RemoveContainer" containerID="d76c790328511d79c936c35ae4ffaa123f08ec0d48d36904b7bdac6f557660f1"
Nov 27 17:02:43 crc kubenswrapper[4954]: I1127 17:02:43.091194 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-jh528"]
Nov 27 17:02:43 crc kubenswrapper[4954]: I1127 17:02:43.100867 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-jh528"]
Nov 27 17:02:44 crc kubenswrapper[4954]: I1127 17:02:44.685133 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33b5ea73-fb76-4d0e-875c-c3c124364550" path="/var/lib/kubelet/pods/33b5ea73-fb76-4d0e-875c-c3c124364550/volumes"
Nov 27 17:02:46 crc kubenswrapper[4954]: I1127 17:02:46.066859 4954 generic.go:334] "Generic (PLEG): container finished" podID="396b3047-b624-43f0-9dc1-6c8ba6ffaf7b" containerID="f32f179ed7c02cdbc2c6dbb8c6b36316cf799405042c286785ca98bf2a4c25ba" exitCode=0
Nov 27 17:02:46 crc kubenswrapper[4954]: I1127 17:02:46.066915 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5bh4g" event={"ID":"396b3047-b624-43f0-9dc1-6c8ba6ffaf7b","Type":"ContainerDied","Data":"f32f179ed7c02cdbc2c6dbb8c6b36316cf799405042c286785ca98bf2a4c25ba"}
Nov 27 17:02:46 crc kubenswrapper[4954]: I1127 17:02:46.855486 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-865f5d856f-jh528" podUID="33b5ea73-fb76-4d0e-875c-c3c124364550" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.187:5353: i/o timeout"
Nov 27 17:02:47 crc kubenswrapper[4954]: I1127 17:02:47.413916 4954 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5bh4g" Nov 27 17:02:47 crc kubenswrapper[4954]: I1127 17:02:47.494939 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/396b3047-b624-43f0-9dc1-6c8ba6ffaf7b-config-data\") pod \"396b3047-b624-43f0-9dc1-6c8ba6ffaf7b\" (UID: \"396b3047-b624-43f0-9dc1-6c8ba6ffaf7b\") " Nov 27 17:02:47 crc kubenswrapper[4954]: I1127 17:02:47.494994 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/396b3047-b624-43f0-9dc1-6c8ba6ffaf7b-scripts\") pod \"396b3047-b624-43f0-9dc1-6c8ba6ffaf7b\" (UID: \"396b3047-b624-43f0-9dc1-6c8ba6ffaf7b\") " Nov 27 17:02:47 crc kubenswrapper[4954]: I1127 17:02:47.495029 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgh5h\" (UniqueName: \"kubernetes.io/projected/396b3047-b624-43f0-9dc1-6c8ba6ffaf7b-kube-api-access-jgh5h\") pod \"396b3047-b624-43f0-9dc1-6c8ba6ffaf7b\" (UID: \"396b3047-b624-43f0-9dc1-6c8ba6ffaf7b\") " Nov 27 17:02:47 crc kubenswrapper[4954]: I1127 17:02:47.495067 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/396b3047-b624-43f0-9dc1-6c8ba6ffaf7b-combined-ca-bundle\") pod \"396b3047-b624-43f0-9dc1-6c8ba6ffaf7b\" (UID: \"396b3047-b624-43f0-9dc1-6c8ba6ffaf7b\") " Nov 27 17:02:47 crc kubenswrapper[4954]: I1127 17:02:47.500437 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/396b3047-b624-43f0-9dc1-6c8ba6ffaf7b-kube-api-access-jgh5h" (OuterVolumeSpecName: "kube-api-access-jgh5h") pod "396b3047-b624-43f0-9dc1-6c8ba6ffaf7b" (UID: "396b3047-b624-43f0-9dc1-6c8ba6ffaf7b"). InnerVolumeSpecName "kube-api-access-jgh5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:02:47 crc kubenswrapper[4954]: I1127 17:02:47.506268 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/396b3047-b624-43f0-9dc1-6c8ba6ffaf7b-scripts" (OuterVolumeSpecName: "scripts") pod "396b3047-b624-43f0-9dc1-6c8ba6ffaf7b" (UID: "396b3047-b624-43f0-9dc1-6c8ba6ffaf7b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:02:47 crc kubenswrapper[4954]: I1127 17:02:47.524861 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/396b3047-b624-43f0-9dc1-6c8ba6ffaf7b-config-data" (OuterVolumeSpecName: "config-data") pod "396b3047-b624-43f0-9dc1-6c8ba6ffaf7b" (UID: "396b3047-b624-43f0-9dc1-6c8ba6ffaf7b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:02:47 crc kubenswrapper[4954]: I1127 17:02:47.528702 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/396b3047-b624-43f0-9dc1-6c8ba6ffaf7b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "396b3047-b624-43f0-9dc1-6c8ba6ffaf7b" (UID: "396b3047-b624-43f0-9dc1-6c8ba6ffaf7b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:02:47 crc kubenswrapper[4954]: I1127 17:02:47.599396 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/396b3047-b624-43f0-9dc1-6c8ba6ffaf7b-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:47 crc kubenswrapper[4954]: I1127 17:02:47.599439 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/396b3047-b624-43f0-9dc1-6c8ba6ffaf7b-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:47 crc kubenswrapper[4954]: I1127 17:02:47.599452 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgh5h\" (UniqueName: \"kubernetes.io/projected/396b3047-b624-43f0-9dc1-6c8ba6ffaf7b-kube-api-access-jgh5h\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:47 crc kubenswrapper[4954]: I1127 17:02:47.599465 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/396b3047-b624-43f0-9dc1-6c8ba6ffaf7b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:48 crc kubenswrapper[4954]: I1127 17:02:48.086908 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5bh4g" event={"ID":"396b3047-b624-43f0-9dc1-6c8ba6ffaf7b","Type":"ContainerDied","Data":"ced18df953101ce3354ceb63c7b2db61dcccbb03b8dc03c7037796233b965e91"} Nov 27 17:02:48 crc kubenswrapper[4954]: I1127 17:02:48.087210 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ced18df953101ce3354ceb63c7b2db61dcccbb03b8dc03c7037796233b965e91" Nov 27 17:02:48 crc kubenswrapper[4954]: I1127 17:02:48.086992 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5bh4g" Nov 27 17:02:48 crc kubenswrapper[4954]: I1127 17:02:48.302857 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 27 17:02:48 crc kubenswrapper[4954]: I1127 17:02:48.303345 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="cf9644a4-85c3-4116-a097-ab29123d3841" containerName="nova-api-log" containerID="cri-o://e8d4d5f4f757a1cdf66415f1ab0f30408656c5ab58e68a72e40ca253bf246779" gracePeriod=30 Nov 27 17:02:48 crc kubenswrapper[4954]: I1127 17:02:48.303455 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="cf9644a4-85c3-4116-a097-ab29123d3841" containerName="nova-api-api" containerID="cri-o://e2cc49aefc1f6b85bb2c9814979ce6395e5178047dc3ac7e712a606ef40dff7c" gracePeriod=30 Nov 27 17:02:48 crc kubenswrapper[4954]: I1127 17:02:48.314662 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 27 17:02:48 crc kubenswrapper[4954]: I1127 17:02:48.314938 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="413cbe5a-ca44-4d13-ac32-68ff849a4e41" containerName="nova-scheduler-scheduler" containerID="cri-o://df582c2fcef321e036169d52ad6ef70bd82d1f11b8920e2a548cf2f782a49b3c" gracePeriod=30 Nov 27 17:02:48 crc kubenswrapper[4954]: I1127 17:02:48.337191 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 17:02:48 crc kubenswrapper[4954]: I1127 17:02:48.337412 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="10a515f1-708a-4b0a-83ed-d28323eabe4a" 
containerName="nova-metadata-log" containerID="cri-o://73fd6abdee5c7cffa71a8fb7fd0ea78442c71ffdb8d8db8ce01475a93b31424c" gracePeriod=30 Nov 27 17:02:48 crc kubenswrapper[4954]: I1127 17:02:48.337691 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="10a515f1-708a-4b0a-83ed-d28323eabe4a" containerName="nova-metadata-metadata" containerID="cri-o://415c47aa6f4e49b61e0b32010bf7b9755a2091da540359b129b80c86d87cd84f" gracePeriod=30 Nov 27 17:02:48 crc kubenswrapper[4954]: I1127 17:02:48.847473 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.022086 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf9644a4-85c3-4116-a097-ab29123d3841-config-data\") pod \"cf9644a4-85c3-4116-a097-ab29123d3841\" (UID: \"cf9644a4-85c3-4116-a097-ab29123d3841\") " Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.022247 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf9644a4-85c3-4116-a097-ab29123d3841-public-tls-certs\") pod \"cf9644a4-85c3-4116-a097-ab29123d3841\" (UID: \"cf9644a4-85c3-4116-a097-ab29123d3841\") " Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.022272 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf9644a4-85c3-4116-a097-ab29123d3841-combined-ca-bundle\") pod \"cf9644a4-85c3-4116-a097-ab29123d3841\" (UID: \"cf9644a4-85c3-4116-a097-ab29123d3841\") " Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.022315 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf9644a4-85c3-4116-a097-ab29123d3841-internal-tls-certs\") pod \"cf9644a4-85c3-4116-a097-ab29123d3841\" (UID: \"cf9644a4-85c3-4116-a097-ab29123d3841\") " Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.022405 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf9644a4-85c3-4116-a097-ab29123d3841-logs\") pod \"cf9644a4-85c3-4116-a097-ab29123d3841\" (UID: \"cf9644a4-85c3-4116-a097-ab29123d3841\") " Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.022445 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7djw\" (UniqueName: \"kubernetes.io/projected/cf9644a4-85c3-4116-a097-ab29123d3841-kube-api-access-f7djw\") pod \"cf9644a4-85c3-4116-a097-ab29123d3841\" (UID: \"cf9644a4-85c3-4116-a097-ab29123d3841\") " Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.022810 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf9644a4-85c3-4116-a097-ab29123d3841-logs" (OuterVolumeSpecName: "logs") pod "cf9644a4-85c3-4116-a097-ab29123d3841" (UID: "cf9644a4-85c3-4116-a097-ab29123d3841"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.027551 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf9644a4-85c3-4116-a097-ab29123d3841-kube-api-access-f7djw" (OuterVolumeSpecName: "kube-api-access-f7djw") pod "cf9644a4-85c3-4116-a097-ab29123d3841" (UID: "cf9644a4-85c3-4116-a097-ab29123d3841"). InnerVolumeSpecName "kube-api-access-f7djw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.047565 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf9644a4-85c3-4116-a097-ab29123d3841-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf9644a4-85c3-4116-a097-ab29123d3841" (UID: "cf9644a4-85c3-4116-a097-ab29123d3841"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.052980 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf9644a4-85c3-4116-a097-ab29123d3841-config-data" (OuterVolumeSpecName: "config-data") pod "cf9644a4-85c3-4116-a097-ab29123d3841" (UID: "cf9644a4-85c3-4116-a097-ab29123d3841"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.069478 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf9644a4-85c3-4116-a097-ab29123d3841-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "cf9644a4-85c3-4116-a097-ab29123d3841" (UID: "cf9644a4-85c3-4116-a097-ab29123d3841"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.070689 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf9644a4-85c3-4116-a097-ab29123d3841-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "cf9644a4-85c3-4116-a097-ab29123d3841" (UID: "cf9644a4-85c3-4116-a097-ab29123d3841"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.101883 4954 generic.go:334] "Generic (PLEG): container finished" podID="cf9644a4-85c3-4116-a097-ab29123d3841" containerID="e2cc49aefc1f6b85bb2c9814979ce6395e5178047dc3ac7e712a606ef40dff7c" exitCode=0 Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.102098 4954 generic.go:334] "Generic (PLEG): container finished" podID="cf9644a4-85c3-4116-a097-ab29123d3841" containerID="e8d4d5f4f757a1cdf66415f1ab0f30408656c5ab58e68a72e40ca253bf246779" exitCode=143 Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.101951 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0"
Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.101932 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cf9644a4-85c3-4116-a097-ab29123d3841","Type":"ContainerDied","Data":"e2cc49aefc1f6b85bb2c9814979ce6395e5178047dc3ac7e712a606ef40dff7c"}
Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.102551 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cf9644a4-85c3-4116-a097-ab29123d3841","Type":"ContainerDied","Data":"e8d4d5f4f757a1cdf66415f1ab0f30408656c5ab58e68a72e40ca253bf246779"}
Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.102564 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cf9644a4-85c3-4116-a097-ab29123d3841","Type":"ContainerDied","Data":"e44a1b7b7723b3f8c9e3390273e542af71524cfbab2f9e33fcabe9c08aead564"}
Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.102590 4954 scope.go:117] "RemoveContainer" containerID="e2cc49aefc1f6b85bb2c9814979ce6395e5178047dc3ac7e712a606ef40dff7c"
Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.106727 4954 generic.go:334] "Generic (PLEG): container finished" podID="10a515f1-708a-4b0a-83ed-d28323eabe4a" containerID="73fd6abdee5c7cffa71a8fb7fd0ea78442c71ffdb8d8db8ce01475a93b31424c" exitCode=143
Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.106770 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"10a515f1-708a-4b0a-83ed-d28323eabe4a","Type":"ContainerDied","Data":"73fd6abdee5c7cffa71a8fb7fd0ea78442c71ffdb8d8db8ce01475a93b31424c"}
Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.124831 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf9644a4-85c3-4116-a097-ab29123d3841-config-data\") on node \"crc\" DevicePath \"\""
Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.124872 4954 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf9644a4-85c3-4116-a097-ab29123d3841-public-tls-certs\") on node \"crc\" DevicePath \"\""
Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.124885 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf9644a4-85c3-4116-a097-ab29123d3841-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.124896 4954 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf9644a4-85c3-4116-a097-ab29123d3841-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.124911 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf9644a4-85c3-4116-a097-ab29123d3841-logs\") on node \"crc\" DevicePath \"\""
Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.124919 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7djw\" (UniqueName: \"kubernetes.io/projected/cf9644a4-85c3-4116-a097-ab29123d3841-kube-api-access-f7djw\") on node \"crc\" DevicePath \"\""
Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.127411 4954 scope.go:117] "RemoveContainer" containerID="e8d4d5f4f757a1cdf66415f1ab0f30408656c5ab58e68a72e40ca253bf246779"
Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.136836 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.154904 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.169668 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Nov 27 17:02:49 crc kubenswrapper[4954]: E1127 17:02:49.170135 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf9644a4-85c3-4116-a097-ab29123d3841" containerName="nova-api-api"
Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.170160 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf9644a4-85c3-4116-a097-ab29123d3841" containerName="nova-api-api"
Nov 27 17:02:49 crc kubenswrapper[4954]: E1127 17:02:49.170176 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33b5ea73-fb76-4d0e-875c-c3c124364550" containerName="init"
Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.170184 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="33b5ea73-fb76-4d0e-875c-c3c124364550" containerName="init"
Nov 27 17:02:49 crc kubenswrapper[4954]: E1127 17:02:49.170204 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="396b3047-b624-43f0-9dc1-6c8ba6ffaf7b" containerName="nova-manage"
Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.170215 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="396b3047-b624-43f0-9dc1-6c8ba6ffaf7b" containerName="nova-manage"
Nov 27 17:02:49 crc kubenswrapper[4954]: E1127 17:02:49.170249 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf9644a4-85c3-4116-a097-ab29123d3841" containerName="nova-api-log"
Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.170258 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf9644a4-85c3-4116-a097-ab29123d3841" containerName="nova-api-log"
Nov 27 17:02:49 crc kubenswrapper[4954]: E1127 17:02:49.170273 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33b5ea73-fb76-4d0e-875c-c3c124364550" containerName="dnsmasq-dns"
Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.170281 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="33b5ea73-fb76-4d0e-875c-c3c124364550" containerName="dnsmasq-dns"
Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.170515 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="33b5ea73-fb76-4d0e-875c-c3c124364550" containerName="dnsmasq-dns"
Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.170528 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf9644a4-85c3-4116-a097-ab29123d3841" containerName="nova-api-api"
Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.170551 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf9644a4-85c3-4116-a097-ab29123d3841" containerName="nova-api-log"
Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.170562 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="396b3047-b624-43f0-9dc1-6c8ba6ffaf7b" containerName="nova-manage"
Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.171103 4954 scope.go:117] "RemoveContainer" containerID="e2cc49aefc1f6b85bb2c9814979ce6395e5178047dc3ac7e712a606ef40dff7c"
Nov 27 17:02:49 crc kubenswrapper[4954]: E1127 17:02:49.171557 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2cc49aefc1f6b85bb2c9814979ce6395e5178047dc3ac7e712a606ef40dff7c\": container with ID starting with e2cc49aefc1f6b85bb2c9814979ce6395e5178047dc3ac7e712a606ef40dff7c not found: ID does not exist" containerID="e2cc49aefc1f6b85bb2c9814979ce6395e5178047dc3ac7e712a606ef40dff7c"
Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.171620 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2cc49aefc1f6b85bb2c9814979ce6395e5178047dc3ac7e712a606ef40dff7c"} err="failed to get container status \"e2cc49aefc1f6b85bb2c9814979ce6395e5178047dc3ac7e712a606ef40dff7c\": rpc error: code = NotFound desc = could not find container \"e2cc49aefc1f6b85bb2c9814979ce6395e5178047dc3ac7e712a606ef40dff7c\": container with ID starting with e2cc49aefc1f6b85bb2c9814979ce6395e5178047dc3ac7e712a606ef40dff7c not found: ID does not exist"
Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.171650 4954 scope.go:117] "RemoveContainer" containerID="e8d4d5f4f757a1cdf66415f1ab0f30408656c5ab58e68a72e40ca253bf246779"
Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.171755 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 27 17:02:49 crc kubenswrapper[4954]: E1127 17:02:49.171998 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8d4d5f4f757a1cdf66415f1ab0f30408656c5ab58e68a72e40ca253bf246779\": container with ID starting with e8d4d5f4f757a1cdf66415f1ab0f30408656c5ab58e68a72e40ca253bf246779 not found: ID does not exist" containerID="e8d4d5f4f757a1cdf66415f1ab0f30408656c5ab58e68a72e40ca253bf246779"
Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.172053 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8d4d5f4f757a1cdf66415f1ab0f30408656c5ab58e68a72e40ca253bf246779"} err="failed to get container status \"e8d4d5f4f757a1cdf66415f1ab0f30408656c5ab58e68a72e40ca253bf246779\": rpc error: code = NotFound desc = could not find container \"e8d4d5f4f757a1cdf66415f1ab0f30408656c5ab58e68a72e40ca253bf246779\": container with ID starting with e8d4d5f4f757a1cdf66415f1ab0f30408656c5ab58e68a72e40ca253bf246779 not found: ID does not exist"
Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.172088 4954 scope.go:117] "RemoveContainer" containerID="e2cc49aefc1f6b85bb2c9814979ce6395e5178047dc3ac7e712a606ef40dff7c"
Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.172307 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2cc49aefc1f6b85bb2c9814979ce6395e5178047dc3ac7e712a606ef40dff7c"} err="failed to get container status \"e2cc49aefc1f6b85bb2c9814979ce6395e5178047dc3ac7e712a606ef40dff7c\": rpc error: code = NotFound desc = could not find container \"e2cc49aefc1f6b85bb2c9814979ce6395e5178047dc3ac7e712a606ef40dff7c\": container with ID starting with e2cc49aefc1f6b85bb2c9814979ce6395e5178047dc3ac7e712a606ef40dff7c not found: ID does not exist"
Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.172325 4954 scope.go:117] "RemoveContainer" containerID="e8d4d5f4f757a1cdf66415f1ab0f30408656c5ab58e68a72e40ca253bf246779"
Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.173616 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8d4d5f4f757a1cdf66415f1ab0f30408656c5ab58e68a72e40ca253bf246779"} err="failed to get container status \"e8d4d5f4f757a1cdf66415f1ab0f30408656c5ab58e68a72e40ca253bf246779\": rpc error: code = NotFound desc = could not find container \"e8d4d5f4f757a1cdf66415f1ab0f30408656c5ab58e68a72e40ca253bf246779\": container with ID starting with e8d4d5f4f757a1cdf66415f1ab0f30408656c5ab58e68a72e40ca253bf246779 not found: ID does not exist"
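The RemoveContainer / NotFound exchange above repeats because the kubelet retries deletion for containers that CRI-O has, by all appearances, already garbage-collected once the pod object was recreated; the errors are noisy but benign here. A minimal sketch for pulling these failures out of a saved copy of this journal (the `kubelet.log` filename and the regex are our assumptions, not anything kubelet ships):

```python
import re
from collections import Counter

# Matches the pod_container_deletor.go "DeleteContainer returned error"
# lines and captures the 64-hex CRI-O container ID that was not found.
NOT_FOUND = re.compile(r'"DeleteContainer returned error".*?"ID":"([0-9a-f]{64})"')

counts = Counter()
with open("kubelet.log") as f:  # assumed path to the saved journal dump
    for line in f:
        m = NOT_FOUND.search(line)
        if m:
            counts[m.group(1)] += 1

for cid, n in counts.most_common():
    print(f"{n}x delete-after-gone: {cid[:12]}...")
```

Run against the entries above, this reports two misses each for e2cc49ae... and e8d4d5f4..., i.e. the two application containers of the old nova-api-0 pod.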
\"e8d4d5f4f757a1cdf66415f1ab0f30408656c5ab58e68a72e40ca253bf246779\": container with ID starting with e8d4d5f4f757a1cdf66415f1ab0f30408656c5ab58e68a72e40ca253bf246779 not found: ID does not exist" Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.173836 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.173848 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.179744 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.182845 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.327918 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6662243a-d2bd-4571-8e27-6b923a367942-config-data\") pod \"nova-api-0\" (UID: \"6662243a-d2bd-4571-8e27-6b923a367942\") " pod="openstack/nova-api-0" Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.328080 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6662243a-d2bd-4571-8e27-6b923a367942-logs\") pod \"nova-api-0\" (UID: \"6662243a-d2bd-4571-8e27-6b923a367942\") " pod="openstack/nova-api-0" Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.328101 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6662243a-d2bd-4571-8e27-6b923a367942-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6662243a-d2bd-4571-8e27-6b923a367942\") " pod="openstack/nova-api-0" Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.328122 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6662243a-d2bd-4571-8e27-6b923a367942-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6662243a-d2bd-4571-8e27-6b923a367942\") " pod="openstack/nova-api-0" Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.328258 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf6f6\" (UniqueName: \"kubernetes.io/projected/6662243a-d2bd-4571-8e27-6b923a367942-kube-api-access-sf6f6\") pod \"nova-api-0\" (UID: \"6662243a-d2bd-4571-8e27-6b923a367942\") " pod="openstack/nova-api-0" Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.328302 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6662243a-d2bd-4571-8e27-6b923a367942-public-tls-certs\") pod \"nova-api-0\" (UID: \"6662243a-d2bd-4571-8e27-6b923a367942\") " pod="openstack/nova-api-0" Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.429915 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6662243a-d2bd-4571-8e27-6b923a367942-config-data\") pod \"nova-api-0\" (UID: \"6662243a-d2bd-4571-8e27-6b923a367942\") " pod="openstack/nova-api-0" Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.430283 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6662243a-d2bd-4571-8e27-6b923a367942-logs\") pod \"nova-api-0\" (UID: \"6662243a-d2bd-4571-8e27-6b923a367942\") " pod="openstack/nova-api-0" Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.430307 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6662243a-d2bd-4571-8e27-6b923a367942-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6662243a-d2bd-4571-8e27-6b923a367942\") " pod="openstack/nova-api-0" Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.430326 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6662243a-d2bd-4571-8e27-6b923a367942-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6662243a-d2bd-4571-8e27-6b923a367942\") " pod="openstack/nova-api-0" Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.430370 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf6f6\" (UniqueName: \"kubernetes.io/projected/6662243a-d2bd-4571-8e27-6b923a367942-kube-api-access-sf6f6\") pod \"nova-api-0\" (UID: \"6662243a-d2bd-4571-8e27-6b923a367942\") " pod="openstack/nova-api-0" Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.430392 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6662243a-d2bd-4571-8e27-6b923a367942-public-tls-certs\") pod \"nova-api-0\" (UID: \"6662243a-d2bd-4571-8e27-6b923a367942\") " pod="openstack/nova-api-0" Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.430743 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6662243a-d2bd-4571-8e27-6b923a367942-logs\") pod \"nova-api-0\" (UID: \"6662243a-d2bd-4571-8e27-6b923a367942\") " pod="openstack/nova-api-0" Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.433793 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6662243a-d2bd-4571-8e27-6b923a367942-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6662243a-d2bd-4571-8e27-6b923a367942\") " pod="openstack/nova-api-0" Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.433877 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6662243a-d2bd-4571-8e27-6b923a367942-public-tls-certs\") pod \"nova-api-0\" (UID: \"6662243a-d2bd-4571-8e27-6b923a367942\") " pod="openstack/nova-api-0" Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.434333 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6662243a-d2bd-4571-8e27-6b923a367942-config-data\") pod \"nova-api-0\" (UID: \"6662243a-d2bd-4571-8e27-6b923a367942\") " pod="openstack/nova-api-0" Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.435016 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6662243a-d2bd-4571-8e27-6b923a367942-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6662243a-d2bd-4571-8e27-6b923a367942\") " pod="openstack/nova-api-0" Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.449898 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf6f6\" (UniqueName: 
\"kubernetes.io/projected/6662243a-d2bd-4571-8e27-6b923a367942-kube-api-access-sf6f6\") pod \"nova-api-0\" (UID: \"6662243a-d2bd-4571-8e27-6b923a367942\") " pod="openstack/nova-api-0" Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.503590 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 27 17:02:49 crc kubenswrapper[4954]: W1127 17:02:49.987740 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6662243a_d2bd_4571_8e27_6b923a367942.slice/crio-54c8f10fc5227dbc3b2e44635ff327cdbaf617d4a3e4232084773ad29197848b WatchSource:0}: Error finding container 54c8f10fc5227dbc3b2e44635ff327cdbaf617d4a3e4232084773ad29197848b: Status 404 returned error can't find the container with id 54c8f10fc5227dbc3b2e44635ff327cdbaf617d4a3e4232084773ad29197848b Nov 27 17:02:49 crc kubenswrapper[4954]: I1127 17:02:49.999751 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 27 17:02:50 crc kubenswrapper[4954]: I1127 17:02:50.134947 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6662243a-d2bd-4571-8e27-6b923a367942","Type":"ContainerStarted","Data":"54c8f10fc5227dbc3b2e44635ff327cdbaf617d4a3e4232084773ad29197848b"} Nov 27 17:02:50 crc kubenswrapper[4954]: I1127 17:02:50.137567 4954 generic.go:334] "Generic (PLEG): container finished" podID="413cbe5a-ca44-4d13-ac32-68ff849a4e41" containerID="df582c2fcef321e036169d52ad6ef70bd82d1f11b8920e2a548cf2f782a49b3c" exitCode=0 Nov 27 17:02:50 crc kubenswrapper[4954]: I1127 17:02:50.137624 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"413cbe5a-ca44-4d13-ac32-68ff849a4e41","Type":"ContainerDied","Data":"df582c2fcef321e036169d52ad6ef70bd82d1f11b8920e2a548cf2f782a49b3c"} Nov 27 17:02:50 crc kubenswrapper[4954]: I1127 17:02:50.292823 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 27 17:02:50 crc kubenswrapper[4954]: I1127 17:02:50.450281 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/413cbe5a-ca44-4d13-ac32-68ff849a4e41-config-data\") pod \"413cbe5a-ca44-4d13-ac32-68ff849a4e41\" (UID: \"413cbe5a-ca44-4d13-ac32-68ff849a4e41\") " Nov 27 17:02:50 crc kubenswrapper[4954]: I1127 17:02:50.450346 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/413cbe5a-ca44-4d13-ac32-68ff849a4e41-combined-ca-bundle\") pod \"413cbe5a-ca44-4d13-ac32-68ff849a4e41\" (UID: \"413cbe5a-ca44-4d13-ac32-68ff849a4e41\") " Nov 27 17:02:50 crc kubenswrapper[4954]: I1127 17:02:50.450403 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxthm\" (UniqueName: \"kubernetes.io/projected/413cbe5a-ca44-4d13-ac32-68ff849a4e41-kube-api-access-vxthm\") pod \"413cbe5a-ca44-4d13-ac32-68ff849a4e41\" (UID: \"413cbe5a-ca44-4d13-ac32-68ff849a4e41\") " Nov 27 17:02:50 crc kubenswrapper[4954]: I1127 17:02:50.457942 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/413cbe5a-ca44-4d13-ac32-68ff849a4e41-kube-api-access-vxthm" (OuterVolumeSpecName: "kube-api-access-vxthm") pod "413cbe5a-ca44-4d13-ac32-68ff849a4e41" (UID: "413cbe5a-ca44-4d13-ac32-68ff849a4e41"). InnerVolumeSpecName "kube-api-access-vxthm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:02:50 crc kubenswrapper[4954]: I1127 17:02:50.484196 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/413cbe5a-ca44-4d13-ac32-68ff849a4e41-config-data" (OuterVolumeSpecName: "config-data") pod "413cbe5a-ca44-4d13-ac32-68ff849a4e41" (UID: "413cbe5a-ca44-4d13-ac32-68ff849a4e41"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:02:50 crc kubenswrapper[4954]: I1127 17:02:50.496747 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/413cbe5a-ca44-4d13-ac32-68ff849a4e41-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "413cbe5a-ca44-4d13-ac32-68ff849a4e41" (UID: "413cbe5a-ca44-4d13-ac32-68ff849a4e41"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:02:50 crc kubenswrapper[4954]: I1127 17:02:50.553295 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/413cbe5a-ca44-4d13-ac32-68ff849a4e41-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:50 crc kubenswrapper[4954]: I1127 17:02:50.553336 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/413cbe5a-ca44-4d13-ac32-68ff849a4e41-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:50 crc kubenswrapper[4954]: I1127 17:02:50.553348 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxthm\" (UniqueName: \"kubernetes.io/projected/413cbe5a-ca44-4d13-ac32-68ff849a4e41-kube-api-access-vxthm\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:50 crc kubenswrapper[4954]: I1127 17:02:50.672157 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf9644a4-85c3-4116-a097-ab29123d3841" path="/var/lib/kubelet/pods/cf9644a4-85c3-4116-a097-ab29123d3841/volumes" Nov 27 17:02:51 crc kubenswrapper[4954]: I1127 17:02:51.148758 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6662243a-d2bd-4571-8e27-6b923a367942","Type":"ContainerStarted","Data":"e72318133d5199b44e637a73cc466761cf3d68a901f80d327d09f28aa8a33f6f"} Nov 27 17:02:51 crc kubenswrapper[4954]: I1127 17:02:51.149113 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6662243a-d2bd-4571-8e27-6b923a367942","Type":"ContainerStarted","Data":"bd31239caaebd92957f9ff73eebc1896502f3a7aba965bbf7d4ef04bcca9c65b"} Nov 27 17:02:51 crc kubenswrapper[4954]: I1127 17:02:51.150207 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"413cbe5a-ca44-4d13-ac32-68ff849a4e41","Type":"ContainerDied","Data":"af6c7b087f85971987b0cd7c2b2ef69d1669d38c7ea619f42f6db2000e371bb0"} Nov 27 17:02:51 crc kubenswrapper[4954]: I1127 17:02:51.150239 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 27 17:02:51 crc kubenswrapper[4954]: I1127 17:02:51.150254 4954 scope.go:117] "RemoveContainer" containerID="df582c2fcef321e036169d52ad6ef70bd82d1f11b8920e2a548cf2f782a49b3c" Nov 27 17:02:51 crc kubenswrapper[4954]: I1127 17:02:51.172206 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.172190739 podStartE2EDuration="2.172190739s" podCreationTimestamp="2025-11-27 17:02:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 17:02:51.171325747 +0000 UTC m=+1483.188766057" watchObservedRunningTime="2025-11-27 17:02:51.172190739 +0000 UTC m=+1483.189631039" Nov 27 17:02:51 crc kubenswrapper[4954]: I1127 17:02:51.194587 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 27 17:02:51 crc kubenswrapper[4954]: I1127 17:02:51.214602 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 27 17:02:51 crc kubenswrapper[4954]: I1127 17:02:51.224664 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 27 17:02:51 crc kubenswrapper[4954]: E1127 17:02:51.225204 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="413cbe5a-ca44-4d13-ac32-68ff849a4e41" containerName="nova-scheduler-scheduler" Nov 27 17:02:51 crc kubenswrapper[4954]: I1127 17:02:51.225225 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="413cbe5a-ca44-4d13-ac32-68ff849a4e41" containerName="nova-scheduler-scheduler" Nov 27 17:02:51 crc kubenswrapper[4954]: I1127 17:02:51.225449 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="413cbe5a-ca44-4d13-ac32-68ff849a4e41" containerName="nova-scheduler-scheduler" Nov 27 17:02:51 crc kubenswrapper[4954]: I1127 17:02:51.226338 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 27 17:02:51 crc kubenswrapper[4954]: I1127 17:02:51.230106 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 27 17:02:51 crc kubenswrapper[4954]: I1127 17:02:51.233019 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 27 17:02:51 crc kubenswrapper[4954]: I1127 17:02:51.366903 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzcwj\" (UniqueName: \"kubernetes.io/projected/99c33ed6-9c2c-4eb0-be67-68c19d5479a7-kube-api-access-zzcwj\") pod \"nova-scheduler-0\" (UID: \"99c33ed6-9c2c-4eb0-be67-68c19d5479a7\") " pod="openstack/nova-scheduler-0" Nov 27 17:02:51 crc kubenswrapper[4954]: I1127 17:02:51.366964 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99c33ed6-9c2c-4eb0-be67-68c19d5479a7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"99c33ed6-9c2c-4eb0-be67-68c19d5479a7\") " pod="openstack/nova-scheduler-0" Nov 27 17:02:51 crc kubenswrapper[4954]: I1127 17:02:51.367000 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99c33ed6-9c2c-4eb0-be67-68c19d5479a7-config-data\") pod \"nova-scheduler-0\" (UID: \"99c33ed6-9c2c-4eb0-be67-68c19d5479a7\") " pod="openstack/nova-scheduler-0" Nov 27 17:02:51 crc kubenswrapper[4954]: I1127 17:02:51.463090 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="10a515f1-708a-4b0a-83ed-d28323eabe4a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": read tcp 10.217.0.2:57326->10.217.0.192:8775: read: connection reset by peer" Nov 27 17:02:51 crc kubenswrapper[4954]: I1127 17:02:51.463105 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="10a515f1-708a-4b0a-83ed-d28323eabe4a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": read tcp 10.217.0.2:57312->10.217.0.192:8775: read: connection reset by peer" Nov 27 17:02:51 crc kubenswrapper[4954]: I1127 17:02:51.469079 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzcwj\" (UniqueName: \"kubernetes.io/projected/99c33ed6-9c2c-4eb0-be67-68c19d5479a7-kube-api-access-zzcwj\") pod \"nova-scheduler-0\" (UID: \"99c33ed6-9c2c-4eb0-be67-68c19d5479a7\") " pod="openstack/nova-scheduler-0" Nov 27 17:02:51 crc kubenswrapper[4954]: I1127 17:02:51.469130 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99c33ed6-9c2c-4eb0-be67-68c19d5479a7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"99c33ed6-9c2c-4eb0-be67-68c19d5479a7\") " pod="openstack/nova-scheduler-0" Nov 27 17:02:51 crc kubenswrapper[4954]: I1127 17:02:51.469176 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99c33ed6-9c2c-4eb0-be67-68c19d5479a7-config-data\") pod \"nova-scheduler-0\" (UID: \"99c33ed6-9c2c-4eb0-be67-68c19d5479a7\") " pod="openstack/nova-scheduler-0" Nov 27 17:02:51 crc kubenswrapper[4954]: I1127 17:02:51.474240 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99c33ed6-9c2c-4eb0-be67-68c19d5479a7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"99c33ed6-9c2c-4eb0-be67-68c19d5479a7\") " pod="openstack/nova-scheduler-0" Nov 27 17:02:51 crc kubenswrapper[4954]: I1127 17:02:51.482685 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99c33ed6-9c2c-4eb0-be67-68c19d5479a7-config-data\") pod \"nova-scheduler-0\" (UID: \"99c33ed6-9c2c-4eb0-be67-68c19d5479a7\") " pod="openstack/nova-scheduler-0" Nov 27 17:02:51 crc kubenswrapper[4954]: I1127 17:02:51.485254 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzcwj\" (UniqueName: \"kubernetes.io/projected/99c33ed6-9c2c-4eb0-be67-68c19d5479a7-kube-api-access-zzcwj\") pod \"nova-scheduler-0\" (UID: \"99c33ed6-9c2c-4eb0-be67-68c19d5479a7\") " pod="openstack/nova-scheduler-0" Nov 27 17:02:51 crc kubenswrapper[4954]: I1127 17:02:51.556894 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 27 17:02:51 crc kubenswrapper[4954]: I1127 17:02:51.874333 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 27 17:02:51 crc kubenswrapper[4954]: I1127 17:02:51.983616 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/10a515f1-708a-4b0a-83ed-d28323eabe4a-nova-metadata-tls-certs\") pod \"10a515f1-708a-4b0a-83ed-d28323eabe4a\" (UID: \"10a515f1-708a-4b0a-83ed-d28323eabe4a\") " Nov 27 17:02:51 crc kubenswrapper[4954]: I1127 17:02:51.983738 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7n8m\" (UniqueName: \"kubernetes.io/projected/10a515f1-708a-4b0a-83ed-d28323eabe4a-kube-api-access-k7n8m\") pod \"10a515f1-708a-4b0a-83ed-d28323eabe4a\" (UID: \"10a515f1-708a-4b0a-83ed-d28323eabe4a\") " Nov 27 17:02:51 crc kubenswrapper[4954]: I1127 17:02:51.983830 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10a515f1-708a-4b0a-83ed-d28323eabe4a-logs\") pod \"10a515f1-708a-4b0a-83ed-d28323eabe4a\" (UID: \"10a515f1-708a-4b0a-83ed-d28323eabe4a\") " Nov 27 17:02:51 crc kubenswrapper[4954]: I1127 17:02:51.983966 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10a515f1-708a-4b0a-83ed-d28323eabe4a-config-data\") pod \"10a515f1-708a-4b0a-83ed-d28323eabe4a\" (UID: \"10a515f1-708a-4b0a-83ed-d28323eabe4a\") " Nov 27 17:02:51 crc kubenswrapper[4954]: I1127 17:02:51.984073 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10a515f1-708a-4b0a-83ed-d28323eabe4a-combined-ca-bundle\") pod \"10a515f1-708a-4b0a-83ed-d28323eabe4a\" (UID: \"10a515f1-708a-4b0a-83ed-d28323eabe4a\") " Nov 27 17:02:51 crc kubenswrapper[4954]: I1127 17:02:51.984758 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10a515f1-708a-4b0a-83ed-d28323eabe4a-logs" (OuterVolumeSpecName: "logs") pod "10a515f1-708a-4b0a-83ed-d28323eabe4a" (UID: "10a515f1-708a-4b0a-83ed-d28323eabe4a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:02:51 crc kubenswrapper[4954]: I1127 17:02:51.985058 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10a515f1-708a-4b0a-83ed-d28323eabe4a-logs\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:51 crc kubenswrapper[4954]: I1127 17:02:51.989493 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10a515f1-708a-4b0a-83ed-d28323eabe4a-kube-api-access-k7n8m" (OuterVolumeSpecName: "kube-api-access-k7n8m") pod "10a515f1-708a-4b0a-83ed-d28323eabe4a" (UID: "10a515f1-708a-4b0a-83ed-d28323eabe4a"). InnerVolumeSpecName "kube-api-access-k7n8m". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.020303 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10a515f1-708a-4b0a-83ed-d28323eabe4a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "10a515f1-708a-4b0a-83ed-d28323eabe4a" (UID: "10a515f1-708a-4b0a-83ed-d28323eabe4a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.025205 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10a515f1-708a-4b0a-83ed-d28323eabe4a-config-data" (OuterVolumeSpecName: "config-data") pod "10a515f1-708a-4b0a-83ed-d28323eabe4a" (UID: "10a515f1-708a-4b0a-83ed-d28323eabe4a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.051007 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10a515f1-708a-4b0a-83ed-d28323eabe4a-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "10a515f1-708a-4b0a-83ed-d28323eabe4a" (UID: "10a515f1-708a-4b0a-83ed-d28323eabe4a"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.087289 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10a515f1-708a-4b0a-83ed-d28323eabe4a-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.087321 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10a515f1-708a-4b0a-83ed-d28323eabe4a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.087338 4954 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/10a515f1-708a-4b0a-83ed-d28323eabe4a-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.087350 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7n8m\" (UniqueName: \"kubernetes.io/projected/10a515f1-708a-4b0a-83ed-d28323eabe4a-kube-api-access-k7n8m\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.101081 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.163372 4954 generic.go:334] "Generic (PLEG): container finished" podID="10a515f1-708a-4b0a-83ed-d28323eabe4a" containerID="415c47aa6f4e49b61e0b32010bf7b9755a2091da540359b129b80c86d87cd84f" exitCode=0 Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.163460 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.164102 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"10a515f1-708a-4b0a-83ed-d28323eabe4a","Type":"ContainerDied","Data":"415c47aa6f4e49b61e0b32010bf7b9755a2091da540359b129b80c86d87cd84f"} Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.164945 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"10a515f1-708a-4b0a-83ed-d28323eabe4a","Type":"ContainerDied","Data":"7c0a7a6445a91e76099cc3ec605a6ae7927f60805cd58c359528630d6d559878"} Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.165116 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"99c33ed6-9c2c-4eb0-be67-68c19d5479a7","Type":"ContainerStarted","Data":"dbcbf2b0bc75d4d649bb75aa5bbe0655e2b0840488fcd0aeea5af221104c570d"} Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.165037 4954 scope.go:117] "RemoveContainer" containerID="415c47aa6f4e49b61e0b32010bf7b9755a2091da540359b129b80c86d87cd84f" Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.209806 4954 scope.go:117] "RemoveContainer" containerID="73fd6abdee5c7cffa71a8fb7fd0ea78442c71ffdb8d8db8ce01475a93b31424c" Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.226608 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.235677 4954 scope.go:117] "RemoveContainer" containerID="415c47aa6f4e49b61e0b32010bf7b9755a2091da540359b129b80c86d87cd84f" Nov 27 17:02:52 crc kubenswrapper[4954]: E1127 17:02:52.236204 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"415c47aa6f4e49b61e0b32010bf7b9755a2091da540359b129b80c86d87cd84f\": container with ID starting with 415c47aa6f4e49b61e0b32010bf7b9755a2091da540359b129b80c86d87cd84f not found: ID does not exist" containerID="415c47aa6f4e49b61e0b32010bf7b9755a2091da540359b129b80c86d87cd84f" Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.236250 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"415c47aa6f4e49b61e0b32010bf7b9755a2091da540359b129b80c86d87cd84f"} err="failed to get container status \"415c47aa6f4e49b61e0b32010bf7b9755a2091da540359b129b80c86d87cd84f\": rpc error: code = NotFound desc = could not find container \"415c47aa6f4e49b61e0b32010bf7b9755a2091da540359b129b80c86d87cd84f\": container with ID starting with 415c47aa6f4e49b61e0b32010bf7b9755a2091da540359b129b80c86d87cd84f not found: ID does not exist" Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.236303 4954 scope.go:117] "RemoveContainer" containerID="73fd6abdee5c7cffa71a8fb7fd0ea78442c71ffdb8d8db8ce01475a93b31424c" Nov 27 17:02:52 crc kubenswrapper[4954]: E1127 17:02:52.238488 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73fd6abdee5c7cffa71a8fb7fd0ea78442c71ffdb8d8db8ce01475a93b31424c\": container with ID starting with 73fd6abdee5c7cffa71a8fb7fd0ea78442c71ffdb8d8db8ce01475a93b31424c not found: ID does not exist" containerID="73fd6abdee5c7cffa71a8fb7fd0ea78442c71ffdb8d8db8ce01475a93b31424c" Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.238537 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73fd6abdee5c7cffa71a8fb7fd0ea78442c71ffdb8d8db8ce01475a93b31424c"} err="failed to get container status \"73fd6abdee5c7cffa71a8fb7fd0ea78442c71ffdb8d8db8ce01475a93b31424c\": rpc error: code = NotFound desc = could not find container \"73fd6abdee5c7cffa71a8fb7fd0ea78442c71ffdb8d8db8ce01475a93b31424c\": container with ID starting with 73fd6abdee5c7cffa71a8fb7fd0ea78442c71ffdb8d8db8ce01475a93b31424c not found: ID does not exist" Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.240282 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.254874 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 27 17:02:52 crc kubenswrapper[4954]: E1127 17:02:52.255363 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10a515f1-708a-4b0a-83ed-d28323eabe4a" containerName="nova-metadata-log" Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.255387 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="10a515f1-708a-4b0a-83ed-d28323eabe4a" containerName="nova-metadata-log" Nov 27 17:02:52 crc kubenswrapper[4954]: E1127 17:02:52.255402 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10a515f1-708a-4b0a-83ed-d28323eabe4a" containerName="nova-metadata-metadata" Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.255410 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="10a515f1-708a-4b0a-83ed-d28323eabe4a" containerName="nova-metadata-metadata" Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.255661 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="10a515f1-708a-4b0a-83ed-d28323eabe4a" containerName="nova-metadata-log" Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.255685 4954 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="10a515f1-708a-4b0a-83ed-d28323eabe4a" containerName="nova-metadata-metadata" Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.256986 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.259488 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.259854 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.267277 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.392784 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-959jb\" (UniqueName: \"kubernetes.io/projected/51632054-40fc-42a7-b633-e1e35143689f-kube-api-access-959jb\") pod \"nova-metadata-0\" (UID: \"51632054-40fc-42a7-b633-e1e35143689f\") " pod="openstack/nova-metadata-0" Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.393101 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51632054-40fc-42a7-b633-e1e35143689f-config-data\") pod \"nova-metadata-0\" (UID: \"51632054-40fc-42a7-b633-e1e35143689f\") " pod="openstack/nova-metadata-0" Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.393530 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51632054-40fc-42a7-b633-e1e35143689f-logs\") pod \"nova-metadata-0\" (UID: \"51632054-40fc-42a7-b633-e1e35143689f\") " pod="openstack/nova-metadata-0" Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.393949 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51632054-40fc-42a7-b633-e1e35143689f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"51632054-40fc-42a7-b633-e1e35143689f\") " pod="openstack/nova-metadata-0" Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.394047 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/51632054-40fc-42a7-b633-e1e35143689f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"51632054-40fc-42a7-b633-e1e35143689f\") " pod="openstack/nova-metadata-0" Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.495593 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51632054-40fc-42a7-b633-e1e35143689f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"51632054-40fc-42a7-b633-e1e35143689f\") " pod="openstack/nova-metadata-0" Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.495655 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/51632054-40fc-42a7-b633-e1e35143689f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"51632054-40fc-42a7-b633-e1e35143689f\") " pod="openstack/nova-metadata-0" Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.495717 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-959jb\" (UniqueName: \"kubernetes.io/projected/51632054-40fc-42a7-b633-e1e35143689f-kube-api-access-959jb\") pod \"nova-metadata-0\" (UID: \"51632054-40fc-42a7-b633-e1e35143689f\") " pod="openstack/nova-metadata-0" Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.495755 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51632054-40fc-42a7-b633-e1e35143689f-config-data\") pod \"nova-metadata-0\" (UID: \"51632054-40fc-42a7-b633-e1e35143689f\") " pod="openstack/nova-metadata-0" Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.495826 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51632054-40fc-42a7-b633-e1e35143689f-logs\") pod \"nova-metadata-0\" (UID: \"51632054-40fc-42a7-b633-e1e35143689f\") " pod="openstack/nova-metadata-0" Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.496240 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51632054-40fc-42a7-b633-e1e35143689f-logs\") pod \"nova-metadata-0\" (UID: \"51632054-40fc-42a7-b633-e1e35143689f\") " pod="openstack/nova-metadata-0" Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.499612 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51632054-40fc-42a7-b633-e1e35143689f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"51632054-40fc-42a7-b633-e1e35143689f\") " pod="openstack/nova-metadata-0" Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.500207 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51632054-40fc-42a7-b633-e1e35143689f-config-data\") pod \"nova-metadata-0\" (UID: \"51632054-40fc-42a7-b633-e1e35143689f\") " pod="openstack/nova-metadata-0" Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.502445 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/51632054-40fc-42a7-b633-e1e35143689f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"51632054-40fc-42a7-b633-e1e35143689f\") " pod="openstack/nova-metadata-0" Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.514696 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-959jb\" (UniqueName: \"kubernetes.io/projected/51632054-40fc-42a7-b633-e1e35143689f-kube-api-access-959jb\") pod \"nova-metadata-0\" (UID: \"51632054-40fc-42a7-b633-e1e35143689f\") " pod="openstack/nova-metadata-0" Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.581757 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.677444 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10a515f1-708a-4b0a-83ed-d28323eabe4a" path="/var/lib/kubelet/pods/10a515f1-708a-4b0a-83ed-d28323eabe4a/volumes" Nov 27 17:02:52 crc kubenswrapper[4954]: I1127 17:02:52.678543 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="413cbe5a-ca44-4d13-ac32-68ff849a4e41" path="/var/lib/kubelet/pods/413cbe5a-ca44-4d13-ac32-68ff849a4e41/volumes" Nov 27 17:02:53 crc kubenswrapper[4954]: I1127 17:02:53.085450 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 17:02:53 crc kubenswrapper[4954]: W1127 17:02:53.093277 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51632054_40fc_42a7_b633_e1e35143689f.slice/crio-dfff1b86bfd2475a3ae08bc731cb038248cca275965954ab5ca99afd1addb7bc WatchSource:0}: Error finding container dfff1b86bfd2475a3ae08bc731cb038248cca275965954ab5ca99afd1addb7bc: Status 404 returned error can't find the container with id dfff1b86bfd2475a3ae08bc731cb038248cca275965954ab5ca99afd1addb7bc Nov 27 17:02:53 crc kubenswrapper[4954]: I1127 17:02:53.178842 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"99c33ed6-9c2c-4eb0-be67-68c19d5479a7","Type":"ContainerStarted","Data":"17972546b410b9ea695d5a8721af8e4a0ff6bd0cf882bc41d84cad0489c3ca1a"} Nov 27 17:02:53 crc kubenswrapper[4954]: I1127 17:02:53.180058 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"51632054-40fc-42a7-b633-e1e35143689f","Type":"ContainerStarted","Data":"dfff1b86bfd2475a3ae08bc731cb038248cca275965954ab5ca99afd1addb7bc"} Nov 27 17:02:53 crc kubenswrapper[4954]: I1127 17:02:53.200374 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.200347986 podStartE2EDuration="2.200347986s" podCreationTimestamp="2025-11-27 17:02:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 17:02:53.197011035 +0000 UTC m=+1485.214451345" watchObservedRunningTime="2025-11-27 17:02:53.200347986 +0000 UTC m=+1485.217788296" Nov 27 17:02:54 crc kubenswrapper[4954]: I1127 17:02:54.191068 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"51632054-40fc-42a7-b633-e1e35143689f","Type":"ContainerStarted","Data":"fa469ac59b1504b547d88252d61e213234b6882ffa501b8ede46c7a5e0ec20a1"} Nov 27 17:02:54 crc kubenswrapper[4954]: I1127 17:02:54.191372 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"51632054-40fc-42a7-b633-e1e35143689f","Type":"ContainerStarted","Data":"3b866bb4a0f5cb70fece377c1808377399bb1e05f41d0933e588e52ca068d739"} Nov 27 17:02:56 crc kubenswrapper[4954]: I1127 17:02:56.557677 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 27 17:02:57 crc kubenswrapper[4954]: I1127 17:02:57.583236 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 27 17:02:57 crc kubenswrapper[4954]: I1127 17:02:57.583328 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 27 17:02:59 crc kubenswrapper[4954]: 
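The pod_startup_latency_tracker entries above record podStartE2EDuration as observedRunningTime minus podCreationTimestamp; for nova-scheduler-0 that is 17:02:53.200 minus 17:02:51, i.e. about 2.200 s, matching podStartSLOduration=2.200347986. A small extractor for these SLO data points (same assumed `kubelet.log` input as the earlier sketches):

```python
import re

# Matches the pod_startup_latency_tracker.go summary lines and captures
# the pod name and its end-to-end startup duration.
SLO = re.compile(
    r'"Observed pod startup duration" pod="([^"]+)".*?podStartE2EDuration="([^"]+)"'
)

with open("kubelet.log") as f:  # assumed path to the saved journal dump
    for line in f:
        m = SLO.search(line)
        if m:
            print(f"{m.group(1)}: started in {m.group(2)}")
# in this log: openstack/nova-api-0 in 2.172190739s,
# openstack/nova-scheduler-0 in 2.200347986s, openstack/nova-metadata-0 in 9.617727005s
```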
Nov 27 17:02:59 crc kubenswrapper[4954]: I1127 17:02:59.504559 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Nov 27 17:02:59 crc kubenswrapper[4954]: I1127 17:02:59.505051 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Nov 27 17:03:00 crc kubenswrapper[4954]: I1127 17:03:00.519697 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6662243a-d2bd-4571-8e27-6b923a367942" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.203:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Nov 27 17:03:00 crc kubenswrapper[4954]: I1127 17:03:00.520421 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6662243a-d2bd-4571-8e27-6b923a367942" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.203:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Nov 27 17:03:01 crc kubenswrapper[4954]: I1127 17:03:01.557505 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Nov 27 17:03:01 crc kubenswrapper[4954]: I1127 17:03:01.589333 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Nov 27 17:03:01 crc kubenswrapper[4954]: I1127 17:03:01.617748 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=9.617727005 podStartE2EDuration="9.617727005s" podCreationTimestamp="2025-11-27 17:02:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 17:02:54.207165733 +0000 UTC m=+1486.224606033" watchObservedRunningTime="2025-11-27 17:03:01.617727005 +0000 UTC m=+1493.635167305"
Nov 27 17:03:02 crc kubenswrapper[4954]: I1127 17:03:02.309804 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Nov 27 17:03:02 crc kubenswrapper[4954]: I1127 17:03:02.582402 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Nov 27 17:03:02 crc kubenswrapper[4954]: I1127 17:03:02.582454 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Nov 27 17:03:03 crc kubenswrapper[4954]: I1127 17:03:03.594732 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="51632054-40fc-42a7-b633-e1e35143689f" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.205:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Nov 27 17:03:03 crc kubenswrapper[4954]: I1127 17:03:03.594759 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="51632054-40fc-42a7-b633-e1e35143689f" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.205:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Nov 27 17:03:07 crc kubenswrapper[4954]: I1127 17:03:07.286704 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Nov 27 17:03:09 crc kubenswrapper[4954]: I1127 17:03:09.518018 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
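The startup-probe failures above are HTTP GETs against the pod IPs (nova-api-0 on 10.217.0.203:8774, nova-metadata-0 on 10.217.0.205:8775) timing out while the freshly restarted services warm up; the -api/-metadata and -log containers share the same endpoint, so the failures come in pairs, and they clear below once the probes flip to "started"/"ready". A sketch that tallies them per pod and container (assumed input file as before):

```python
import re
from collections import Counter

# Matches prober.go "Probe failed" lines, capturing probe type, pod and container.
PROBE = re.compile(
    r'"Probe failed" probeType="(\w+)" pod="([^"]+)".*?containerName="([^"]+)"'
)

fails = Counter()
with open("kubelet.log") as f:  # assumed path to the saved journal dump
    for line in f:
        m = PROBE.search(line)
        if m:
            ptype, pod, container = m.groups()
            fails[(pod, container, ptype)] += 1

for (pod, container, ptype), n in sorted(fails.items()):
    print(f"{pod}/{container}: {n} failed {ptype} probe(s)")
```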
Nov 27 17:03:09 crc kubenswrapper[4954]: I1127 17:03:09.518718 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Nov 27 17:03:09 crc kubenswrapper[4954]: I1127 17:03:09.522552 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Nov 27 17:03:09 crc kubenswrapper[4954]: I1127 17:03:09.524563 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Nov 27 17:03:10 crc kubenswrapper[4954]: I1127 17:03:10.350956 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Nov 27 17:03:10 crc kubenswrapper[4954]: I1127 17:03:10.358920 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Nov 27 17:03:12 crc kubenswrapper[4954]: I1127 17:03:12.587800 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Nov 27 17:03:12 crc kubenswrapper[4954]: I1127 17:03:12.591591 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Nov 27 17:03:12 crc kubenswrapper[4954]: I1127 17:03:12.594226 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Nov 27 17:03:13 crc kubenswrapper[4954]: I1127 17:03:13.377191 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Nov 27 17:03:21 crc kubenswrapper[4954]: I1127 17:03:21.525644 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Nov 27 17:03:23 crc kubenswrapper[4954]: I1127 17:03:23.118238 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Nov 27 17:03:26 crc kubenswrapper[4954]: I1127 17:03:26.033004 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="37b16922-ac4b-4c0f-bf9c-444474fe1e08" containerName="rabbitmq" containerID="cri-o://8d9632c01a56fed6314fe20593b7890b1d092f7c59e288a6f9964fb7ca5853d4" gracePeriod=604796
Nov 27 17:03:27 crc kubenswrapper[4954]: I1127 17:03:27.658225 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="70949f64-380f-4947-a55a-8780126c7ba4" containerName="rabbitmq" containerID="cri-o://88858638c29d0cad0c9c5ad394dc99b01a968705eada72845d7517cd148076b7" gracePeriod=604796
Nov 27 17:03:32 crc kubenswrapper[4954]: I1127 17:03:32.552085 4954 generic.go:334] "Generic (PLEG): container finished" podID="37b16922-ac4b-4c0f-bf9c-444474fe1e08" containerID="8d9632c01a56fed6314fe20593b7890b1d092f7c59e288a6f9964fb7ca5853d4" exitCode=0
Nov 27 17:03:32 crc kubenswrapper[4954]: I1127 17:03:32.552175 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"37b16922-ac4b-4c0f-bf9c-444474fe1e08","Type":"ContainerDied","Data":"8d9632c01a56fed6314fe20593b7890b1d092f7c59e288a6f9964fb7ca5853d4"}
Nov 27 17:03:32 crc kubenswrapper[4954]: I1127 17:03:32.680278 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
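gracePeriod=604796 on both rabbitmq kills above is a whisker under seven days: RabbitMQ pods are typically granted a very long terminationGracePeriodSeconds so the broker can shut down cleanly (604800 s, i.e. 7 d, is the rabbitmq-cluster-operator default; that this is what applies here is our assumption), and the kubelet logs whatever grace remains at kill time. The arithmetic:

```python
# 604796 s, as logged for both rabbitmq pods, is 4 s short of 7 full days.
secs = 604796
d, rem = divmod(secs, 86400)
h, rem = divmod(rem, 3600)
m, s = divmod(rem, 60)
print(f"{d}d {h}h {m}m {s}s")  # -> 6d 23h 59m 56s  (7 d = 604800 s)
```

rabbitmq-server-0 nonetheless exits within seconds of the kill (exitCode=0 at 17:03:32), so almost none of that budget is consumed.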
Nov 27 17:03:32 crc kubenswrapper[4954]: I1127 17:03:32.806663 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/37b16922-ac4b-4c0f-bf9c-444474fe1e08-server-conf\") pod \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\" (UID: \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\") "
Nov 27 17:03:32 crc kubenswrapper[4954]: I1127 17:03:32.806763 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxkkx\" (UniqueName: \"kubernetes.io/projected/37b16922-ac4b-4c0f-bf9c-444474fe1e08-kube-api-access-sxkkx\") pod \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\" (UID: \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\") "
Nov 27 17:03:32 crc kubenswrapper[4954]: I1127 17:03:32.806856 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/37b16922-ac4b-4c0f-bf9c-444474fe1e08-config-data\") pod \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\" (UID: \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\") "
Nov 27 17:03:32 crc kubenswrapper[4954]: I1127 17:03:32.806926 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/37b16922-ac4b-4c0f-bf9c-444474fe1e08-rabbitmq-erlang-cookie\") pod \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\" (UID: \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\") "
Nov 27 17:03:32 crc kubenswrapper[4954]: I1127 17:03:32.806949 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/37b16922-ac4b-4c0f-bf9c-444474fe1e08-erlang-cookie-secret\") pod \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\" (UID: \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\") "
Nov 27 17:03:32 crc kubenswrapper[4954]: I1127 17:03:32.806974 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/37b16922-ac4b-4c0f-bf9c-444474fe1e08-pod-info\") pod \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\" (UID: \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\") "
Nov 27 17:03:32 crc kubenswrapper[4954]: I1127 17:03:32.806996 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/37b16922-ac4b-4c0f-bf9c-444474fe1e08-rabbitmq-tls\") pod \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\" (UID: \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\") "
Nov 27 17:03:32 crc kubenswrapper[4954]: I1127 17:03:32.807042 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/37b16922-ac4b-4c0f-bf9c-444474fe1e08-plugins-conf\") pod \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\" (UID: \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\") "
Nov 27 17:03:32 crc kubenswrapper[4954]: I1127 17:03:32.807070 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/37b16922-ac4b-4c0f-bf9c-444474fe1e08-rabbitmq-plugins\") pod \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\" (UID: \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\") "
Nov 27 17:03:32 crc kubenswrapper[4954]: I1127 17:03:32.807090 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/37b16922-ac4b-4c0f-bf9c-444474fe1e08-rabbitmq-confd\") pod \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\" (UID: \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\") "
Nov 27 17:03:32 crc kubenswrapper[4954]: I1127 17:03:32.807128 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\" (UID: \"37b16922-ac4b-4c0f-bf9c-444474fe1e08\") "
Nov 27 17:03:32 crc kubenswrapper[4954]: I1127 17:03:32.807709 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37b16922-ac4b-4c0f-bf9c-444474fe1e08-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "37b16922-ac4b-4c0f-bf9c-444474fe1e08" (UID: "37b16922-ac4b-4c0f-bf9c-444474fe1e08"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 27 17:03:32 crc kubenswrapper[4954]: I1127 17:03:32.808771 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37b16922-ac4b-4c0f-bf9c-444474fe1e08-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "37b16922-ac4b-4c0f-bf9c-444474fe1e08" (UID: "37b16922-ac4b-4c0f-bf9c-444474fe1e08"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 27 17:03:32 crc kubenswrapper[4954]: I1127 17:03:32.808848 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37b16922-ac4b-4c0f-bf9c-444474fe1e08-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "37b16922-ac4b-4c0f-bf9c-444474fe1e08" (UID: "37b16922-ac4b-4c0f-bf9c-444474fe1e08"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 27 17:03:32 crc kubenswrapper[4954]: I1127 17:03:32.812506 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37b16922-ac4b-4c0f-bf9c-444474fe1e08-kube-api-access-sxkkx" (OuterVolumeSpecName: "kube-api-access-sxkkx") pod "37b16922-ac4b-4c0f-bf9c-444474fe1e08" (UID: "37b16922-ac4b-4c0f-bf9c-444474fe1e08"). InnerVolumeSpecName "kube-api-access-sxkkx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 27 17:03:32 crc kubenswrapper[4954]: I1127 17:03:32.812594 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37b16922-ac4b-4c0f-bf9c-444474fe1e08-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "37b16922-ac4b-4c0f-bf9c-444474fe1e08" (UID: "37b16922-ac4b-4c0f-bf9c-444474fe1e08"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 27 17:03:32 crc kubenswrapper[4954]: I1127 17:03:32.813057 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/37b16922-ac4b-4c0f-bf9c-444474fe1e08-pod-info" (OuterVolumeSpecName: "pod-info") pod "37b16922-ac4b-4c0f-bf9c-444474fe1e08" (UID: "37b16922-ac4b-4c0f-bf9c-444474fe1e08"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Nov 27 17:03:32 crc kubenswrapper[4954]: I1127 17:03:32.814760 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "persistence") pod "37b16922-ac4b-4c0f-bf9c-444474fe1e08" (UID: "37b16922-ac4b-4c0f-bf9c-444474fe1e08"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Nov 27 17:03:32 crc kubenswrapper[4954]: I1127 17:03:32.816620 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37b16922-ac4b-4c0f-bf9c-444474fe1e08-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "37b16922-ac4b-4c0f-bf9c-444474fe1e08" (UID: "37b16922-ac4b-4c0f-bf9c-444474fe1e08"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 17:03:32 crc kubenswrapper[4954]: I1127 17:03:32.841040 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37b16922-ac4b-4c0f-bf9c-444474fe1e08-config-data" (OuterVolumeSpecName: "config-data") pod "37b16922-ac4b-4c0f-bf9c-444474fe1e08" (UID: "37b16922-ac4b-4c0f-bf9c-444474fe1e08"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 27 17:03:32 crc kubenswrapper[4954]: I1127 17:03:32.895128 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37b16922-ac4b-4c0f-bf9c-444474fe1e08-server-conf" (OuterVolumeSpecName: "server-conf") pod "37b16922-ac4b-4c0f-bf9c-444474fe1e08" (UID: "37b16922-ac4b-4c0f-bf9c-444474fe1e08"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 27 17:03:32 crc kubenswrapper[4954]: I1127 17:03:32.905295 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="70949f64-380f-4947-a55a-8780126c7ba4" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused"
Nov 27 17:03:32 crc kubenswrapper[4954]: I1127 17:03:32.910026 4954 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/37b16922-ac4b-4c0f-bf9c-444474fe1e08-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Nov 27 17:03:32 crc kubenswrapper[4954]: I1127 17:03:32.910079 4954 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" "
Nov 27 17:03:32 crc kubenswrapper[4954]: I1127 17:03:32.910089 4954 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/37b16922-ac4b-4c0f-bf9c-444474fe1e08-server-conf\") on node \"crc\" DevicePath \"\""
Nov 27 17:03:32 crc kubenswrapper[4954]: I1127 17:03:32.910100 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxkkx\" (UniqueName: \"kubernetes.io/projected/37b16922-ac4b-4c0f-bf9c-444474fe1e08-kube-api-access-sxkkx\") on node \"crc\" DevicePath \"\""
Nov 27 17:03:32 crc kubenswrapper[4954]: I1127 17:03:32.910110 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/37b16922-ac4b-4c0f-bf9c-444474fe1e08-config-data\") on node \"crc\" DevicePath \"\""
Nov 27 17:03:32 crc kubenswrapper[4954]: I1127 17:03:32.910119 4954 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/37b16922-ac4b-4c0f-bf9c-444474fe1e08-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Nov 27 17:03:32 crc kubenswrapper[4954]: I1127 17:03:32.910127 4954 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/37b16922-ac4b-4c0f-bf9c-444474fe1e08-erlang-cookie-secret\")
on node \"crc\" DevicePath \"\"" Nov 27 17:03:32 crc kubenswrapper[4954]: I1127 17:03:32.910135 4954 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/37b16922-ac4b-4c0f-bf9c-444474fe1e08-pod-info\") on node \"crc\" DevicePath \"\"" Nov 27 17:03:32 crc kubenswrapper[4954]: I1127 17:03:32.910143 4954 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/37b16922-ac4b-4c0f-bf9c-444474fe1e08-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Nov 27 17:03:32 crc kubenswrapper[4954]: I1127 17:03:32.910151 4954 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/37b16922-ac4b-4c0f-bf9c-444474fe1e08-plugins-conf\") on node \"crc\" DevicePath \"\"" Nov 27 17:03:32 crc kubenswrapper[4954]: I1127 17:03:32.938517 4954 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Nov 27 17:03:32 crc kubenswrapper[4954]: I1127 17:03:32.942904 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37b16922-ac4b-4c0f-bf9c-444474fe1e08-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "37b16922-ac4b-4c0f-bf9c-444474fe1e08" (UID: "37b16922-ac4b-4c0f-bf9c-444474fe1e08"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.012889 4954 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/37b16922-ac4b-4c0f-bf9c-444474fe1e08-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.012924 4954 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.564295 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"37b16922-ac4b-4c0f-bf9c-444474fe1e08","Type":"ContainerDied","Data":"38a9b5751e9900ed4baa2ab57a7c849281afa825b2eac1b101b46d798f857244"} Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.565069 4954 scope.go:117] "RemoveContainer" containerID="8d9632c01a56fed6314fe20593b7890b1d092f7c59e288a6f9964fb7ca5853d4" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.564360 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.587083 4954 scope.go:117] "RemoveContainer" containerID="445f8d4ba8edbf32d835aee9867360a8ad19116e7317ee02107d314c316b88c3" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.603400 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.616243 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.637082 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Nov 27 17:03:33 crc kubenswrapper[4954]: E1127 17:03:33.637456 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37b16922-ac4b-4c0f-bf9c-444474fe1e08" containerName="setup-container" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.637476 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="37b16922-ac4b-4c0f-bf9c-444474fe1e08" containerName="setup-container" Nov 27 17:03:33 crc kubenswrapper[4954]: E1127 17:03:33.637826 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37b16922-ac4b-4c0f-bf9c-444474fe1e08" containerName="rabbitmq" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.637835 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="37b16922-ac4b-4c0f-bf9c-444474fe1e08" containerName="rabbitmq" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.638050 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="37b16922-ac4b-4c0f-bf9c-444474fe1e08" containerName="rabbitmq" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.639053 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.643004 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-sg67g" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.643390 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.643572 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.643832 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.644037 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.644202 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.646567 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.662228 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.724223 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7e3c0607-0f08-4188-9995-c0a2a253fdc5-config-data\") pod \"rabbitmq-server-0\" (UID: \"7e3c0607-0f08-4188-9995-c0a2a253fdc5\") " pod="openstack/rabbitmq-server-0" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.724520 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7e3c0607-0f08-4188-9995-c0a2a253fdc5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7e3c0607-0f08-4188-9995-c0a2a253fdc5\") " pod="openstack/rabbitmq-server-0" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.724771 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7e3c0607-0f08-4188-9995-c0a2a253fdc5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7e3c0607-0f08-4188-9995-c0a2a253fdc5\") " pod="openstack/rabbitmq-server-0" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.725068 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7e3c0607-0f08-4188-9995-c0a2a253fdc5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7e3c0607-0f08-4188-9995-c0a2a253fdc5\") " pod="openstack/rabbitmq-server-0" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.725229 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"7e3c0607-0f08-4188-9995-c0a2a253fdc5\") " pod="openstack/rabbitmq-server-0" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.725256 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/7e3c0607-0f08-4188-9995-c0a2a253fdc5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7e3c0607-0f08-4188-9995-c0a2a253fdc5\") " pod="openstack/rabbitmq-server-0" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.725297 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9ths\" (UniqueName: \"kubernetes.io/projected/7e3c0607-0f08-4188-9995-c0a2a253fdc5-kube-api-access-k9ths\") pod \"rabbitmq-server-0\" (UID: \"7e3c0607-0f08-4188-9995-c0a2a253fdc5\") " pod="openstack/rabbitmq-server-0" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.725348 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7e3c0607-0f08-4188-9995-c0a2a253fdc5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7e3c0607-0f08-4188-9995-c0a2a253fdc5\") " pod="openstack/rabbitmq-server-0" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.725423 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7e3c0607-0f08-4188-9995-c0a2a253fdc5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7e3c0607-0f08-4188-9995-c0a2a253fdc5\") " pod="openstack/rabbitmq-server-0" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.725504 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7e3c0607-0f08-4188-9995-c0a2a253fdc5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7e3c0607-0f08-4188-9995-c0a2a253fdc5\") " pod="openstack/rabbitmq-server-0" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.725705 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7e3c0607-0f08-4188-9995-c0a2a253fdc5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7e3c0607-0f08-4188-9995-c0a2a253fdc5\") " pod="openstack/rabbitmq-server-0" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.827969 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7e3c0607-0f08-4188-9995-c0a2a253fdc5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7e3c0607-0f08-4188-9995-c0a2a253fdc5\") " pod="openstack/rabbitmq-server-0" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.828068 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7e3c0607-0f08-4188-9995-c0a2a253fdc5-config-data\") pod \"rabbitmq-server-0\" (UID: \"7e3c0607-0f08-4188-9995-c0a2a253fdc5\") " pod="openstack/rabbitmq-server-0" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.828096 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7e3c0607-0f08-4188-9995-c0a2a253fdc5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7e3c0607-0f08-4188-9995-c0a2a253fdc5\") " pod="openstack/rabbitmq-server-0" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.828140 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7e3c0607-0f08-4188-9995-c0a2a253fdc5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: 
\"7e3c0607-0f08-4188-9995-c0a2a253fdc5\") " pod="openstack/rabbitmq-server-0" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.828211 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7e3c0607-0f08-4188-9995-c0a2a253fdc5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7e3c0607-0f08-4188-9995-c0a2a253fdc5\") " pod="openstack/rabbitmq-server-0" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.828264 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"7e3c0607-0f08-4188-9995-c0a2a253fdc5\") " pod="openstack/rabbitmq-server-0" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.828281 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7e3c0607-0f08-4188-9995-c0a2a253fdc5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7e3c0607-0f08-4188-9995-c0a2a253fdc5\") " pod="openstack/rabbitmq-server-0" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.828309 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9ths\" (UniqueName: \"kubernetes.io/projected/7e3c0607-0f08-4188-9995-c0a2a253fdc5-kube-api-access-k9ths\") pod \"rabbitmq-server-0\" (UID: \"7e3c0607-0f08-4188-9995-c0a2a253fdc5\") " pod="openstack/rabbitmq-server-0" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.828342 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7e3c0607-0f08-4188-9995-c0a2a253fdc5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7e3c0607-0f08-4188-9995-c0a2a253fdc5\") " pod="openstack/rabbitmq-server-0" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.828389 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7e3c0607-0f08-4188-9995-c0a2a253fdc5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7e3c0607-0f08-4188-9995-c0a2a253fdc5\") " pod="openstack/rabbitmq-server-0" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.828411 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7e3c0607-0f08-4188-9995-c0a2a253fdc5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7e3c0607-0f08-4188-9995-c0a2a253fdc5\") " pod="openstack/rabbitmq-server-0" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.828936 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"7e3c0607-0f08-4188-9995-c0a2a253fdc5\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-server-0" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.829011 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7e3c0607-0f08-4188-9995-c0a2a253fdc5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7e3c0607-0f08-4188-9995-c0a2a253fdc5\") " pod="openstack/rabbitmq-server-0" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.829374 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7e3c0607-0f08-4188-9995-c0a2a253fdc5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7e3c0607-0f08-4188-9995-c0a2a253fdc5\") " pod="openstack/rabbitmq-server-0" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.829756 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7e3c0607-0f08-4188-9995-c0a2a253fdc5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7e3c0607-0f08-4188-9995-c0a2a253fdc5\") " pod="openstack/rabbitmq-server-0" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.829766 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7e3c0607-0f08-4188-9995-c0a2a253fdc5-config-data\") pod \"rabbitmq-server-0\" (UID: \"7e3c0607-0f08-4188-9995-c0a2a253fdc5\") " pod="openstack/rabbitmq-server-0" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.829887 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7e3c0607-0f08-4188-9995-c0a2a253fdc5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7e3c0607-0f08-4188-9995-c0a2a253fdc5\") " pod="openstack/rabbitmq-server-0" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.836246 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7e3c0607-0f08-4188-9995-c0a2a253fdc5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7e3c0607-0f08-4188-9995-c0a2a253fdc5\") " pod="openstack/rabbitmq-server-0" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.836271 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7e3c0607-0f08-4188-9995-c0a2a253fdc5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7e3c0607-0f08-4188-9995-c0a2a253fdc5\") " pod="openstack/rabbitmq-server-0" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.836422 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7e3c0607-0f08-4188-9995-c0a2a253fdc5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7e3c0607-0f08-4188-9995-c0a2a253fdc5\") " pod="openstack/rabbitmq-server-0" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.842615 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7e3c0607-0f08-4188-9995-c0a2a253fdc5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7e3c0607-0f08-4188-9995-c0a2a253fdc5\") " pod="openstack/rabbitmq-server-0" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.847334 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9ths\" (UniqueName: \"kubernetes.io/projected/7e3c0607-0f08-4188-9995-c0a2a253fdc5-kube-api-access-k9ths\") pod \"rabbitmq-server-0\" (UID: \"7e3c0607-0f08-4188-9995-c0a2a253fdc5\") " pod="openstack/rabbitmq-server-0" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.866599 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"7e3c0607-0f08-4188-9995-c0a2a253fdc5\") " pod="openstack/rabbitmq-server-0" Nov 27 17:03:33 crc kubenswrapper[4954]: I1127 17:03:33.995734 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 27 17:03:34 crc kubenswrapper[4954]: I1127 17:03:34.467217 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 27 17:03:34 crc kubenswrapper[4954]: I1127 17:03:34.574889 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7e3c0607-0f08-4188-9995-c0a2a253fdc5","Type":"ContainerStarted","Data":"9e79eff33b80d0865b03bdcb4232b9fc6e94aaea21026eb14025f51805f161be"} Nov 27 17:03:34 crc kubenswrapper[4954]: I1127 17:03:34.577751 4954 generic.go:334] "Generic (PLEG): container finished" podID="70949f64-380f-4947-a55a-8780126c7ba4" containerID="88858638c29d0cad0c9c5ad394dc99b01a968705eada72845d7517cd148076b7" exitCode=0 Nov 27 17:03:34 crc kubenswrapper[4954]: I1127 17:03:34.577790 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"70949f64-380f-4947-a55a-8780126c7ba4","Type":"ContainerDied","Data":"88858638c29d0cad0c9c5ad394dc99b01a968705eada72845d7517cd148076b7"} Nov 27 17:03:34 crc kubenswrapper[4954]: I1127 17:03:34.675324 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37b16922-ac4b-4c0f-bf9c-444474fe1e08" path="/var/lib/kubelet/pods/37b16922-ac4b-4c0f-bf9c-444474fe1e08/volumes" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.053441 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.159873 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/70949f64-380f-4947-a55a-8780126c7ba4-rabbitmq-tls\") pod \"70949f64-380f-4947-a55a-8780126c7ba4\" (UID: \"70949f64-380f-4947-a55a-8780126c7ba4\") " Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.159934 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"70949f64-380f-4947-a55a-8780126c7ba4\" (UID: \"70949f64-380f-4947-a55a-8780126c7ba4\") " Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.160069 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/70949f64-380f-4947-a55a-8780126c7ba4-server-conf\") pod \"70949f64-380f-4947-a55a-8780126c7ba4\" (UID: \"70949f64-380f-4947-a55a-8780126c7ba4\") " Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.160122 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/70949f64-380f-4947-a55a-8780126c7ba4-pod-info\") pod \"70949f64-380f-4947-a55a-8780126c7ba4\" (UID: \"70949f64-380f-4947-a55a-8780126c7ba4\") " Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.160601 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70949f64-380f-4947-a55a-8780126c7ba4-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "70949f64-380f-4947-a55a-8780126c7ba4" (UID: "70949f64-380f-4947-a55a-8780126c7ba4"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.160745 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/70949f64-380f-4947-a55a-8780126c7ba4-rabbitmq-erlang-cookie\") pod \"70949f64-380f-4947-a55a-8780126c7ba4\" (UID: \"70949f64-380f-4947-a55a-8780126c7ba4\") " Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.160788 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/70949f64-380f-4947-a55a-8780126c7ba4-rabbitmq-plugins\") pod \"70949f64-380f-4947-a55a-8780126c7ba4\" (UID: \"70949f64-380f-4947-a55a-8780126c7ba4\") " Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.160827 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70949f64-380f-4947-a55a-8780126c7ba4-config-data\") pod \"70949f64-380f-4947-a55a-8780126c7ba4\" (UID: \"70949f64-380f-4947-a55a-8780126c7ba4\") " Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.160959 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/70949f64-380f-4947-a55a-8780126c7ba4-erlang-cookie-secret\") pod \"70949f64-380f-4947-a55a-8780126c7ba4\" (UID: \"70949f64-380f-4947-a55a-8780126c7ba4\") " Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.160995 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/70949f64-380f-4947-a55a-8780126c7ba4-plugins-conf\") pod \"70949f64-380f-4947-a55a-8780126c7ba4\" (UID: \"70949f64-380f-4947-a55a-8780126c7ba4\") " Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.161019 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/70949f64-380f-4947-a55a-8780126c7ba4-rabbitmq-confd\") pod \"70949f64-380f-4947-a55a-8780126c7ba4\" (UID: \"70949f64-380f-4947-a55a-8780126c7ba4\") " Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.161045 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qsx6\" (UniqueName: \"kubernetes.io/projected/70949f64-380f-4947-a55a-8780126c7ba4-kube-api-access-7qsx6\") pod \"70949f64-380f-4947-a55a-8780126c7ba4\" (UID: \"70949f64-380f-4947-a55a-8780126c7ba4\") " Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.161472 4954 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/70949f64-380f-4947-a55a-8780126c7ba4-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.161958 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70949f64-380f-4947-a55a-8780126c7ba4-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "70949f64-380f-4947-a55a-8780126c7ba4" (UID: "70949f64-380f-4947-a55a-8780126c7ba4"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.162345 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70949f64-380f-4947-a55a-8780126c7ba4-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "70949f64-380f-4947-a55a-8780126c7ba4" (UID: "70949f64-380f-4947-a55a-8780126c7ba4"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.164520 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70949f64-380f-4947-a55a-8780126c7ba4-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "70949f64-380f-4947-a55a-8780126c7ba4" (UID: "70949f64-380f-4947-a55a-8780126c7ba4"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.164861 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "70949f64-380f-4947-a55a-8780126c7ba4" (UID: "70949f64-380f-4947-a55a-8780126c7ba4"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.169817 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70949f64-380f-4947-a55a-8780126c7ba4-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "70949f64-380f-4947-a55a-8780126c7ba4" (UID: "70949f64-380f-4947-a55a-8780126c7ba4"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.171094 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70949f64-380f-4947-a55a-8780126c7ba4-kube-api-access-7qsx6" (OuterVolumeSpecName: "kube-api-access-7qsx6") pod "70949f64-380f-4947-a55a-8780126c7ba4" (UID: "70949f64-380f-4947-a55a-8780126c7ba4"). InnerVolumeSpecName "kube-api-access-7qsx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.173925 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/70949f64-380f-4947-a55a-8780126c7ba4-pod-info" (OuterVolumeSpecName: "pod-info") pod "70949f64-380f-4947-a55a-8780126c7ba4" (UID: "70949f64-380f-4947-a55a-8780126c7ba4"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.263282 4954 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/70949f64-380f-4947-a55a-8780126c7ba4-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.263338 4954 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.263348 4954 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/70949f64-380f-4947-a55a-8780126c7ba4-pod-info\") on node \"crc\" DevicePath \"\"" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.263361 4954 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/70949f64-380f-4947-a55a-8780126c7ba4-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.263372 4954 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/70949f64-380f-4947-a55a-8780126c7ba4-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.263380 4954 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/70949f64-380f-4947-a55a-8780126c7ba4-plugins-conf\") on node \"crc\" DevicePath \"\"" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.263388 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qsx6\" (UniqueName: \"kubernetes.io/projected/70949f64-380f-4947-a55a-8780126c7ba4-kube-api-access-7qsx6\") on node \"crc\" DevicePath \"\"" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.293460 4954 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.365144 4954 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.384432 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70949f64-380f-4947-a55a-8780126c7ba4-config-data" (OuterVolumeSpecName: "config-data") pod "70949f64-380f-4947-a55a-8780126c7ba4" (UID: "70949f64-380f-4947-a55a-8780126c7ba4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.467394 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70949f64-380f-4947-a55a-8780126c7ba4-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.488435 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70949f64-380f-4947-a55a-8780126c7ba4-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "70949f64-380f-4947-a55a-8780126c7ba4" (UID: "70949f64-380f-4947-a55a-8780126c7ba4"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.508403 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70949f64-380f-4947-a55a-8780126c7ba4-server-conf" (OuterVolumeSpecName: "server-conf") pod "70949f64-380f-4947-a55a-8780126c7ba4" (UID: "70949f64-380f-4947-a55a-8780126c7ba4"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.569552 4954 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/70949f64-380f-4947-a55a-8780126c7ba4-server-conf\") on node \"crc\" DevicePath \"\"" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.569607 4954 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/70949f64-380f-4947-a55a-8780126c7ba4-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.593867 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"70949f64-380f-4947-a55a-8780126c7ba4","Type":"ContainerDied","Data":"2da83980add47da06eafdbe9ac1cfdf9a05941ca38fef6bbf81346394596ac87"} Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.593918 4954 scope.go:117] "RemoveContainer" containerID="88858638c29d0cad0c9c5ad394dc99b01a968705eada72845d7517cd148076b7" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.594045 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.649821 4954 scope.go:117] "RemoveContainer" containerID="36df7c7eb591cb47cfc65798bc7acff77ecfcbcf7991a576639fa7c256680166" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.650288 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.667194 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-9fl8h"] Nov 27 17:03:35 crc kubenswrapper[4954]: E1127 17:03:35.667671 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70949f64-380f-4947-a55a-8780126c7ba4" containerName="rabbitmq" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.667688 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="70949f64-380f-4947-a55a-8780126c7ba4" containerName="rabbitmq" Nov 27 17:03:35 crc kubenswrapper[4954]: E1127 17:03:35.667727 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70949f64-380f-4947-a55a-8780126c7ba4" containerName="setup-container" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.667735 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="70949f64-380f-4947-a55a-8780126c7ba4" containerName="setup-container" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.667903 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="70949f64-380f-4947-a55a-8780126c7ba4" containerName="rabbitmq" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.668929 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-9fl8h" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.670476 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.712651 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.727301 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-9fl8h"] Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.740423 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.742267 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.744443 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.747025 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-fpcb4" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.747343 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.747465 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.747613 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.747736 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.748050 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.752693 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.781499 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9772f892-39af-4e16-893a-6237b1b33fe7-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-9fl8h\" (UID: \"9772f892-39af-4e16-893a-6237b1b33fe7\") " pod="openstack/dnsmasq-dns-5576978c7c-9fl8h" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.781544 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9772f892-39af-4e16-893a-6237b1b33fe7-config\") pod \"dnsmasq-dns-5576978c7c-9fl8h\" (UID: \"9772f892-39af-4e16-893a-6237b1b33fe7\") " pod="openstack/dnsmasq-dns-5576978c7c-9fl8h" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.781632 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9772f892-39af-4e16-893a-6237b1b33fe7-dns-svc\") pod \"dnsmasq-dns-5576978c7c-9fl8h\" (UID: \"9772f892-39af-4e16-893a-6237b1b33fe7\") " pod="openstack/dnsmasq-dns-5576978c7c-9fl8h" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.781819 4954 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9772f892-39af-4e16-893a-6237b1b33fe7-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-9fl8h\" (UID: \"9772f892-39af-4e16-893a-6237b1b33fe7\") " pod="openstack/dnsmasq-dns-5576978c7c-9fl8h" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.781876 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9772f892-39af-4e16-893a-6237b1b33fe7-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-9fl8h\" (UID: \"9772f892-39af-4e16-893a-6237b1b33fe7\") " pod="openstack/dnsmasq-dns-5576978c7c-9fl8h" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.781912 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr9wm\" (UniqueName: \"kubernetes.io/projected/9772f892-39af-4e16-893a-6237b1b33fe7-kube-api-access-dr9wm\") pod \"dnsmasq-dns-5576978c7c-9fl8h\" (UID: \"9772f892-39af-4e16-893a-6237b1b33fe7\") " pod="openstack/dnsmasq-dns-5576978c7c-9fl8h" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.782063 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9772f892-39af-4e16-893a-6237b1b33fe7-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-9fl8h\" (UID: \"9772f892-39af-4e16-893a-6237b1b33fe7\") " pod="openstack/dnsmasq-dns-5576978c7c-9fl8h" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.884042 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f048cd15-3583-44fd-a9ca-1288e89f29b3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f048cd15-3583-44fd-a9ca-1288e89f29b3\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.884113 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9772f892-39af-4e16-893a-6237b1b33fe7-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-9fl8h\" (UID: \"9772f892-39af-4e16-893a-6237b1b33fe7\") " pod="openstack/dnsmasq-dns-5576978c7c-9fl8h" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.884253 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f048cd15-3583-44fd-a9ca-1288e89f29b3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f048cd15-3583-44fd-a9ca-1288e89f29b3\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.884306 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f048cd15-3583-44fd-a9ca-1288e89f29b3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f048cd15-3583-44fd-a9ca-1288e89f29b3\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.884362 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f048cd15-3583-44fd-a9ca-1288e89f29b3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f048cd15-3583-44fd-a9ca-1288e89f29b3\") " 
pod="openstack/rabbitmq-cell1-server-0" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.884396 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9772f892-39af-4e16-893a-6237b1b33fe7-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-9fl8h\" (UID: \"9772f892-39af-4e16-893a-6237b1b33fe7\") " pod="openstack/dnsmasq-dns-5576978c7c-9fl8h" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.884420 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f048cd15-3583-44fd-a9ca-1288e89f29b3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f048cd15-3583-44fd-a9ca-1288e89f29b3\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.884469 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dr9wm\" (UniqueName: \"kubernetes.io/projected/9772f892-39af-4e16-893a-6237b1b33fe7-kube-api-access-dr9wm\") pod \"dnsmasq-dns-5576978c7c-9fl8h\" (UID: \"9772f892-39af-4e16-893a-6237b1b33fe7\") " pod="openstack/dnsmasq-dns-5576978c7c-9fl8h" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.884570 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f048cd15-3583-44fd-a9ca-1288e89f29b3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f048cd15-3583-44fd-a9ca-1288e89f29b3\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.884704 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f048cd15-3583-44fd-a9ca-1288e89f29b3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f048cd15-3583-44fd-a9ca-1288e89f29b3\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.884862 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9772f892-39af-4e16-893a-6237b1b33fe7-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-9fl8h\" (UID: \"9772f892-39af-4e16-893a-6237b1b33fe7\") " pod="openstack/dnsmasq-dns-5576978c7c-9fl8h" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.884995 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9772f892-39af-4e16-893a-6237b1b33fe7-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-9fl8h\" (UID: \"9772f892-39af-4e16-893a-6237b1b33fe7\") " pod="openstack/dnsmasq-dns-5576978c7c-9fl8h" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.885057 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9772f892-39af-4e16-893a-6237b1b33fe7-config\") pod \"dnsmasq-dns-5576978c7c-9fl8h\" (UID: \"9772f892-39af-4e16-893a-6237b1b33fe7\") " pod="openstack/dnsmasq-dns-5576978c7c-9fl8h" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.885156 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9772f892-39af-4e16-893a-6237b1b33fe7-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-9fl8h\" (UID: \"9772f892-39af-4e16-893a-6237b1b33fe7\") " 
pod="openstack/dnsmasq-dns-5576978c7c-9fl8h" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.885433 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9772f892-39af-4e16-893a-6237b1b33fe7-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-9fl8h\" (UID: \"9772f892-39af-4e16-893a-6237b1b33fe7\") " pod="openstack/dnsmasq-dns-5576978c7c-9fl8h" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.885888 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9772f892-39af-4e16-893a-6237b1b33fe7-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-9fl8h\" (UID: \"9772f892-39af-4e16-893a-6237b1b33fe7\") " pod="openstack/dnsmasq-dns-5576978c7c-9fl8h" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.886019 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9772f892-39af-4e16-893a-6237b1b33fe7-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-9fl8h\" (UID: \"9772f892-39af-4e16-893a-6237b1b33fe7\") " pod="openstack/dnsmasq-dns-5576978c7c-9fl8h" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.886057 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9772f892-39af-4e16-893a-6237b1b33fe7-config\") pod \"dnsmasq-dns-5576978c7c-9fl8h\" (UID: \"9772f892-39af-4e16-893a-6237b1b33fe7\") " pod="openstack/dnsmasq-dns-5576978c7c-9fl8h" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.886124 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f048cd15-3583-44fd-a9ca-1288e89f29b3\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.886457 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9772f892-39af-4e16-893a-6237b1b33fe7-dns-svc\") pod \"dnsmasq-dns-5576978c7c-9fl8h\" (UID: \"9772f892-39af-4e16-893a-6237b1b33fe7\") " pod="openstack/dnsmasq-dns-5576978c7c-9fl8h" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.887167 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9772f892-39af-4e16-893a-6237b1b33fe7-dns-svc\") pod \"dnsmasq-dns-5576978c7c-9fl8h\" (UID: \"9772f892-39af-4e16-893a-6237b1b33fe7\") " pod="openstack/dnsmasq-dns-5576978c7c-9fl8h" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.887242 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f048cd15-3583-44fd-a9ca-1288e89f29b3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f048cd15-3583-44fd-a9ca-1288e89f29b3\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.887349 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2sbs\" (UniqueName: \"kubernetes.io/projected/f048cd15-3583-44fd-a9ca-1288e89f29b3-kube-api-access-k2sbs\") pod \"rabbitmq-cell1-server-0\" (UID: \"f048cd15-3583-44fd-a9ca-1288e89f29b3\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 
17:03:35.887422 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f048cd15-3583-44fd-a9ca-1288e89f29b3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f048cd15-3583-44fd-a9ca-1288e89f29b3\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.903918 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr9wm\" (UniqueName: \"kubernetes.io/projected/9772f892-39af-4e16-893a-6237b1b33fe7-kube-api-access-dr9wm\") pod \"dnsmasq-dns-5576978c7c-9fl8h\" (UID: \"9772f892-39af-4e16-893a-6237b1b33fe7\") " pod="openstack/dnsmasq-dns-5576978c7c-9fl8h" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.990181 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f048cd15-3583-44fd-a9ca-1288e89f29b3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f048cd15-3583-44fd-a9ca-1288e89f29b3\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.990240 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f048cd15-3583-44fd-a9ca-1288e89f29b3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f048cd15-3583-44fd-a9ca-1288e89f29b3\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.990277 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f048cd15-3583-44fd-a9ca-1288e89f29b3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f048cd15-3583-44fd-a9ca-1288e89f29b3\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.990346 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f048cd15-3583-44fd-a9ca-1288e89f29b3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f048cd15-3583-44fd-a9ca-1288e89f29b3\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.991287 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f048cd15-3583-44fd-a9ca-1288e89f29b3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f048cd15-3583-44fd-a9ca-1288e89f29b3\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.991391 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f048cd15-3583-44fd-a9ca-1288e89f29b3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f048cd15-3583-44fd-a9ca-1288e89f29b3\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.991570 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f048cd15-3583-44fd-a9ca-1288e89f29b3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f048cd15-3583-44fd-a9ca-1288e89f29b3\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.991656 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/f048cd15-3583-44fd-a9ca-1288e89f29b3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f048cd15-3583-44fd-a9ca-1288e89f29b3\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.991785 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f048cd15-3583-44fd-a9ca-1288e89f29b3\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.991844 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f048cd15-3583-44fd-a9ca-1288e89f29b3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f048cd15-3583-44fd-a9ca-1288e89f29b3\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.991853 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f048cd15-3583-44fd-a9ca-1288e89f29b3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f048cd15-3583-44fd-a9ca-1288e89f29b3\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.991867 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2sbs\" (UniqueName: \"kubernetes.io/projected/f048cd15-3583-44fd-a9ca-1288e89f29b3-kube-api-access-k2sbs\") pod \"rabbitmq-cell1-server-0\" (UID: \"f048cd15-3583-44fd-a9ca-1288e89f29b3\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.991911 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f048cd15-3583-44fd-a9ca-1288e89f29b3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f048cd15-3583-44fd-a9ca-1288e89f29b3\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.991980 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f048cd15-3583-44fd-a9ca-1288e89f29b3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f048cd15-3583-44fd-a9ca-1288e89f29b3\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.992111 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f048cd15-3583-44fd-a9ca-1288e89f29b3\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.992560 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f048cd15-3583-44fd-a9ca-1288e89f29b3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f048cd15-3583-44fd-a9ca-1288e89f29b3\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.993368 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f048cd15-3583-44fd-a9ca-1288e89f29b3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"f048cd15-3583-44fd-a9ca-1288e89f29b3\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.993993 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f048cd15-3583-44fd-a9ca-1288e89f29b3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f048cd15-3583-44fd-a9ca-1288e89f29b3\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.995231 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f048cd15-3583-44fd-a9ca-1288e89f29b3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f048cd15-3583-44fd-a9ca-1288e89f29b3\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 17:03:35 crc kubenswrapper[4954]: I1127 17:03:35.999842 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f048cd15-3583-44fd-a9ca-1288e89f29b3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f048cd15-3583-44fd-a9ca-1288e89f29b3\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 17:03:36 crc kubenswrapper[4954]: I1127 17:03:36.005097 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f048cd15-3583-44fd-a9ca-1288e89f29b3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f048cd15-3583-44fd-a9ca-1288e89f29b3\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 17:03:36 crc kubenswrapper[4954]: I1127 17:03:36.009252 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2sbs\" (UniqueName: \"kubernetes.io/projected/f048cd15-3583-44fd-a9ca-1288e89f29b3-kube-api-access-k2sbs\") pod \"rabbitmq-cell1-server-0\" (UID: \"f048cd15-3583-44fd-a9ca-1288e89f29b3\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 17:03:36 crc kubenswrapper[4954]: I1127 17:03:36.034193 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f048cd15-3583-44fd-a9ca-1288e89f29b3\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 17:03:36 crc kubenswrapper[4954]: I1127 17:03:36.137457 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-9fl8h" Nov 27 17:03:36 crc kubenswrapper[4954]: I1127 17:03:36.144108 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 27 17:03:36 crc kubenswrapper[4954]: I1127 17:03:36.611004 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7e3c0607-0f08-4188-9995-c0a2a253fdc5","Type":"ContainerStarted","Data":"39cc7570e7b6154485402e09224e184169e39b9e0ac7bc52856e853cb29291d0"} Nov 27 17:03:36 crc kubenswrapper[4954]: I1127 17:03:36.652717 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-9fl8h"] Nov 27 17:03:36 crc kubenswrapper[4954]: I1127 17:03:36.692104 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70949f64-380f-4947-a55a-8780126c7ba4" path="/var/lib/kubelet/pods/70949f64-380f-4947-a55a-8780126c7ba4/volumes" Nov 27 17:03:36 crc kubenswrapper[4954]: I1127 17:03:36.693088 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 27 17:03:37 crc kubenswrapper[4954]: I1127 17:03:37.575410 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="37b16922-ac4b-4c0f-bf9c-444474fe1e08" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.98:5671: i/o timeout" Nov 27 17:03:37 crc kubenswrapper[4954]: I1127 17:03:37.621536 4954 generic.go:334] "Generic (PLEG): container finished" podID="9772f892-39af-4e16-893a-6237b1b33fe7" containerID="5a199977f31bd848209b426bf5755e912a4ea40e4bd3c88afc0aff8e0e67a491" exitCode=0 Nov 27 17:03:37 crc kubenswrapper[4954]: I1127 17:03:37.621614 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-9fl8h" event={"ID":"9772f892-39af-4e16-893a-6237b1b33fe7","Type":"ContainerDied","Data":"5a199977f31bd848209b426bf5755e912a4ea40e4bd3c88afc0aff8e0e67a491"} Nov 27 17:03:37 crc kubenswrapper[4954]: I1127 17:03:37.621637 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-9fl8h" event={"ID":"9772f892-39af-4e16-893a-6237b1b33fe7","Type":"ContainerStarted","Data":"36a91371ef368feced9ca0d7751cc2ab3328d60fabd288931ffcbc5e6ba732c9"} Nov 27 17:03:37 crc kubenswrapper[4954]: I1127 17:03:37.622507 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f048cd15-3583-44fd-a9ca-1288e89f29b3","Type":"ContainerStarted","Data":"9f6fd79b47eb542a65582217d1421920b32023ae4c333649e91d7362ec150875"} Nov 27 17:03:38 crc kubenswrapper[4954]: I1127 17:03:38.633522 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f048cd15-3583-44fd-a9ca-1288e89f29b3","Type":"ContainerStarted","Data":"fa0f81d7f078b307427cce2292190b4a9d527fe04416e469893a07decb58478d"} Nov 27 17:03:38 crc kubenswrapper[4954]: I1127 17:03:38.635767 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-9fl8h" event={"ID":"9772f892-39af-4e16-893a-6237b1b33fe7","Type":"ContainerStarted","Data":"ca86ca7d32dc7c9fbfc0641f181729455868f372efe1d87c9a3892fcd16ce642"} Nov 27 17:03:38 crc kubenswrapper[4954]: I1127 17:03:38.635937 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5576978c7c-9fl8h" Nov 27 17:03:38 crc kubenswrapper[4954]: I1127 17:03:38.682702 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5576978c7c-9fl8h" podStartSLOduration=3.682679061 podStartE2EDuration="3.682679061s" podCreationTimestamp="2025-11-27 17:03:35 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 17:03:38.679532115 +0000 UTC m=+1530.696972425" watchObservedRunningTime="2025-11-27 17:03:38.682679061 +0000 UTC m=+1530.700119361" Nov 27 17:03:43 crc kubenswrapper[4954]: I1127 17:03:43.812605 4954 scope.go:117] "RemoveContainer" containerID="b1f2eaab9a5ed1abaf1286b1d6949f8f8e96735d163d7d6922e3cdaa7fabe3a4" Nov 27 17:03:43 crc kubenswrapper[4954]: I1127 17:03:43.860182 4954 scope.go:117] "RemoveContainer" containerID="3ad4afc8238191c70f11816ed77f9229615fb78f43db6c58aa1b400f777eb469" Nov 27 17:03:46 crc kubenswrapper[4954]: I1127 17:03:46.139593 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5576978c7c-9fl8h" Nov 27 17:03:46 crc kubenswrapper[4954]: I1127 17:03:46.248977 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-gzs8b"] Nov 27 17:03:46 crc kubenswrapper[4954]: I1127 17:03:46.249368 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c7b6c5df9-gzs8b" podUID="e37726ba-6010-4e19-a3ad-df091a9cc21e" containerName="dnsmasq-dns" containerID="cri-o://9e388b9dde084498139eb5f06557f4a4326a986ffca26a22961b5038c2e0f777" gracePeriod=10 Nov 27 17:03:46 crc kubenswrapper[4954]: I1127 17:03:46.393728 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c7b6c5df9-gzs8b" podUID="e37726ba-6010-4e19-a3ad-df091a9cc21e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.199:5353: connect: connection refused" Nov 27 17:03:46 crc kubenswrapper[4954]: I1127 17:03:46.399107 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8c6f6df99-7klpn"] Nov 27 17:03:46 crc kubenswrapper[4954]: I1127 17:03:46.408566 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8c6f6df99-7klpn" Nov 27 17:03:46 crc kubenswrapper[4954]: I1127 17:03:46.417751 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8c6f6df99-7klpn"] Nov 27 17:03:46 crc kubenswrapper[4954]: I1127 17:03:46.518768 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7jkv\" (UniqueName: \"kubernetes.io/projected/b4e436ab-fb96-4213-be44-d08f62fa30ef-kube-api-access-q7jkv\") pod \"dnsmasq-dns-8c6f6df99-7klpn\" (UID: \"b4e436ab-fb96-4213-be44-d08f62fa30ef\") " pod="openstack/dnsmasq-dns-8c6f6df99-7klpn" Nov 27 17:03:46 crc kubenswrapper[4954]: I1127 17:03:46.518832 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b4e436ab-fb96-4213-be44-d08f62fa30ef-openstack-edpm-ipam\") pod \"dnsmasq-dns-8c6f6df99-7klpn\" (UID: \"b4e436ab-fb96-4213-be44-d08f62fa30ef\") " pod="openstack/dnsmasq-dns-8c6f6df99-7klpn" Nov 27 17:03:46 crc kubenswrapper[4954]: I1127 17:03:46.518865 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b4e436ab-fb96-4213-be44-d08f62fa30ef-dns-swift-storage-0\") pod \"dnsmasq-dns-8c6f6df99-7klpn\" (UID: \"b4e436ab-fb96-4213-be44-d08f62fa30ef\") " pod="openstack/dnsmasq-dns-8c6f6df99-7klpn" Nov 27 17:03:46 crc kubenswrapper[4954]: I1127 17:03:46.518921 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4e436ab-fb96-4213-be44-d08f62fa30ef-config\") pod \"dnsmasq-dns-8c6f6df99-7klpn\" (UID: \"b4e436ab-fb96-4213-be44-d08f62fa30ef\") " pod="openstack/dnsmasq-dns-8c6f6df99-7klpn" Nov 27 17:03:46 crc kubenswrapper[4954]: I1127 17:03:46.518939 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b4e436ab-fb96-4213-be44-d08f62fa30ef-ovsdbserver-sb\") pod \"dnsmasq-dns-8c6f6df99-7klpn\" (UID: \"b4e436ab-fb96-4213-be44-d08f62fa30ef\") " pod="openstack/dnsmasq-dns-8c6f6df99-7klpn" Nov 27 17:03:46 crc kubenswrapper[4954]: I1127 17:03:46.519005 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b4e436ab-fb96-4213-be44-d08f62fa30ef-ovsdbserver-nb\") pod \"dnsmasq-dns-8c6f6df99-7klpn\" (UID: \"b4e436ab-fb96-4213-be44-d08f62fa30ef\") " pod="openstack/dnsmasq-dns-8c6f6df99-7klpn" Nov 27 17:03:46 crc kubenswrapper[4954]: I1127 17:03:46.519044 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b4e436ab-fb96-4213-be44-d08f62fa30ef-dns-svc\") pod \"dnsmasq-dns-8c6f6df99-7klpn\" (UID: \"b4e436ab-fb96-4213-be44-d08f62fa30ef\") " pod="openstack/dnsmasq-dns-8c6f6df99-7klpn" Nov 27 17:03:46 crc kubenswrapper[4954]: I1127 17:03:46.624819 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7jkv\" (UniqueName: \"kubernetes.io/projected/b4e436ab-fb96-4213-be44-d08f62fa30ef-kube-api-access-q7jkv\") pod \"dnsmasq-dns-8c6f6df99-7klpn\" (UID: \"b4e436ab-fb96-4213-be44-d08f62fa30ef\") " pod="openstack/dnsmasq-dns-8c6f6df99-7klpn" Nov 27 17:03:46 crc kubenswrapper[4954]: I1127 17:03:46.625204 4954 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b4e436ab-fb96-4213-be44-d08f62fa30ef-openstack-edpm-ipam\") pod \"dnsmasq-dns-8c6f6df99-7klpn\" (UID: \"b4e436ab-fb96-4213-be44-d08f62fa30ef\") " pod="openstack/dnsmasq-dns-8c6f6df99-7klpn" Nov 27 17:03:46 crc kubenswrapper[4954]: I1127 17:03:46.625241 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b4e436ab-fb96-4213-be44-d08f62fa30ef-dns-swift-storage-0\") pod \"dnsmasq-dns-8c6f6df99-7klpn\" (UID: \"b4e436ab-fb96-4213-be44-d08f62fa30ef\") " pod="openstack/dnsmasq-dns-8c6f6df99-7klpn" Nov 27 17:03:46 crc kubenswrapper[4954]: I1127 17:03:46.625301 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4e436ab-fb96-4213-be44-d08f62fa30ef-config\") pod \"dnsmasq-dns-8c6f6df99-7klpn\" (UID: \"b4e436ab-fb96-4213-be44-d08f62fa30ef\") " pod="openstack/dnsmasq-dns-8c6f6df99-7klpn" Nov 27 17:03:46 crc kubenswrapper[4954]: I1127 17:03:46.625325 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b4e436ab-fb96-4213-be44-d08f62fa30ef-ovsdbserver-sb\") pod \"dnsmasq-dns-8c6f6df99-7klpn\" (UID: \"b4e436ab-fb96-4213-be44-d08f62fa30ef\") " pod="openstack/dnsmasq-dns-8c6f6df99-7klpn" Nov 27 17:03:46 crc kubenswrapper[4954]: I1127 17:03:46.625419 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b4e436ab-fb96-4213-be44-d08f62fa30ef-ovsdbserver-nb\") pod \"dnsmasq-dns-8c6f6df99-7klpn\" (UID: \"b4e436ab-fb96-4213-be44-d08f62fa30ef\") " pod="openstack/dnsmasq-dns-8c6f6df99-7klpn" Nov 27 17:03:46 crc kubenswrapper[4954]: I1127 17:03:46.625468 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b4e436ab-fb96-4213-be44-d08f62fa30ef-dns-svc\") pod \"dnsmasq-dns-8c6f6df99-7klpn\" (UID: \"b4e436ab-fb96-4213-be44-d08f62fa30ef\") " pod="openstack/dnsmasq-dns-8c6f6df99-7klpn" Nov 27 17:03:46 crc kubenswrapper[4954]: I1127 17:03:46.626323 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b4e436ab-fb96-4213-be44-d08f62fa30ef-openstack-edpm-ipam\") pod \"dnsmasq-dns-8c6f6df99-7klpn\" (UID: \"b4e436ab-fb96-4213-be44-d08f62fa30ef\") " pod="openstack/dnsmasq-dns-8c6f6df99-7klpn" Nov 27 17:03:46 crc kubenswrapper[4954]: I1127 17:03:46.626629 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b4e436ab-fb96-4213-be44-d08f62fa30ef-dns-svc\") pod \"dnsmasq-dns-8c6f6df99-7klpn\" (UID: \"b4e436ab-fb96-4213-be44-d08f62fa30ef\") " pod="openstack/dnsmasq-dns-8c6f6df99-7klpn" Nov 27 17:03:46 crc kubenswrapper[4954]: I1127 17:03:46.629125 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b4e436ab-fb96-4213-be44-d08f62fa30ef-dns-swift-storage-0\") pod \"dnsmasq-dns-8c6f6df99-7klpn\" (UID: \"b4e436ab-fb96-4213-be44-d08f62fa30ef\") " pod="openstack/dnsmasq-dns-8c6f6df99-7klpn" Nov 27 17:03:46 crc kubenswrapper[4954]: I1127 17:03:46.629202 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b4e436ab-fb96-4213-be44-d08f62fa30ef-config\") pod \"dnsmasq-dns-8c6f6df99-7klpn\" (UID: \"b4e436ab-fb96-4213-be44-d08f62fa30ef\") " pod="openstack/dnsmasq-dns-8c6f6df99-7klpn" Nov 27 17:03:46 crc kubenswrapper[4954]: I1127 17:03:46.630216 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b4e436ab-fb96-4213-be44-d08f62fa30ef-ovsdbserver-sb\") pod \"dnsmasq-dns-8c6f6df99-7klpn\" (UID: \"b4e436ab-fb96-4213-be44-d08f62fa30ef\") " pod="openstack/dnsmasq-dns-8c6f6df99-7klpn" Nov 27 17:03:46 crc kubenswrapper[4954]: I1127 17:03:46.630423 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b4e436ab-fb96-4213-be44-d08f62fa30ef-ovsdbserver-nb\") pod \"dnsmasq-dns-8c6f6df99-7klpn\" (UID: \"b4e436ab-fb96-4213-be44-d08f62fa30ef\") " pod="openstack/dnsmasq-dns-8c6f6df99-7klpn" Nov 27 17:03:46 crc kubenswrapper[4954]: I1127 17:03:46.650690 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7jkv\" (UniqueName: \"kubernetes.io/projected/b4e436ab-fb96-4213-be44-d08f62fa30ef-kube-api-access-q7jkv\") pod \"dnsmasq-dns-8c6f6df99-7klpn\" (UID: \"b4e436ab-fb96-4213-be44-d08f62fa30ef\") " pod="openstack/dnsmasq-dns-8c6f6df99-7klpn" Nov 27 17:03:46 crc kubenswrapper[4954]: I1127 17:03:46.778065 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8c6f6df99-7klpn" Nov 27 17:03:46 crc kubenswrapper[4954]: I1127 17:03:46.924111 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-gzs8b" Nov 27 17:03:47 crc kubenswrapper[4954]: I1127 17:03:47.037033 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e37726ba-6010-4e19-a3ad-df091a9cc21e-ovsdbserver-sb\") pod \"e37726ba-6010-4e19-a3ad-df091a9cc21e\" (UID: \"e37726ba-6010-4e19-a3ad-df091a9cc21e\") " Nov 27 17:03:47 crc kubenswrapper[4954]: I1127 17:03:47.037087 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e37726ba-6010-4e19-a3ad-df091a9cc21e-dns-svc\") pod \"e37726ba-6010-4e19-a3ad-df091a9cc21e\" (UID: \"e37726ba-6010-4e19-a3ad-df091a9cc21e\") " Nov 27 17:03:47 crc kubenswrapper[4954]: I1127 17:03:47.037220 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e37726ba-6010-4e19-a3ad-df091a9cc21e-dns-swift-storage-0\") pod \"e37726ba-6010-4e19-a3ad-df091a9cc21e\" (UID: \"e37726ba-6010-4e19-a3ad-df091a9cc21e\") " Nov 27 17:03:47 crc kubenswrapper[4954]: I1127 17:03:47.037365 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e37726ba-6010-4e19-a3ad-df091a9cc21e-ovsdbserver-nb\") pod \"e37726ba-6010-4e19-a3ad-df091a9cc21e\" (UID: \"e37726ba-6010-4e19-a3ad-df091a9cc21e\") " Nov 27 17:03:47 crc kubenswrapper[4954]: I1127 17:03:47.037527 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e37726ba-6010-4e19-a3ad-df091a9cc21e-config\") pod \"e37726ba-6010-4e19-a3ad-df091a9cc21e\" (UID: \"e37726ba-6010-4e19-a3ad-df091a9cc21e\") " Nov 27 17:03:47 crc kubenswrapper[4954]: I1127 17:03:47.037569 4954 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpl65\" (UniqueName: \"kubernetes.io/projected/e37726ba-6010-4e19-a3ad-df091a9cc21e-kube-api-access-rpl65\") pod \"e37726ba-6010-4e19-a3ad-df091a9cc21e\" (UID: \"e37726ba-6010-4e19-a3ad-df091a9cc21e\") " Nov 27 17:03:47 crc kubenswrapper[4954]: I1127 17:03:47.042600 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e37726ba-6010-4e19-a3ad-df091a9cc21e-kube-api-access-rpl65" (OuterVolumeSpecName: "kube-api-access-rpl65") pod "e37726ba-6010-4e19-a3ad-df091a9cc21e" (UID: "e37726ba-6010-4e19-a3ad-df091a9cc21e"). InnerVolumeSpecName "kube-api-access-rpl65". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:03:47 crc kubenswrapper[4954]: I1127 17:03:47.104552 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e37726ba-6010-4e19-a3ad-df091a9cc21e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e37726ba-6010-4e19-a3ad-df091a9cc21e" (UID: "e37726ba-6010-4e19-a3ad-df091a9cc21e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:03:47 crc kubenswrapper[4954]: I1127 17:03:47.109446 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e37726ba-6010-4e19-a3ad-df091a9cc21e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e37726ba-6010-4e19-a3ad-df091a9cc21e" (UID: "e37726ba-6010-4e19-a3ad-df091a9cc21e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:03:47 crc kubenswrapper[4954]: I1127 17:03:47.110537 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e37726ba-6010-4e19-a3ad-df091a9cc21e-config" (OuterVolumeSpecName: "config") pod "e37726ba-6010-4e19-a3ad-df091a9cc21e" (UID: "e37726ba-6010-4e19-a3ad-df091a9cc21e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:03:47 crc kubenswrapper[4954]: I1127 17:03:47.129235 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e37726ba-6010-4e19-a3ad-df091a9cc21e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e37726ba-6010-4e19-a3ad-df091a9cc21e" (UID: "e37726ba-6010-4e19-a3ad-df091a9cc21e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:03:47 crc kubenswrapper[4954]: I1127 17:03:47.137966 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e37726ba-6010-4e19-a3ad-df091a9cc21e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e37726ba-6010-4e19-a3ad-df091a9cc21e" (UID: "e37726ba-6010-4e19-a3ad-df091a9cc21e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:03:47 crc kubenswrapper[4954]: I1127 17:03:47.139724 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e37726ba-6010-4e19-a3ad-df091a9cc21e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 27 17:03:47 crc kubenswrapper[4954]: I1127 17:03:47.139754 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e37726ba-6010-4e19-a3ad-df091a9cc21e-config\") on node \"crc\" DevicePath \"\"" Nov 27 17:03:47 crc kubenswrapper[4954]: I1127 17:03:47.139768 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpl65\" (UniqueName: \"kubernetes.io/projected/e37726ba-6010-4e19-a3ad-df091a9cc21e-kube-api-access-rpl65\") on node \"crc\" DevicePath \"\"" Nov 27 17:03:47 crc kubenswrapper[4954]: I1127 17:03:47.139782 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e37726ba-6010-4e19-a3ad-df091a9cc21e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 27 17:03:47 crc kubenswrapper[4954]: I1127 17:03:47.139792 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e37726ba-6010-4e19-a3ad-df091a9cc21e-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 27 17:03:47 crc kubenswrapper[4954]: I1127 17:03:47.139804 4954 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e37726ba-6010-4e19-a3ad-df091a9cc21e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 27 17:03:47 crc kubenswrapper[4954]: I1127 17:03:47.259994 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8c6f6df99-7klpn"] Nov 27 17:03:47 crc kubenswrapper[4954]: W1127 17:03:47.264426 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4e436ab_fb96_4213_be44_d08f62fa30ef.slice/crio-d0874d92807b5c74a4c3af7733348c60d7fe2b56a328654c5e71a20eed2e385a WatchSource:0}: Error finding container d0874d92807b5c74a4c3af7733348c60d7fe2b56a328654c5e71a20eed2e385a: Status 404 returned error can't find the container with id d0874d92807b5c74a4c3af7733348c60d7fe2b56a328654c5e71a20eed2e385a Nov 27 17:03:47 crc kubenswrapper[4954]: I1127 17:03:47.288256 4954 generic.go:334] "Generic (PLEG): container finished" podID="e37726ba-6010-4e19-a3ad-df091a9cc21e" containerID="9e388b9dde084498139eb5f06557f4a4326a986ffca26a22961b5038c2e0f777" exitCode=0 Nov 27 17:03:47 crc kubenswrapper[4954]: I1127 17:03:47.288334 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-gzs8b" Nov 27 17:03:47 crc kubenswrapper[4954]: I1127 17:03:47.288336 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-gzs8b" event={"ID":"e37726ba-6010-4e19-a3ad-df091a9cc21e","Type":"ContainerDied","Data":"9e388b9dde084498139eb5f06557f4a4326a986ffca26a22961b5038c2e0f777"} Nov 27 17:03:47 crc kubenswrapper[4954]: I1127 17:03:47.288458 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-gzs8b" event={"ID":"e37726ba-6010-4e19-a3ad-df091a9cc21e","Type":"ContainerDied","Data":"aac5be05547a599353270241eb7c2f0ba7487c7deb5ce18363735590ada3bd4f"} Nov 27 17:03:47 crc kubenswrapper[4954]: I1127 17:03:47.288490 4954 scope.go:117] "RemoveContainer" containerID="9e388b9dde084498139eb5f06557f4a4326a986ffca26a22961b5038c2e0f777" Nov 27 17:03:47 crc kubenswrapper[4954]: I1127 17:03:47.290908 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c6f6df99-7klpn" event={"ID":"b4e436ab-fb96-4213-be44-d08f62fa30ef","Type":"ContainerStarted","Data":"d0874d92807b5c74a4c3af7733348c60d7fe2b56a328654c5e71a20eed2e385a"} Nov 27 17:03:47 crc kubenswrapper[4954]: I1127 17:03:47.310521 4954 scope.go:117] "RemoveContainer" containerID="21d6af5bd61055f61f54e926524246adc4bf2f3be12548a77a141fc43ce85a61" Nov 27 17:03:47 crc kubenswrapper[4954]: I1127 17:03:47.408671 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-gzs8b"] Nov 27 17:03:47 crc kubenswrapper[4954]: I1127 17:03:47.437758 4954 scope.go:117] "RemoveContainer" containerID="9e388b9dde084498139eb5f06557f4a4326a986ffca26a22961b5038c2e0f777" Nov 27 17:03:47 crc kubenswrapper[4954]: E1127 17:03:47.438668 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e388b9dde084498139eb5f06557f4a4326a986ffca26a22961b5038c2e0f777\": container with ID starting with 9e388b9dde084498139eb5f06557f4a4326a986ffca26a22961b5038c2e0f777 not found: ID does not exist" containerID="9e388b9dde084498139eb5f06557f4a4326a986ffca26a22961b5038c2e0f777" Nov 27 17:03:47 crc kubenswrapper[4954]: I1127 17:03:47.438701 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e388b9dde084498139eb5f06557f4a4326a986ffca26a22961b5038c2e0f777"} err="failed to get container status \"9e388b9dde084498139eb5f06557f4a4326a986ffca26a22961b5038c2e0f777\": rpc error: code = NotFound desc = could not find container \"9e388b9dde084498139eb5f06557f4a4326a986ffca26a22961b5038c2e0f777\": container with ID starting with 9e388b9dde084498139eb5f06557f4a4326a986ffca26a22961b5038c2e0f777 not found: ID does not exist" Nov 27 17:03:47 crc kubenswrapper[4954]: I1127 17:03:47.438721 4954 scope.go:117] "RemoveContainer" containerID="21d6af5bd61055f61f54e926524246adc4bf2f3be12548a77a141fc43ce85a61" Nov 27 17:03:47 crc kubenswrapper[4954]: E1127 17:03:47.444749 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21d6af5bd61055f61f54e926524246adc4bf2f3be12548a77a141fc43ce85a61\": container with ID starting with 21d6af5bd61055f61f54e926524246adc4bf2f3be12548a77a141fc43ce85a61 not found: ID does not exist" containerID="21d6af5bd61055f61f54e926524246adc4bf2f3be12548a77a141fc43ce85a61" Nov 27 17:03:47 crc kubenswrapper[4954]: I1127 17:03:47.444805 4954 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"21d6af5bd61055f61f54e926524246adc4bf2f3be12548a77a141fc43ce85a61"} err="failed to get container status \"21d6af5bd61055f61f54e926524246adc4bf2f3be12548a77a141fc43ce85a61\": rpc error: code = NotFound desc = could not find container \"21d6af5bd61055f61f54e926524246adc4bf2f3be12548a77a141fc43ce85a61\": container with ID starting with 21d6af5bd61055f61f54e926524246adc4bf2f3be12548a77a141fc43ce85a61 not found: ID does not exist" Nov 27 17:03:47 crc kubenswrapper[4954]: I1127 17:03:47.465684 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-gzs8b"] Nov 27 17:03:48 crc kubenswrapper[4954]: I1127 17:03:48.300008 4954 generic.go:334] "Generic (PLEG): container finished" podID="b4e436ab-fb96-4213-be44-d08f62fa30ef" containerID="d606d9f3e36aed8f4001738ef4a39d0c000d4cb240f969c0d25811eab1b553c2" exitCode=0 Nov 27 17:03:48 crc kubenswrapper[4954]: I1127 17:03:48.300174 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c6f6df99-7klpn" event={"ID":"b4e436ab-fb96-4213-be44-d08f62fa30ef","Type":"ContainerDied","Data":"d606d9f3e36aed8f4001738ef4a39d0c000d4cb240f969c0d25811eab1b553c2"} Nov 27 17:03:48 crc kubenswrapper[4954]: I1127 17:03:48.672202 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e37726ba-6010-4e19-a3ad-df091a9cc21e" path="/var/lib/kubelet/pods/e37726ba-6010-4e19-a3ad-df091a9cc21e/volumes" Nov 27 17:03:49 crc kubenswrapper[4954]: I1127 17:03:49.333354 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c6f6df99-7klpn" event={"ID":"b4e436ab-fb96-4213-be44-d08f62fa30ef","Type":"ContainerStarted","Data":"7066397fa411ff402ac3c556e2fef699f22bd9af65975d2c3a15d5f7576ebee6"} Nov 27 17:03:49 crc kubenswrapper[4954]: I1127 17:03:49.333859 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8c6f6df99-7klpn" Nov 27 17:03:49 crc kubenswrapper[4954]: I1127 17:03:49.361698 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8c6f6df99-7klpn" podStartSLOduration=3.3616776919999998 podStartE2EDuration="3.361677692s" podCreationTimestamp="2025-11-27 17:03:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 17:03:49.354785795 +0000 UTC m=+1541.372226095" watchObservedRunningTime="2025-11-27 17:03:49.361677692 +0000 UTC m=+1541.379117992" Nov 27 17:03:56 crc kubenswrapper[4954]: I1127 17:03:56.780645 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8c6f6df99-7klpn" Nov 27 17:03:56 crc kubenswrapper[4954]: I1127 17:03:56.891525 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-9fl8h"] Nov 27 17:03:56 crc kubenswrapper[4954]: I1127 17:03:56.892016 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5576978c7c-9fl8h" podUID="9772f892-39af-4e16-893a-6237b1b33fe7" containerName="dnsmasq-dns" containerID="cri-o://ca86ca7d32dc7c9fbfc0641f181729455868f372efe1d87c9a3892fcd16ce642" gracePeriod=10 Nov 27 17:03:57 crc kubenswrapper[4954]: I1127 17:03:57.403743 4954 generic.go:334] "Generic (PLEG): container finished" podID="9772f892-39af-4e16-893a-6237b1b33fe7" containerID="ca86ca7d32dc7c9fbfc0641f181729455868f372efe1d87c9a3892fcd16ce642" exitCode=0 Nov 27 17:03:57 crc kubenswrapper[4954]: I1127 17:03:57.403835 4954 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-9fl8h" event={"ID":"9772f892-39af-4e16-893a-6237b1b33fe7","Type":"ContainerDied","Data":"ca86ca7d32dc7c9fbfc0641f181729455868f372efe1d87c9a3892fcd16ce642"} Nov 27 17:03:57 crc kubenswrapper[4954]: I1127 17:03:57.925532 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-9fl8h" Nov 27 17:03:57 crc kubenswrapper[4954]: I1127 17:03:57.977823 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9772f892-39af-4e16-893a-6237b1b33fe7-config\") pod \"9772f892-39af-4e16-893a-6237b1b33fe7\" (UID: \"9772f892-39af-4e16-893a-6237b1b33fe7\") " Nov 27 17:03:57 crc kubenswrapper[4954]: I1127 17:03:57.978186 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9772f892-39af-4e16-893a-6237b1b33fe7-ovsdbserver-nb\") pod \"9772f892-39af-4e16-893a-6237b1b33fe7\" (UID: \"9772f892-39af-4e16-893a-6237b1b33fe7\") " Nov 27 17:03:57 crc kubenswrapper[4954]: I1127 17:03:57.978242 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9772f892-39af-4e16-893a-6237b1b33fe7-ovsdbserver-sb\") pod \"9772f892-39af-4e16-893a-6237b1b33fe7\" (UID: \"9772f892-39af-4e16-893a-6237b1b33fe7\") " Nov 27 17:03:57 crc kubenswrapper[4954]: I1127 17:03:57.978281 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dr9wm\" (UniqueName: \"kubernetes.io/projected/9772f892-39af-4e16-893a-6237b1b33fe7-kube-api-access-dr9wm\") pod \"9772f892-39af-4e16-893a-6237b1b33fe7\" (UID: \"9772f892-39af-4e16-893a-6237b1b33fe7\") " Nov 27 17:03:57 crc kubenswrapper[4954]: I1127 17:03:57.978312 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9772f892-39af-4e16-893a-6237b1b33fe7-dns-swift-storage-0\") pod \"9772f892-39af-4e16-893a-6237b1b33fe7\" (UID: \"9772f892-39af-4e16-893a-6237b1b33fe7\") " Nov 27 17:03:57 crc kubenswrapper[4954]: I1127 17:03:57.978328 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9772f892-39af-4e16-893a-6237b1b33fe7-openstack-edpm-ipam\") pod \"9772f892-39af-4e16-893a-6237b1b33fe7\" (UID: \"9772f892-39af-4e16-893a-6237b1b33fe7\") " Nov 27 17:03:57 crc kubenswrapper[4954]: I1127 17:03:57.978382 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9772f892-39af-4e16-893a-6237b1b33fe7-dns-svc\") pod \"9772f892-39af-4e16-893a-6237b1b33fe7\" (UID: \"9772f892-39af-4e16-893a-6237b1b33fe7\") " Nov 27 17:03:57 crc kubenswrapper[4954]: I1127 17:03:57.985339 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9772f892-39af-4e16-893a-6237b1b33fe7-kube-api-access-dr9wm" (OuterVolumeSpecName: "kube-api-access-dr9wm") pod "9772f892-39af-4e16-893a-6237b1b33fe7" (UID: "9772f892-39af-4e16-893a-6237b1b33fe7"). InnerVolumeSpecName "kube-api-access-dr9wm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:03:58 crc kubenswrapper[4954]: I1127 17:03:58.042869 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9772f892-39af-4e16-893a-6237b1b33fe7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9772f892-39af-4e16-893a-6237b1b33fe7" (UID: "9772f892-39af-4e16-893a-6237b1b33fe7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:03:58 crc kubenswrapper[4954]: I1127 17:03:58.045511 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9772f892-39af-4e16-893a-6237b1b33fe7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9772f892-39af-4e16-893a-6237b1b33fe7" (UID: "9772f892-39af-4e16-893a-6237b1b33fe7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:03:58 crc kubenswrapper[4954]: I1127 17:03:58.063265 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9772f892-39af-4e16-893a-6237b1b33fe7-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9772f892-39af-4e16-893a-6237b1b33fe7" (UID: "9772f892-39af-4e16-893a-6237b1b33fe7"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:03:58 crc kubenswrapper[4954]: I1127 17:03:58.069515 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9772f892-39af-4e16-893a-6237b1b33fe7-config" (OuterVolumeSpecName: "config") pod "9772f892-39af-4e16-893a-6237b1b33fe7" (UID: "9772f892-39af-4e16-893a-6237b1b33fe7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:03:58 crc kubenswrapper[4954]: I1127 17:03:58.075393 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9772f892-39af-4e16-893a-6237b1b33fe7-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "9772f892-39af-4e16-893a-6237b1b33fe7" (UID: "9772f892-39af-4e16-893a-6237b1b33fe7"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:03:58 crc kubenswrapper[4954]: I1127 17:03:58.080985 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9772f892-39af-4e16-893a-6237b1b33fe7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9772f892-39af-4e16-893a-6237b1b33fe7" (UID: "9772f892-39af-4e16-893a-6237b1b33fe7"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:03:58 crc kubenswrapper[4954]: I1127 17:03:58.081238 4954 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9772f892-39af-4e16-893a-6237b1b33fe7-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 27 17:03:58 crc kubenswrapper[4954]: I1127 17:03:58.081261 4954 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9772f892-39af-4e16-893a-6237b1b33fe7-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Nov 27 17:03:58 crc kubenswrapper[4954]: I1127 17:03:58.081273 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9772f892-39af-4e16-893a-6237b1b33fe7-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 27 17:03:58 crc kubenswrapper[4954]: I1127 17:03:58.081281 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9772f892-39af-4e16-893a-6237b1b33fe7-config\") on node \"crc\" DevicePath \"\"" Nov 27 17:03:58 crc kubenswrapper[4954]: I1127 17:03:58.081291 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9772f892-39af-4e16-893a-6237b1b33fe7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 27 17:03:58 crc kubenswrapper[4954]: I1127 17:03:58.081300 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9772f892-39af-4e16-893a-6237b1b33fe7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 27 17:03:58 crc kubenswrapper[4954]: I1127 17:03:58.081308 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dr9wm\" (UniqueName: \"kubernetes.io/projected/9772f892-39af-4e16-893a-6237b1b33fe7-kube-api-access-dr9wm\") on node \"crc\" DevicePath \"\"" Nov 27 17:03:58 crc kubenswrapper[4954]: I1127 17:03:58.413708 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-9fl8h" event={"ID":"9772f892-39af-4e16-893a-6237b1b33fe7","Type":"ContainerDied","Data":"36a91371ef368feced9ca0d7751cc2ab3328d60fabd288931ffcbc5e6ba732c9"} Nov 27 17:03:58 crc kubenswrapper[4954]: I1127 17:03:58.413769 4954 scope.go:117] "RemoveContainer" containerID="ca86ca7d32dc7c9fbfc0641f181729455868f372efe1d87c9a3892fcd16ce642" Nov 27 17:03:58 crc kubenswrapper[4954]: I1127 17:03:58.413771 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-9fl8h" Nov 27 17:03:58 crc kubenswrapper[4954]: I1127 17:03:58.449743 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-9fl8h"] Nov 27 17:03:58 crc kubenswrapper[4954]: I1127 17:03:58.450219 4954 scope.go:117] "RemoveContainer" containerID="5a199977f31bd848209b426bf5755e912a4ea40e4bd3c88afc0aff8e0e67a491" Nov 27 17:03:58 crc kubenswrapper[4954]: I1127 17:03:58.457557 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-9fl8h"] Nov 27 17:03:58 crc kubenswrapper[4954]: I1127 17:03:58.673954 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9772f892-39af-4e16-893a-6237b1b33fe7" path="/var/lib/kubelet/pods/9772f892-39af-4e16-893a-6237b1b33fe7/volumes" Nov 27 17:04:00 crc kubenswrapper[4954]: I1127 17:04:00.008349 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6n4gh"] Nov 27 17:04:00 crc kubenswrapper[4954]: E1127 17:04:00.010149 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e37726ba-6010-4e19-a3ad-df091a9cc21e" containerName="init" Nov 27 17:04:00 crc kubenswrapper[4954]: I1127 17:04:00.010266 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="e37726ba-6010-4e19-a3ad-df091a9cc21e" containerName="init" Nov 27 17:04:00 crc kubenswrapper[4954]: E1127 17:04:00.010356 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9772f892-39af-4e16-893a-6237b1b33fe7" containerName="dnsmasq-dns" Nov 27 17:04:00 crc kubenswrapper[4954]: I1127 17:04:00.010434 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="9772f892-39af-4e16-893a-6237b1b33fe7" containerName="dnsmasq-dns" Nov 27 17:04:00 crc kubenswrapper[4954]: E1127 17:04:00.010517 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9772f892-39af-4e16-893a-6237b1b33fe7" containerName="init" Nov 27 17:04:00 crc kubenswrapper[4954]: I1127 17:04:00.010616 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="9772f892-39af-4e16-893a-6237b1b33fe7" containerName="init" Nov 27 17:04:00 crc kubenswrapper[4954]: E1127 17:04:00.010704 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e37726ba-6010-4e19-a3ad-df091a9cc21e" containerName="dnsmasq-dns" Nov 27 17:04:00 crc kubenswrapper[4954]: I1127 17:04:00.010778 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="e37726ba-6010-4e19-a3ad-df091a9cc21e" containerName="dnsmasq-dns" Nov 27 17:04:00 crc kubenswrapper[4954]: I1127 17:04:00.012755 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="9772f892-39af-4e16-893a-6237b1b33fe7" containerName="dnsmasq-dns" Nov 27 17:04:00 crc kubenswrapper[4954]: I1127 17:04:00.012935 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="e37726ba-6010-4e19-a3ad-df091a9cc21e" containerName="dnsmasq-dns" Nov 27 17:04:00 crc kubenswrapper[4954]: I1127 17:04:00.030076 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6n4gh"] Nov 27 17:04:00 crc kubenswrapper[4954]: I1127 17:04:00.030225 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6n4gh" Nov 27 17:04:00 crc kubenswrapper[4954]: I1127 17:04:00.219511 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cdc7fc9-8989-4cba-b760-58deda652214-catalog-content\") pod \"redhat-marketplace-6n4gh\" (UID: \"3cdc7fc9-8989-4cba-b760-58deda652214\") " pod="openshift-marketplace/redhat-marketplace-6n4gh" Nov 27 17:04:00 crc kubenswrapper[4954]: I1127 17:04:00.219563 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvzkf\" (UniqueName: \"kubernetes.io/projected/3cdc7fc9-8989-4cba-b760-58deda652214-kube-api-access-wvzkf\") pod \"redhat-marketplace-6n4gh\" (UID: \"3cdc7fc9-8989-4cba-b760-58deda652214\") " pod="openshift-marketplace/redhat-marketplace-6n4gh" Nov 27 17:04:00 crc kubenswrapper[4954]: I1127 17:04:00.219840 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cdc7fc9-8989-4cba-b760-58deda652214-utilities\") pod \"redhat-marketplace-6n4gh\" (UID: \"3cdc7fc9-8989-4cba-b760-58deda652214\") " pod="openshift-marketplace/redhat-marketplace-6n4gh" Nov 27 17:04:00 crc kubenswrapper[4954]: I1127 17:04:00.322222 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cdc7fc9-8989-4cba-b760-58deda652214-catalog-content\") pod \"redhat-marketplace-6n4gh\" (UID: \"3cdc7fc9-8989-4cba-b760-58deda652214\") " pod="openshift-marketplace/redhat-marketplace-6n4gh" Nov 27 17:04:00 crc kubenswrapper[4954]: I1127 17:04:00.322272 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvzkf\" (UniqueName: \"kubernetes.io/projected/3cdc7fc9-8989-4cba-b760-58deda652214-kube-api-access-wvzkf\") pod \"redhat-marketplace-6n4gh\" (UID: \"3cdc7fc9-8989-4cba-b760-58deda652214\") " pod="openshift-marketplace/redhat-marketplace-6n4gh" Nov 27 17:04:00 crc kubenswrapper[4954]: I1127 17:04:00.322320 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cdc7fc9-8989-4cba-b760-58deda652214-utilities\") pod \"redhat-marketplace-6n4gh\" (UID: \"3cdc7fc9-8989-4cba-b760-58deda652214\") " pod="openshift-marketplace/redhat-marketplace-6n4gh" Nov 27 17:04:00 crc kubenswrapper[4954]: I1127 17:04:00.322812 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cdc7fc9-8989-4cba-b760-58deda652214-catalog-content\") pod \"redhat-marketplace-6n4gh\" (UID: \"3cdc7fc9-8989-4cba-b760-58deda652214\") " pod="openshift-marketplace/redhat-marketplace-6n4gh" Nov 27 17:04:00 crc kubenswrapper[4954]: I1127 17:04:00.322851 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cdc7fc9-8989-4cba-b760-58deda652214-utilities\") pod \"redhat-marketplace-6n4gh\" (UID: \"3cdc7fc9-8989-4cba-b760-58deda652214\") " pod="openshift-marketplace/redhat-marketplace-6n4gh" Nov 27 17:04:00 crc kubenswrapper[4954]: I1127 17:04:00.358843 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvzkf\" (UniqueName: \"kubernetes.io/projected/3cdc7fc9-8989-4cba-b760-58deda652214-kube-api-access-wvzkf\") pod 
\"redhat-marketplace-6n4gh\" (UID: \"3cdc7fc9-8989-4cba-b760-58deda652214\") " pod="openshift-marketplace/redhat-marketplace-6n4gh" Nov 27 17:04:00 crc kubenswrapper[4954]: I1127 17:04:00.362845 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6n4gh" Nov 27 17:04:00 crc kubenswrapper[4954]: W1127 17:04:00.844952 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cdc7fc9_8989_4cba_b760_58deda652214.slice/crio-bffd63582d610acde602c3791a30b2aca916ee6fa54f43938271b06d69578ba2 WatchSource:0}: Error finding container bffd63582d610acde602c3791a30b2aca916ee6fa54f43938271b06d69578ba2: Status 404 returned error can't find the container with id bffd63582d610acde602c3791a30b2aca916ee6fa54f43938271b06d69578ba2 Nov 27 17:04:00 crc kubenswrapper[4954]: I1127 17:04:00.845254 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6n4gh"] Nov 27 17:04:01 crc kubenswrapper[4954]: I1127 17:04:01.445712 4954 generic.go:334] "Generic (PLEG): container finished" podID="3cdc7fc9-8989-4cba-b760-58deda652214" containerID="dd76f70c213d4bbc643ffe7612d63a6ef45150806da8493066da5bb0bee7289c" exitCode=0 Nov 27 17:04:01 crc kubenswrapper[4954]: I1127 17:04:01.445779 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6n4gh" event={"ID":"3cdc7fc9-8989-4cba-b760-58deda652214","Type":"ContainerDied","Data":"dd76f70c213d4bbc643ffe7612d63a6ef45150806da8493066da5bb0bee7289c"} Nov 27 17:04:01 crc kubenswrapper[4954]: I1127 17:04:01.446924 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6n4gh" event={"ID":"3cdc7fc9-8989-4cba-b760-58deda652214","Type":"ContainerStarted","Data":"bffd63582d610acde602c3791a30b2aca916ee6fa54f43938271b06d69578ba2"} Nov 27 17:04:04 crc kubenswrapper[4954]: I1127 17:04:04.473623 4954 generic.go:334] "Generic (PLEG): container finished" podID="3cdc7fc9-8989-4cba-b760-58deda652214" containerID="891c0c96907f48c1515c39d0591318aca0d8b3cdc8ef1fafb19812a9744a9f2f" exitCode=0 Nov 27 17:04:04 crc kubenswrapper[4954]: I1127 17:04:04.473693 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6n4gh" event={"ID":"3cdc7fc9-8989-4cba-b760-58deda652214","Type":"ContainerDied","Data":"891c0c96907f48c1515c39d0591318aca0d8b3cdc8ef1fafb19812a9744a9f2f"} Nov 27 17:04:06 crc kubenswrapper[4954]: I1127 17:04:06.493079 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6n4gh" event={"ID":"3cdc7fc9-8989-4cba-b760-58deda652214","Type":"ContainerStarted","Data":"b6cbc43ce42d98093a64267705dec58b4a58bf84a2d25d8662221f2d45c2be51"} Nov 27 17:04:07 crc kubenswrapper[4954]: I1127 17:04:07.533634 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6n4gh" podStartSLOduration=3.889268146 podStartE2EDuration="8.533610508s" podCreationTimestamp="2025-11-27 17:03:59 +0000 UTC" firstStartedPulling="2025-11-27 17:04:01.44801698 +0000 UTC m=+1553.465457290" lastFinishedPulling="2025-11-27 17:04:06.092359322 +0000 UTC m=+1558.109799652" observedRunningTime="2025-11-27 17:04:07.51965801 +0000 UTC m=+1559.537098330" watchObservedRunningTime="2025-11-27 17:04:07.533610508 +0000 UTC m=+1559.551050818" Nov 27 17:04:08 crc kubenswrapper[4954]: I1127 17:04:08.514353 4954 generic.go:334] 
"Generic (PLEG): container finished" podID="7e3c0607-0f08-4188-9995-c0a2a253fdc5" containerID="39cc7570e7b6154485402e09224e184169e39b9e0ac7bc52856e853cb29291d0" exitCode=0 Nov 27 17:04:08 crc kubenswrapper[4954]: I1127 17:04:08.514403 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7e3c0607-0f08-4188-9995-c0a2a253fdc5","Type":"ContainerDied","Data":"39cc7570e7b6154485402e09224e184169e39b9e0ac7bc52856e853cb29291d0"} Nov 27 17:04:09 crc kubenswrapper[4954]: I1127 17:04:09.523867 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7e3c0607-0f08-4188-9995-c0a2a253fdc5","Type":"ContainerStarted","Data":"27bb3aa7f3305b95e52af0451d08c0b8aed5a61957763ce0de37b8f9d7ca26fb"} Nov 27 17:04:09 crc kubenswrapper[4954]: I1127 17:04:09.524417 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Nov 27 17:04:09 crc kubenswrapper[4954]: I1127 17:04:09.552168 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.552139199 podStartE2EDuration="36.552139199s" podCreationTimestamp="2025-11-27 17:03:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 17:04:09.541947071 +0000 UTC m=+1561.559387371" watchObservedRunningTime="2025-11-27 17:04:09.552139199 +0000 UTC m=+1561.569579529" Nov 27 17:04:10 crc kubenswrapper[4954]: I1127 17:04:10.364059 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6n4gh" Nov 27 17:04:10 crc kubenswrapper[4954]: I1127 17:04:10.364128 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6n4gh" Nov 27 17:04:10 crc kubenswrapper[4954]: I1127 17:04:10.421442 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6n4gh" Nov 27 17:04:10 crc kubenswrapper[4954]: I1127 17:04:10.534285 4954 generic.go:334] "Generic (PLEG): container finished" podID="f048cd15-3583-44fd-a9ca-1288e89f29b3" containerID="fa0f81d7f078b307427cce2292190b4a9d527fe04416e469893a07decb58478d" exitCode=0 Nov 27 17:04:10 crc kubenswrapper[4954]: I1127 17:04:10.535123 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f048cd15-3583-44fd-a9ca-1288e89f29b3","Type":"ContainerDied","Data":"fa0f81d7f078b307427cce2292190b4a9d527fe04416e469893a07decb58478d"} Nov 27 17:04:11 crc kubenswrapper[4954]: I1127 17:04:11.545323 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f048cd15-3583-44fd-a9ca-1288e89f29b3","Type":"ContainerStarted","Data":"1b662bf7a1ca7b20cca5236d59d84278e80fd58cdbad13c035c8eab9319a1d4f"} Nov 27 17:04:11 crc kubenswrapper[4954]: I1127 17:04:11.545526 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Nov 27 17:04:11 crc kubenswrapper[4954]: I1127 17:04:11.578849 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.578830647 podStartE2EDuration="36.578830647s" podCreationTimestamp="2025-11-27 17:03:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 
Nov 27 17:04:13 crc kubenswrapper[4954]: I1127 17:04:13.255171 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwhw6"] Nov 27 17:04:13 crc kubenswrapper[4954]: I1127 17:04:13.256913 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwhw6" Nov 27 17:04:13 crc kubenswrapper[4954]: I1127 17:04:13.262487 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 17:04:13 crc kubenswrapper[4954]: I1127 17:04:13.262514 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lnfbp" Nov 27 17:04:13 crc kubenswrapper[4954]: I1127 17:04:13.262496 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 17:04:13 crc kubenswrapper[4954]: I1127 17:04:13.264327 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 17:04:13 crc kubenswrapper[4954]: I1127 17:04:13.268286 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwhw6"] Nov 27 17:04:13 crc kubenswrapper[4954]: I1127 17:04:13.435534 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d294865e-7999-4e81-818f-3a5db24b7f01-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jwhw6\" (UID: \"d294865e-7999-4e81-818f-3a5db24b7f01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwhw6" Nov 27 17:04:13 crc kubenswrapper[4954]: I1127 17:04:13.435670 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d294865e-7999-4e81-818f-3a5db24b7f01-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jwhw6\" (UID: \"d294865e-7999-4e81-818f-3a5db24b7f01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwhw6" Nov 27 17:04:13 crc kubenswrapper[4954]: I1127 17:04:13.435723 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bqw5\" (UniqueName: \"kubernetes.io/projected/d294865e-7999-4e81-818f-3a5db24b7f01-kube-api-access-5bqw5\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jwhw6\" (UID: \"d294865e-7999-4e81-818f-3a5db24b7f01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwhw6" Nov 27 17:04:13 crc kubenswrapper[4954]: I1127 17:04:13.435785 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d294865e-7999-4e81-818f-3a5db24b7f01-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jwhw6\" (UID: \"d294865e-7999-4e81-818f-3a5db24b7f01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwhw6" Nov 27 17:04:13 crc kubenswrapper[4954]: I1127 17:04:13.538298 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d294865e-7999-4e81-818f-3a5db24b7f01-inventory\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-jwhw6\" (UID: \"d294865e-7999-4e81-818f-3a5db24b7f01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwhw6" Nov 27 17:04:13 crc kubenswrapper[4954]: I1127 17:04:13.538406 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d294865e-7999-4e81-818f-3a5db24b7f01-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jwhw6\" (UID: \"d294865e-7999-4e81-818f-3a5db24b7f01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwhw6" Nov 27 17:04:13 crc kubenswrapper[4954]: I1127 17:04:13.538447 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bqw5\" (UniqueName: \"kubernetes.io/projected/d294865e-7999-4e81-818f-3a5db24b7f01-kube-api-access-5bqw5\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jwhw6\" (UID: \"d294865e-7999-4e81-818f-3a5db24b7f01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwhw6" Nov 27 17:04:13 crc kubenswrapper[4954]: I1127 17:04:13.538470 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d294865e-7999-4e81-818f-3a5db24b7f01-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jwhw6\" (UID: \"d294865e-7999-4e81-818f-3a5db24b7f01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwhw6" Nov 27 17:04:13 crc kubenswrapper[4954]: I1127 17:04:13.547147 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d294865e-7999-4e81-818f-3a5db24b7f01-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jwhw6\" (UID: \"d294865e-7999-4e81-818f-3a5db24b7f01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwhw6" Nov 27 17:04:13 crc kubenswrapper[4954]: I1127 17:04:13.547903 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d294865e-7999-4e81-818f-3a5db24b7f01-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jwhw6\" (UID: \"d294865e-7999-4e81-818f-3a5db24b7f01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwhw6" Nov 27 17:04:13 crc kubenswrapper[4954]: I1127 17:04:13.548276 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d294865e-7999-4e81-818f-3a5db24b7f01-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jwhw6\" (UID: \"d294865e-7999-4e81-818f-3a5db24b7f01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwhw6" Nov 27 17:04:13 crc kubenswrapper[4954]: I1127 17:04:13.567312 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bqw5\" (UniqueName: \"kubernetes.io/projected/d294865e-7999-4e81-818f-3a5db24b7f01-kube-api-access-5bqw5\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jwhw6\" (UID: \"d294865e-7999-4e81-818f-3a5db24b7f01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwhw6" Nov 27 17:04:13 crc kubenswrapper[4954]: I1127 17:04:13.625221 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwhw6" Nov 27 17:04:14 crc kubenswrapper[4954]: I1127 17:04:14.190649 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwhw6"] Nov 27 17:04:14 crc kubenswrapper[4954]: W1127 17:04:14.204362 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd294865e_7999_4e81_818f_3a5db24b7f01.slice/crio-d00da28c6faaea5761f5f5cb6003938c568bd0901d1c126f2417833f2894f2df WatchSource:0}: Error finding container d00da28c6faaea5761f5f5cb6003938c568bd0901d1c126f2417833f2894f2df: Status 404 returned error can't find the container with id d00da28c6faaea5761f5f5cb6003938c568bd0901d1c126f2417833f2894f2df Nov 27 17:04:14 crc kubenswrapper[4954]: I1127 17:04:14.576989 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwhw6" event={"ID":"d294865e-7999-4e81-818f-3a5db24b7f01","Type":"ContainerStarted","Data":"d00da28c6faaea5761f5f5cb6003938c568bd0901d1c126f2417833f2894f2df"} Nov 27 17:04:20 crc kubenswrapper[4954]: I1127 17:04:20.416451 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6n4gh" Nov 27 17:04:20 crc kubenswrapper[4954]: I1127 17:04:20.468357 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6n4gh"] Nov 27 17:04:20 crc kubenswrapper[4954]: I1127 17:04:20.664687 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6n4gh" podUID="3cdc7fc9-8989-4cba-b760-58deda652214" containerName="registry-server" containerID="cri-o://b6cbc43ce42d98093a64267705dec58b4a58bf84a2d25d8662221f2d45c2be51" gracePeriod=2 Nov 27 17:04:21 crc kubenswrapper[4954]: I1127 17:04:21.241772 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6n4gh" Nov 27 17:04:21 crc kubenswrapper[4954]: I1127 17:04:21.408912 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cdc7fc9-8989-4cba-b760-58deda652214-utilities\") pod \"3cdc7fc9-8989-4cba-b760-58deda652214\" (UID: \"3cdc7fc9-8989-4cba-b760-58deda652214\") " Nov 27 17:04:21 crc kubenswrapper[4954]: I1127 17:04:21.408981 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvzkf\" (UniqueName: \"kubernetes.io/projected/3cdc7fc9-8989-4cba-b760-58deda652214-kube-api-access-wvzkf\") pod \"3cdc7fc9-8989-4cba-b760-58deda652214\" (UID: \"3cdc7fc9-8989-4cba-b760-58deda652214\") " Nov 27 17:04:21 crc kubenswrapper[4954]: I1127 17:04:21.409066 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cdc7fc9-8989-4cba-b760-58deda652214-catalog-content\") pod \"3cdc7fc9-8989-4cba-b760-58deda652214\" (UID: \"3cdc7fc9-8989-4cba-b760-58deda652214\") " Nov 27 17:04:21 crc kubenswrapper[4954]: I1127 17:04:21.409756 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cdc7fc9-8989-4cba-b760-58deda652214-utilities" (OuterVolumeSpecName: "utilities") pod "3cdc7fc9-8989-4cba-b760-58deda652214" (UID: "3cdc7fc9-8989-4cba-b760-58deda652214"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:04:21 crc kubenswrapper[4954]: I1127 17:04:21.415895 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cdc7fc9-8989-4cba-b760-58deda652214-kube-api-access-wvzkf" (OuterVolumeSpecName: "kube-api-access-wvzkf") pod "3cdc7fc9-8989-4cba-b760-58deda652214" (UID: "3cdc7fc9-8989-4cba-b760-58deda652214"). InnerVolumeSpecName "kube-api-access-wvzkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:04:21 crc kubenswrapper[4954]: I1127 17:04:21.438166 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cdc7fc9-8989-4cba-b760-58deda652214-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3cdc7fc9-8989-4cba-b760-58deda652214" (UID: "3cdc7fc9-8989-4cba-b760-58deda652214"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:04:21 crc kubenswrapper[4954]: I1127 17:04:21.511248 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cdc7fc9-8989-4cba-b760-58deda652214-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 17:04:21 crc kubenswrapper[4954]: I1127 17:04:21.511294 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvzkf\" (UniqueName: \"kubernetes.io/projected/3cdc7fc9-8989-4cba-b760-58deda652214-kube-api-access-wvzkf\") on node \"crc\" DevicePath \"\"" Nov 27 17:04:21 crc kubenswrapper[4954]: I1127 17:04:21.511309 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cdc7fc9-8989-4cba-b760-58deda652214-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 17:04:21 crc kubenswrapper[4954]: I1127 17:04:21.689174 4954 generic.go:334] "Generic (PLEG): container finished" podID="3cdc7fc9-8989-4cba-b760-58deda652214" containerID="b6cbc43ce42d98093a64267705dec58b4a58bf84a2d25d8662221f2d45c2be51" exitCode=0 Nov 27 17:04:21 crc kubenswrapper[4954]: I1127 17:04:21.689229 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6n4gh" event={"ID":"3cdc7fc9-8989-4cba-b760-58deda652214","Type":"ContainerDied","Data":"b6cbc43ce42d98093a64267705dec58b4a58bf84a2d25d8662221f2d45c2be51"} Nov 27 17:04:21 crc kubenswrapper[4954]: I1127 17:04:21.689273 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6n4gh" event={"ID":"3cdc7fc9-8989-4cba-b760-58deda652214","Type":"ContainerDied","Data":"bffd63582d610acde602c3791a30b2aca916ee6fa54f43938271b06d69578ba2"} Nov 27 17:04:21 crc kubenswrapper[4954]: I1127 17:04:21.689296 4954 scope.go:117] "RemoveContainer" containerID="b6cbc43ce42d98093a64267705dec58b4a58bf84a2d25d8662221f2d45c2be51" Nov 27 17:04:21 crc kubenswrapper[4954]: I1127 17:04:21.689304 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6n4gh" Nov 27 17:04:21 crc kubenswrapper[4954]: I1127 17:04:21.740966 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6n4gh"] Nov 27 17:04:21 crc kubenswrapper[4954]: I1127 17:04:21.753119 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6n4gh"] Nov 27 17:04:22 crc kubenswrapper[4954]: I1127 17:04:22.677865 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cdc7fc9-8989-4cba-b760-58deda652214" path="/var/lib/kubelet/pods/3cdc7fc9-8989-4cba-b760-58deda652214/volumes" Nov 27 17:04:24 crc kubenswrapper[4954]: I1127 17:04:24.001308 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="7e3c0607-0f08-4188-9995-c0a2a253fdc5" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.206:5671: connect: connection refused" Nov 27 17:04:26 crc kubenswrapper[4954]: I1127 17:04:26.147710 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="f048cd15-3583-44fd-a9ca-1288e89f29b3" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.208:5671: connect: connection refused" Nov 27 17:04:30 crc kubenswrapper[4954]: I1127 17:04:30.970149 4954 scope.go:117] "RemoveContainer" containerID="891c0c96907f48c1515c39d0591318aca0d8b3cdc8ef1fafb19812a9744a9f2f" Nov 27 17:04:31 crc kubenswrapper[4954]: I1127 17:04:31.395455 4954 scope.go:117] "RemoveContainer" containerID="dd76f70c213d4bbc643ffe7612d63a6ef45150806da8493066da5bb0bee7289c" Nov 27 17:04:31 crc kubenswrapper[4954]: I1127 17:04:31.417474 4954 scope.go:117] "RemoveContainer" containerID="b6cbc43ce42d98093a64267705dec58b4a58bf84a2d25d8662221f2d45c2be51" Nov 27 17:04:31 crc kubenswrapper[4954]: E1127 17:04:31.418117 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6cbc43ce42d98093a64267705dec58b4a58bf84a2d25d8662221f2d45c2be51\": container with ID starting with b6cbc43ce42d98093a64267705dec58b4a58bf84a2d25d8662221f2d45c2be51 not found: ID does not exist" containerID="b6cbc43ce42d98093a64267705dec58b4a58bf84a2d25d8662221f2d45c2be51" Nov 27 17:04:31 crc kubenswrapper[4954]: I1127 17:04:31.418226 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6cbc43ce42d98093a64267705dec58b4a58bf84a2d25d8662221f2d45c2be51"} err="failed to get container status \"b6cbc43ce42d98093a64267705dec58b4a58bf84a2d25d8662221f2d45c2be51\": rpc error: code = NotFound desc = could not find container \"b6cbc43ce42d98093a64267705dec58b4a58bf84a2d25d8662221f2d45c2be51\": container with ID starting with b6cbc43ce42d98093a64267705dec58b4a58bf84a2d25d8662221f2d45c2be51 not found: ID does not exist" Nov 27 17:04:31 crc kubenswrapper[4954]: I1127 17:04:31.418256 4954 scope.go:117] "RemoveContainer" containerID="891c0c96907f48c1515c39d0591318aca0d8b3cdc8ef1fafb19812a9744a9f2f" Nov 27 17:04:31 crc kubenswrapper[4954]: E1127 17:04:31.418742 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"891c0c96907f48c1515c39d0591318aca0d8b3cdc8ef1fafb19812a9744a9f2f\": container with ID starting with 891c0c96907f48c1515c39d0591318aca0d8b3cdc8ef1fafb19812a9744a9f2f not found: ID does not exist" containerID="891c0c96907f48c1515c39d0591318aca0d8b3cdc8ef1fafb19812a9744a9f2f" Nov 27 
Nov 27 17:04:30 crc kubenswrapper[4954]: I1127 17:04:30.970149 4954 scope.go:117] "RemoveContainer" containerID="891c0c96907f48c1515c39d0591318aca0d8b3cdc8ef1fafb19812a9744a9f2f"
Nov 27 17:04:31 crc kubenswrapper[4954]: I1127 17:04:31.395455 4954 scope.go:117] "RemoveContainer" containerID="dd76f70c213d4bbc643ffe7612d63a6ef45150806da8493066da5bb0bee7289c"
Nov 27 17:04:31 crc kubenswrapper[4954]: I1127 17:04:31.417474 4954 scope.go:117] "RemoveContainer" containerID="b6cbc43ce42d98093a64267705dec58b4a58bf84a2d25d8662221f2d45c2be51"
Nov 27 17:04:31 crc kubenswrapper[4954]: E1127 17:04:31.418117 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6cbc43ce42d98093a64267705dec58b4a58bf84a2d25d8662221f2d45c2be51\": container with ID starting with b6cbc43ce42d98093a64267705dec58b4a58bf84a2d25d8662221f2d45c2be51 not found: ID does not exist" containerID="b6cbc43ce42d98093a64267705dec58b4a58bf84a2d25d8662221f2d45c2be51"
Nov 27 17:04:31 crc kubenswrapper[4954]: I1127 17:04:31.418226 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6cbc43ce42d98093a64267705dec58b4a58bf84a2d25d8662221f2d45c2be51"} err="failed to get container status \"b6cbc43ce42d98093a64267705dec58b4a58bf84a2d25d8662221f2d45c2be51\": rpc error: code = NotFound desc = could not find container \"b6cbc43ce42d98093a64267705dec58b4a58bf84a2d25d8662221f2d45c2be51\": container with ID starting with b6cbc43ce42d98093a64267705dec58b4a58bf84a2d25d8662221f2d45c2be51 not found: ID does not exist"
Nov 27 17:04:31 crc kubenswrapper[4954]: I1127 17:04:31.418256 4954 scope.go:117] "RemoveContainer" containerID="891c0c96907f48c1515c39d0591318aca0d8b3cdc8ef1fafb19812a9744a9f2f"
Nov 27 17:04:31 crc kubenswrapper[4954]: E1127 17:04:31.418742 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"891c0c96907f48c1515c39d0591318aca0d8b3cdc8ef1fafb19812a9744a9f2f\": container with ID starting with 891c0c96907f48c1515c39d0591318aca0d8b3cdc8ef1fafb19812a9744a9f2f not found: ID does not exist" containerID="891c0c96907f48c1515c39d0591318aca0d8b3cdc8ef1fafb19812a9744a9f2f"
Nov 27 17:04:31 crc kubenswrapper[4954]: I1127 17:04:31.418772 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"891c0c96907f48c1515c39d0591318aca0d8b3cdc8ef1fafb19812a9744a9f2f"} err="failed to get container status \"891c0c96907f48c1515c39d0591318aca0d8b3cdc8ef1fafb19812a9744a9f2f\": rpc error: code = NotFound desc = could not find container \"891c0c96907f48c1515c39d0591318aca0d8b3cdc8ef1fafb19812a9744a9f2f\": container with ID starting with 891c0c96907f48c1515c39d0591318aca0d8b3cdc8ef1fafb19812a9744a9f2f not found: ID does not exist"
Nov 27 17:04:31 crc kubenswrapper[4954]: I1127 17:04:31.418787 4954 scope.go:117] "RemoveContainer" containerID="dd76f70c213d4bbc643ffe7612d63a6ef45150806da8493066da5bb0bee7289c"
Nov 27 17:04:31 crc kubenswrapper[4954]: E1127 17:04:31.421559 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd76f70c213d4bbc643ffe7612d63a6ef45150806da8493066da5bb0bee7289c\": container with ID starting with dd76f70c213d4bbc643ffe7612d63a6ef45150806da8493066da5bb0bee7289c not found: ID does not exist" containerID="dd76f70c213d4bbc643ffe7612d63a6ef45150806da8493066da5bb0bee7289c"
Nov 27 17:04:31 crc kubenswrapper[4954]: I1127 17:04:31.421642 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd76f70c213d4bbc643ffe7612d63a6ef45150806da8493066da5bb0bee7289c"} err="failed to get container status \"dd76f70c213d4bbc643ffe7612d63a6ef45150806da8493066da5bb0bee7289c\": rpc error: code = NotFound desc = could not find container \"dd76f70c213d4bbc643ffe7612d63a6ef45150806da8493066da5bb0bee7289c\": container with ID starting with dd76f70c213d4bbc643ffe7612d63a6ef45150806da8493066da5bb0bee7289c not found: ID does not exist"
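Each RemoveContainer/ContainerStatus pair above is a benign race during cleanup: the container is already gone, so the runtime answers with gRPC NotFound, which the kubelet logs and then moves past. The usual way to keep such a delete idempotent is to treat NotFound as success; a minimal sketch (removeIfAbsent and the stubbed delete call are hypothetical, not kubelet code):

    package main

    import (
        "fmt"

        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    // removeIfAbsent treats "already gone" as success so repeated cleanup is harmless.
    func removeIfAbsent(deleteContainer func(id string) error, id string) error {
        if err := deleteContainer(id); err != nil && status.Code(err) != codes.NotFound {
            return err // a real failure; NotFound just means someone removed it first
        }
        return nil
    }

    func main() {
        stub := func(string) error { return status.Error(codes.NotFound, "no such container") }
        fmt.Println(removeIfAbsent(stub, "b6cbc43ce42d...")) // prints <nil>
    }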
Nov 27 17:04:31 crc kubenswrapper[4954]: E1127 17:04:31.426934 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/openstack-k8s-operators/openstack-ansibleee-runner:901737348e6f67801f501bd827d91ec7f9e8d6cd"
Nov 27 17:04:31 crc kubenswrapper[4954]: E1127 17:04:31.427020 4954 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/openstack-k8s-operators/openstack-ansibleee-runner:901737348e6f67801f501bd827d91ec7f9e8d6cd"
Nov 27 17:04:31 crc kubenswrapper[4954]: E1127 17:04:31.427237 4954 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Nov 27 17:04:31 crc kubenswrapper[4954]: container &Container{Name:repo-setup-edpm-deployment-openstack-edpm-ipam,Image:quay.rdoproject.org/openstack-k8s-operators/openstack-ansibleee-runner:901737348e6f67801f501bd827d91ec7f9e8d6cd,Command:[],Args:[ansible-runner run /runner -p playbook.yaml -i repo-setup-edpm-deployment-openstack-edpm-ipam],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ANSIBLE_VERBOSITY,Value:2,ValueFrom:nil,},EnvVar{Name:RUNNER_PLAYBOOK,Value:
Nov 27 17:04:31 crc kubenswrapper[4954]: - hosts: all
Nov 27 17:04:31 crc kubenswrapper[4954]: strategy: linear
Nov 27 17:04:31 crc kubenswrapper[4954]: tasks:
Nov 27 17:04:31 crc kubenswrapper[4954]: - name: Enable podified-repos
Nov 27 17:04:31 crc kubenswrapper[4954]: become: true
Nov 27 17:04:31 crc kubenswrapper[4954]: ansible.builtin.shell: |
Nov 27 17:04:31 crc kubenswrapper[4954]: set -euxo pipefail
Nov 27 17:04:31 crc kubenswrapper[4954]: pushd /var/tmp
Nov 27 17:04:31 crc kubenswrapper[4954]: curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
Nov 27 17:04:31 crc kubenswrapper[4954]: pushd repo-setup-main
Nov 27 17:04:31 crc kubenswrapper[4954]: python3 -m venv ./venv
Nov 27 17:04:31 crc kubenswrapper[4954]: PBR_VERSION=0.0.0 ./venv/bin/pip install ./
Nov 27 17:04:31 crc kubenswrapper[4954]: ./venv/bin/repo-setup current-podified -b antelope
Nov 27 17:04:31 crc kubenswrapper[4954]: popd
Nov 27 17:04:31 crc kubenswrapper[4954]: rm -rf repo-setup-main
Nov 27 17:04:31 crc kubenswrapper[4954]:
Nov 27 17:04:31 crc kubenswrapper[4954]:
Nov 27 17:04:31 crc kubenswrapper[4954]: ,ValueFrom:nil,},EnvVar{Name:RUNNER_EXTRA_VARS,Value:
Nov 27 17:04:31 crc kubenswrapper[4954]: edpm_override_hosts: openstack-edpm-ipam
Nov 27 17:04:31 crc kubenswrapper[4954]: edpm_service_type: repo-setup
Nov 27 17:04:31 crc kubenswrapper[4954]:
Nov 27 17:04:31 crc kubenswrapper[4954]:
Nov 27 17:04:31 crc kubenswrapper[4954]: ,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:repo-setup-combined-ca-bundle,ReadOnly:false,MountPath:/var/lib/openstack/cacerts/repo-setup,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/runner/env/ssh_key,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:inventory,ReadOnly:false,MountPath:/runner/inventory/hosts,SubPath:inventory,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5bqw5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:openstack-aee-default-env,},Optional:*true,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod repo-setup-edpm-deployment-openstack-edpm-ipam-jwhw6_openstack(d294865e-7999-4e81-818f-3a5db24b7f01): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled
Nov 27 17:04:31 crc kubenswrapper[4954]: > logger="UnhandledError"
Nov 27 17:04:31 crc kubenswrapper[4954]: E1127 17:04:31.428435 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"repo-setup-edpm-deployment-openstack-edpm-ipam\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwhw6" podUID="d294865e-7999-4e81-818f-3a5db24b7f01"
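For readability, this is the RUNNER_PLAYBOOK value embedded in the container spec of the error block above, reassembled from the journal continuation lines; the content is verbatim from the log, while the YAML indentation is reconstructed and therefore approximate:

    - hosts: all
      strategy: linear
      tasks:
        - name: Enable podified-repos
          become: true
          ansible.builtin.shell: |
            set -euxo pipefail
            pushd /var/tmp
            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
            pushd repo-setup-main
            python3 -m venv ./venv
            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
            ./venv/bin/repo-setup current-podified -b antelope
            popd
            rm -rf repo-setup-main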
\\\"quay.rdoproject.org/openstack-k8s-operators/openstack-ansibleee-runner:901737348e6f67801f501bd827d91ec7f9e8d6cd\\\"\"" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwhw6" podUID="d294865e-7999-4e81-818f-3a5db24b7f01" Nov 27 17:04:33 crc kubenswrapper[4954]: I1127 17:04:33.998903 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Nov 27 17:04:36 crc kubenswrapper[4954]: I1127 17:04:36.146844 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Nov 27 17:04:44 crc kubenswrapper[4954]: I1127 17:04:44.055633 4954 scope.go:117] "RemoveContainer" containerID="0a51db165465237cd70da4ca6ba3d8a74d92122e18e8c571d1003572e6232564" Nov 27 17:04:49 crc kubenswrapper[4954]: I1127 17:04:49.518791 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b2vzl"] Nov 27 17:04:49 crc kubenswrapper[4954]: E1127 17:04:49.520352 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cdc7fc9-8989-4cba-b760-58deda652214" containerName="extract-utilities" Nov 27 17:04:49 crc kubenswrapper[4954]: I1127 17:04:49.520374 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cdc7fc9-8989-4cba-b760-58deda652214" containerName="extract-utilities" Nov 27 17:04:49 crc kubenswrapper[4954]: E1127 17:04:49.520428 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cdc7fc9-8989-4cba-b760-58deda652214" containerName="registry-server" Nov 27 17:04:49 crc kubenswrapper[4954]: I1127 17:04:49.520437 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cdc7fc9-8989-4cba-b760-58deda652214" containerName="registry-server" Nov 27 17:04:49 crc kubenswrapper[4954]: E1127 17:04:49.520472 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cdc7fc9-8989-4cba-b760-58deda652214" containerName="extract-content" Nov 27 17:04:49 crc kubenswrapper[4954]: I1127 17:04:49.520482 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cdc7fc9-8989-4cba-b760-58deda652214" containerName="extract-content" Nov 27 17:04:49 crc kubenswrapper[4954]: I1127 17:04:49.520780 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cdc7fc9-8989-4cba-b760-58deda652214" containerName="registry-server" Nov 27 17:04:49 crc kubenswrapper[4954]: I1127 17:04:49.522875 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b2vzl" Nov 27 17:04:49 crc kubenswrapper[4954]: I1127 17:04:49.535934 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b2vzl"] Nov 27 17:04:49 crc kubenswrapper[4954]: I1127 17:04:49.659210 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/939a6c80-06c3-49eb-8224-f39cd7998055-utilities\") pod \"community-operators-b2vzl\" (UID: \"939a6c80-06c3-49eb-8224-f39cd7998055\") " pod="openshift-marketplace/community-operators-b2vzl" Nov 27 17:04:49 crc kubenswrapper[4954]: I1127 17:04:49.659747 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p9n9\" (UniqueName: \"kubernetes.io/projected/939a6c80-06c3-49eb-8224-f39cd7998055-kube-api-access-9p9n9\") pod \"community-operators-b2vzl\" (UID: \"939a6c80-06c3-49eb-8224-f39cd7998055\") " pod="openshift-marketplace/community-operators-b2vzl" Nov 27 17:04:49 crc kubenswrapper[4954]: I1127 17:04:49.659840 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/939a6c80-06c3-49eb-8224-f39cd7998055-catalog-content\") pod \"community-operators-b2vzl\" (UID: \"939a6c80-06c3-49eb-8224-f39cd7998055\") " pod="openshift-marketplace/community-operators-b2vzl" Nov 27 17:04:49 crc kubenswrapper[4954]: I1127 17:04:49.761361 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/939a6c80-06c3-49eb-8224-f39cd7998055-utilities\") pod \"community-operators-b2vzl\" (UID: \"939a6c80-06c3-49eb-8224-f39cd7998055\") " pod="openshift-marketplace/community-operators-b2vzl" Nov 27 17:04:49 crc kubenswrapper[4954]: I1127 17:04:49.761512 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p9n9\" (UniqueName: \"kubernetes.io/projected/939a6c80-06c3-49eb-8224-f39cd7998055-kube-api-access-9p9n9\") pod \"community-operators-b2vzl\" (UID: \"939a6c80-06c3-49eb-8224-f39cd7998055\") " pod="openshift-marketplace/community-operators-b2vzl" Nov 27 17:04:49 crc kubenswrapper[4954]: I1127 17:04:49.761543 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/939a6c80-06c3-49eb-8224-f39cd7998055-catalog-content\") pod \"community-operators-b2vzl\" (UID: \"939a6c80-06c3-49eb-8224-f39cd7998055\") " pod="openshift-marketplace/community-operators-b2vzl" Nov 27 17:04:49 crc kubenswrapper[4954]: I1127 17:04:49.762233 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/939a6c80-06c3-49eb-8224-f39cd7998055-utilities\") pod \"community-operators-b2vzl\" (UID: \"939a6c80-06c3-49eb-8224-f39cd7998055\") " pod="openshift-marketplace/community-operators-b2vzl" Nov 27 17:04:49 crc kubenswrapper[4954]: I1127 17:04:49.762285 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/939a6c80-06c3-49eb-8224-f39cd7998055-catalog-content\") pod \"community-operators-b2vzl\" (UID: \"939a6c80-06c3-49eb-8224-f39cd7998055\") " pod="openshift-marketplace/community-operators-b2vzl" Nov 27 17:04:49 crc kubenswrapper[4954]: I1127 17:04:49.790074 4954 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-9p9n9\" (UniqueName: \"kubernetes.io/projected/939a6c80-06c3-49eb-8224-f39cd7998055-kube-api-access-9p9n9\") pod \"community-operators-b2vzl\" (UID: \"939a6c80-06c3-49eb-8224-f39cd7998055\") " pod="openshift-marketplace/community-operators-b2vzl" Nov 27 17:04:49 crc kubenswrapper[4954]: I1127 17:04:49.857309 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b2vzl" Nov 27 17:04:52 crc kubenswrapper[4954]: I1127 17:04:51.893108 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5r7q5"] Nov 27 17:04:52 crc kubenswrapper[4954]: I1127 17:04:51.896514 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5r7q5" Nov 27 17:04:52 crc kubenswrapper[4954]: I1127 17:04:51.912628 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5r7q5"] Nov 27 17:04:52 crc kubenswrapper[4954]: I1127 17:04:52.011052 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d215184b-28f6-4404-83d3-c1ae2c78f789-utilities\") pod \"certified-operators-5r7q5\" (UID: \"d215184b-28f6-4404-83d3-c1ae2c78f789\") " pod="openshift-marketplace/certified-operators-5r7q5" Nov 27 17:04:52 crc kubenswrapper[4954]: I1127 17:04:52.011145 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d215184b-28f6-4404-83d3-c1ae2c78f789-catalog-content\") pod \"certified-operators-5r7q5\" (UID: \"d215184b-28f6-4404-83d3-c1ae2c78f789\") " pod="openshift-marketplace/certified-operators-5r7q5" Nov 27 17:04:52 crc kubenswrapper[4954]: I1127 17:04:52.011179 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzq5p\" (UniqueName: \"kubernetes.io/projected/d215184b-28f6-4404-83d3-c1ae2c78f789-kube-api-access-vzq5p\") pod \"certified-operators-5r7q5\" (UID: \"d215184b-28f6-4404-83d3-c1ae2c78f789\") " pod="openshift-marketplace/certified-operators-5r7q5" Nov 27 17:04:52 crc kubenswrapper[4954]: I1127 17:04:52.113269 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d215184b-28f6-4404-83d3-c1ae2c78f789-utilities\") pod \"certified-operators-5r7q5\" (UID: \"d215184b-28f6-4404-83d3-c1ae2c78f789\") " pod="openshift-marketplace/certified-operators-5r7q5" Nov 27 17:04:52 crc kubenswrapper[4954]: I1127 17:04:52.113363 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d215184b-28f6-4404-83d3-c1ae2c78f789-catalog-content\") pod \"certified-operators-5r7q5\" (UID: \"d215184b-28f6-4404-83d3-c1ae2c78f789\") " pod="openshift-marketplace/certified-operators-5r7q5" Nov 27 17:04:52 crc kubenswrapper[4954]: I1127 17:04:52.113385 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzq5p\" (UniqueName: \"kubernetes.io/projected/d215184b-28f6-4404-83d3-c1ae2c78f789-kube-api-access-vzq5p\") pod \"certified-operators-5r7q5\" (UID: \"d215184b-28f6-4404-83d3-c1ae2c78f789\") " pod="openshift-marketplace/certified-operators-5r7q5" Nov 27 17:04:52 crc kubenswrapper[4954]: I1127 17:04:52.114699 4954 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d215184b-28f6-4404-83d3-c1ae2c78f789-catalog-content\") pod \"certified-operators-5r7q5\" (UID: \"d215184b-28f6-4404-83d3-c1ae2c78f789\") " pod="openshift-marketplace/certified-operators-5r7q5" Nov 27 17:04:52 crc kubenswrapper[4954]: I1127 17:04:52.115110 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d215184b-28f6-4404-83d3-c1ae2c78f789-utilities\") pod \"certified-operators-5r7q5\" (UID: \"d215184b-28f6-4404-83d3-c1ae2c78f789\") " pod="openshift-marketplace/certified-operators-5r7q5" Nov 27 17:04:52 crc kubenswrapper[4954]: I1127 17:04:52.140004 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzq5p\" (UniqueName: \"kubernetes.io/projected/d215184b-28f6-4404-83d3-c1ae2c78f789-kube-api-access-vzq5p\") pod \"certified-operators-5r7q5\" (UID: \"d215184b-28f6-4404-83d3-c1ae2c78f789\") " pod="openshift-marketplace/certified-operators-5r7q5" Nov 27 17:04:52 crc kubenswrapper[4954]: I1127 17:04:52.299893 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5r7q5" Nov 27 17:04:52 crc kubenswrapper[4954]: I1127 17:04:52.384094 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b2vzl"] Nov 27 17:04:52 crc kubenswrapper[4954]: W1127 17:04:52.393881 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod939a6c80_06c3_49eb_8224_f39cd7998055.slice/crio-47f821eb0674f595f3af1055c9bb1c3585fa8238a82d7417823347cfc9b5ccea WatchSource:0}: Error finding container 47f821eb0674f595f3af1055c9bb1c3585fa8238a82d7417823347cfc9b5ccea: Status 404 returned error can't find the container with id 47f821eb0674f595f3af1055c9bb1c3585fa8238a82d7417823347cfc9b5ccea Nov 27 17:04:52 crc kubenswrapper[4954]: I1127 17:04:52.878258 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5r7q5"] Nov 27 17:04:53 crc kubenswrapper[4954]: I1127 17:04:53.068961 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5r7q5" event={"ID":"d215184b-28f6-4404-83d3-c1ae2c78f789","Type":"ContainerStarted","Data":"295e3135fa044dc2b3e7efb68c8a2636412809f584b8bbdc6f73f9db2b347d7f"} Nov 27 17:04:53 crc kubenswrapper[4954]: I1127 17:04:53.071698 4954 generic.go:334] "Generic (PLEG): container finished" podID="939a6c80-06c3-49eb-8224-f39cd7998055" containerID="16de501602d2a7128e0b4b7c8c0964f110edbb2c5394124d289f72e2f80aa3a1" exitCode=0 Nov 27 17:04:53 crc kubenswrapper[4954]: I1127 17:04:53.071847 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b2vzl" event={"ID":"939a6c80-06c3-49eb-8224-f39cd7998055","Type":"ContainerDied","Data":"16de501602d2a7128e0b4b7c8c0964f110edbb2c5394124d289f72e2f80aa3a1"} Nov 27 17:04:53 crc kubenswrapper[4954]: I1127 17:04:53.071940 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b2vzl" event={"ID":"939a6c80-06c3-49eb-8224-f39cd7998055","Type":"ContainerStarted","Data":"47f821eb0674f595f3af1055c9bb1c3585fa8238a82d7417823347cfc9b5ccea"} Nov 27 17:04:53 crc kubenswrapper[4954]: I1127 17:04:53.076443 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwhw6" event={"ID":"d294865e-7999-4e81-818f-3a5db24b7f01","Type":"ContainerStarted","Data":"d37d6b75ccb132d76247234f9ae346a080654c7698167c4c86fa72c7aa4570ed"} Nov 27 17:04:53 crc kubenswrapper[4954]: I1127 17:04:53.121366 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwhw6" podStartSLOduration=2.5183676889999997 podStartE2EDuration="40.121340823s" podCreationTimestamp="2025-11-27 17:04:13 +0000 UTC" firstStartedPulling="2025-11-27 17:04:14.207051117 +0000 UTC m=+1566.224491417" lastFinishedPulling="2025-11-27 17:04:51.810024221 +0000 UTC m=+1603.827464551" observedRunningTime="2025-11-27 17:04:53.116007875 +0000 UTC m=+1605.133448185" watchObservedRunningTime="2025-11-27 17:04:53.121340823 +0000 UTC m=+1605.138781143" Nov 27 17:04:53 crc kubenswrapper[4954]: I1127 17:04:53.687736 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 17:04:53 crc kubenswrapper[4954]: I1127 17:04:53.688227 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 17:04:54 crc kubenswrapper[4954]: I1127 17:04:54.087720 4954 generic.go:334] "Generic (PLEG): container finished" podID="d215184b-28f6-4404-83d3-c1ae2c78f789" containerID="6e7413aadb5285b731bf044659905ca287a9536bf3e56ccb8b996e9579620330" exitCode=0 Nov 27 17:04:54 crc kubenswrapper[4954]: I1127 17:04:54.087776 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5r7q5" event={"ID":"d215184b-28f6-4404-83d3-c1ae2c78f789","Type":"ContainerDied","Data":"6e7413aadb5285b731bf044659905ca287a9536bf3e56ccb8b996e9579620330"} Nov 27 17:04:55 crc kubenswrapper[4954]: I1127 17:04:55.103662 4954 generic.go:334] "Generic (PLEG): container finished" podID="939a6c80-06c3-49eb-8224-f39cd7998055" containerID="2cb2bab21709abeb5013bb6321ee6acf51adb13a819758e623dc3cc541dcdb1d" exitCode=0 Nov 27 17:04:55 crc kubenswrapper[4954]: I1127 17:04:55.103677 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b2vzl" event={"ID":"939a6c80-06c3-49eb-8224-f39cd7998055","Type":"ContainerDied","Data":"2cb2bab21709abeb5013bb6321ee6acf51adb13a819758e623dc3cc541dcdb1d"} Nov 27 17:04:59 crc kubenswrapper[4954]: I1127 17:04:59.142146 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5r7q5" event={"ID":"d215184b-28f6-4404-83d3-c1ae2c78f789","Type":"ContainerStarted","Data":"79ff2f66c253bb94c6da5d07289766bdfc43de026c3aaf939d6244a86ab9cf81"} Nov 27 17:05:00 crc kubenswrapper[4954]: I1127 17:05:00.159162 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b2vzl" event={"ID":"939a6c80-06c3-49eb-8224-f39cd7998055","Type":"ContainerStarted","Data":"324367c1442d486ed48233eee347ab0369f28b09cc78ae7afd9a65f8fd9b20c7"} Nov 27 17:05:00 crc kubenswrapper[4954]: I1127 17:05:00.164148 4954 generic.go:334] "Generic (PLEG): container finished" 
podID="d215184b-28f6-4404-83d3-c1ae2c78f789" containerID="79ff2f66c253bb94c6da5d07289766bdfc43de026c3aaf939d6244a86ab9cf81" exitCode=0 Nov 27 17:05:00 crc kubenswrapper[4954]: I1127 17:05:00.164179 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5r7q5" event={"ID":"d215184b-28f6-4404-83d3-c1ae2c78f789","Type":"ContainerDied","Data":"79ff2f66c253bb94c6da5d07289766bdfc43de026c3aaf939d6244a86ab9cf81"} Nov 27 17:05:00 crc kubenswrapper[4954]: I1127 17:05:00.193161 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b2vzl" podStartSLOduration=4.878682798 podStartE2EDuration="11.193136743s" podCreationTimestamp="2025-11-27 17:04:49 +0000 UTC" firstStartedPulling="2025-11-27 17:04:53.074004065 +0000 UTC m=+1605.091444375" lastFinishedPulling="2025-11-27 17:04:59.38845802 +0000 UTC m=+1611.405898320" observedRunningTime="2025-11-27 17:05:00.177939373 +0000 UTC m=+1612.195379663" watchObservedRunningTime="2025-11-27 17:05:00.193136743 +0000 UTC m=+1612.210577043" Nov 27 17:05:03 crc kubenswrapper[4954]: I1127 17:05:03.192935 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5r7q5" event={"ID":"d215184b-28f6-4404-83d3-c1ae2c78f789","Type":"ContainerStarted","Data":"b7251c08e7f13963283e0e97cff136a581062a08250df0fcf78b8a94988f13ff"} Nov 27 17:05:03 crc kubenswrapper[4954]: I1127 17:05:03.224032 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5r7q5" podStartSLOduration=3.735226701 podStartE2EDuration="12.224009658s" podCreationTimestamp="2025-11-27 17:04:51 +0000 UTC" firstStartedPulling="2025-11-27 17:04:54.097274224 +0000 UTC m=+1606.114714514" lastFinishedPulling="2025-11-27 17:05:02.586057171 +0000 UTC m=+1614.603497471" observedRunningTime="2025-11-27 17:05:03.21422955 +0000 UTC m=+1615.231669890" watchObservedRunningTime="2025-11-27 17:05:03.224009658 +0000 UTC m=+1615.241449968" Nov 27 17:05:05 crc kubenswrapper[4954]: I1127 17:05:05.214109 4954 generic.go:334] "Generic (PLEG): container finished" podID="d294865e-7999-4e81-818f-3a5db24b7f01" containerID="d37d6b75ccb132d76247234f9ae346a080654c7698167c4c86fa72c7aa4570ed" exitCode=0 Nov 27 17:05:05 crc kubenswrapper[4954]: I1127 17:05:05.214282 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwhw6" event={"ID":"d294865e-7999-4e81-818f-3a5db24b7f01","Type":"ContainerDied","Data":"d37d6b75ccb132d76247234f9ae346a080654c7698167c4c86fa72c7aa4570ed"} Nov 27 17:05:06 crc kubenswrapper[4954]: I1127 17:05:06.616474 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwhw6" Nov 27 17:05:06 crc kubenswrapper[4954]: I1127 17:05:06.735435 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d294865e-7999-4e81-818f-3a5db24b7f01-repo-setup-combined-ca-bundle\") pod \"d294865e-7999-4e81-818f-3a5db24b7f01\" (UID: \"d294865e-7999-4e81-818f-3a5db24b7f01\") " Nov 27 17:05:06 crc kubenswrapper[4954]: I1127 17:05:06.735483 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bqw5\" (UniqueName: \"kubernetes.io/projected/d294865e-7999-4e81-818f-3a5db24b7f01-kube-api-access-5bqw5\") pod \"d294865e-7999-4e81-818f-3a5db24b7f01\" (UID: \"d294865e-7999-4e81-818f-3a5db24b7f01\") " Nov 27 17:05:06 crc kubenswrapper[4954]: I1127 17:05:06.735623 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d294865e-7999-4e81-818f-3a5db24b7f01-inventory\") pod \"d294865e-7999-4e81-818f-3a5db24b7f01\" (UID: \"d294865e-7999-4e81-818f-3a5db24b7f01\") " Nov 27 17:05:06 crc kubenswrapper[4954]: I1127 17:05:06.735650 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d294865e-7999-4e81-818f-3a5db24b7f01-ssh-key\") pod \"d294865e-7999-4e81-818f-3a5db24b7f01\" (UID: \"d294865e-7999-4e81-818f-3a5db24b7f01\") " Nov 27 17:05:06 crc kubenswrapper[4954]: I1127 17:05:06.742317 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d294865e-7999-4e81-818f-3a5db24b7f01-kube-api-access-5bqw5" (OuterVolumeSpecName: "kube-api-access-5bqw5") pod "d294865e-7999-4e81-818f-3a5db24b7f01" (UID: "d294865e-7999-4e81-818f-3a5db24b7f01"). InnerVolumeSpecName "kube-api-access-5bqw5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:05:06 crc kubenswrapper[4954]: I1127 17:05:06.748724 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d294865e-7999-4e81-818f-3a5db24b7f01-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "d294865e-7999-4e81-818f-3a5db24b7f01" (UID: "d294865e-7999-4e81-818f-3a5db24b7f01"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:05:06 crc kubenswrapper[4954]: I1127 17:05:06.773558 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d294865e-7999-4e81-818f-3a5db24b7f01-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d294865e-7999-4e81-818f-3a5db24b7f01" (UID: "d294865e-7999-4e81-818f-3a5db24b7f01"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:05:06 crc kubenswrapper[4954]: I1127 17:05:06.774139 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d294865e-7999-4e81-818f-3a5db24b7f01-inventory" (OuterVolumeSpecName: "inventory") pod "d294865e-7999-4e81-818f-3a5db24b7f01" (UID: "d294865e-7999-4e81-818f-3a5db24b7f01"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:05:06 crc kubenswrapper[4954]: I1127 17:05:06.839487 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d294865e-7999-4e81-818f-3a5db24b7f01-inventory\") on node \"crc\" DevicePath \"\"" Nov 27 17:05:06 crc kubenswrapper[4954]: I1127 17:05:06.839867 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d294865e-7999-4e81-818f-3a5db24b7f01-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 17:05:06 crc kubenswrapper[4954]: I1127 17:05:06.839888 4954 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d294865e-7999-4e81-818f-3a5db24b7f01-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 17:05:06 crc kubenswrapper[4954]: I1127 17:05:06.839923 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bqw5\" (UniqueName: \"kubernetes.io/projected/d294865e-7999-4e81-818f-3a5db24b7f01-kube-api-access-5bqw5\") on node \"crc\" DevicePath \"\"" Nov 27 17:05:07 crc kubenswrapper[4954]: I1127 17:05:07.241527 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwhw6" event={"ID":"d294865e-7999-4e81-818f-3a5db24b7f01","Type":"ContainerDied","Data":"d00da28c6faaea5761f5f5cb6003938c568bd0901d1c126f2417833f2894f2df"} Nov 27 17:05:07 crc kubenswrapper[4954]: I1127 17:05:07.241656 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d00da28c6faaea5761f5f5cb6003938c568bd0901d1c126f2417833f2894f2df" Nov 27 17:05:07 crc kubenswrapper[4954]: I1127 17:05:07.241725 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwhw6" Nov 27 17:05:07 crc kubenswrapper[4954]: I1127 17:05:07.314769 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-qvnb8"] Nov 27 17:05:07 crc kubenswrapper[4954]: E1127 17:05:07.315171 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d294865e-7999-4e81-818f-3a5db24b7f01" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Nov 27 17:05:07 crc kubenswrapper[4954]: I1127 17:05:07.315192 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="d294865e-7999-4e81-818f-3a5db24b7f01" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Nov 27 17:05:07 crc kubenswrapper[4954]: I1127 17:05:07.315373 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="d294865e-7999-4e81-818f-3a5db24b7f01" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Nov 27 17:05:07 crc kubenswrapper[4954]: I1127 17:05:07.316117 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qvnb8" Nov 27 17:05:07 crc kubenswrapper[4954]: I1127 17:05:07.317863 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 17:05:07 crc kubenswrapper[4954]: I1127 17:05:07.318052 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 17:05:07 crc kubenswrapper[4954]: I1127 17:05:07.319429 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lnfbp" Nov 27 17:05:07 crc kubenswrapper[4954]: I1127 17:05:07.319629 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 17:05:07 crc kubenswrapper[4954]: I1127 17:05:07.329550 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-qvnb8"] Nov 27 17:05:07 crc kubenswrapper[4954]: I1127 17:05:07.458293 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39bece64-6033-4ca3-846d-6718f68f1f6d-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qvnb8\" (UID: \"39bece64-6033-4ca3-846d-6718f68f1f6d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qvnb8" Nov 27 17:05:07 crc kubenswrapper[4954]: I1127 17:05:07.458366 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h82r4\" (UniqueName: \"kubernetes.io/projected/39bece64-6033-4ca3-846d-6718f68f1f6d-kube-api-access-h82r4\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qvnb8\" (UID: \"39bece64-6033-4ca3-846d-6718f68f1f6d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qvnb8" Nov 27 17:05:07 crc kubenswrapper[4954]: I1127 17:05:07.458672 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/39bece64-6033-4ca3-846d-6718f68f1f6d-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qvnb8\" (UID: \"39bece64-6033-4ca3-846d-6718f68f1f6d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qvnb8" Nov 27 17:05:07 crc kubenswrapper[4954]: I1127 17:05:07.560226 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39bece64-6033-4ca3-846d-6718f68f1f6d-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qvnb8\" (UID: \"39bece64-6033-4ca3-846d-6718f68f1f6d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qvnb8" Nov 27 17:05:07 crc kubenswrapper[4954]: I1127 17:05:07.560305 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h82r4\" (UniqueName: \"kubernetes.io/projected/39bece64-6033-4ca3-846d-6718f68f1f6d-kube-api-access-h82r4\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qvnb8\" (UID: \"39bece64-6033-4ca3-846d-6718f68f1f6d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qvnb8" Nov 27 17:05:07 crc kubenswrapper[4954]: I1127 17:05:07.560400 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/39bece64-6033-4ca3-846d-6718f68f1f6d-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qvnb8\" (UID: \"39bece64-6033-4ca3-846d-6718f68f1f6d\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qvnb8" Nov 27 17:05:07 crc kubenswrapper[4954]: I1127 17:05:07.564347 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/39bece64-6033-4ca3-846d-6718f68f1f6d-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qvnb8\" (UID: \"39bece64-6033-4ca3-846d-6718f68f1f6d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qvnb8" Nov 27 17:05:07 crc kubenswrapper[4954]: I1127 17:05:07.564464 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39bece64-6033-4ca3-846d-6718f68f1f6d-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qvnb8\" (UID: \"39bece64-6033-4ca3-846d-6718f68f1f6d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qvnb8" Nov 27 17:05:07 crc kubenswrapper[4954]: I1127 17:05:07.577843 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h82r4\" (UniqueName: \"kubernetes.io/projected/39bece64-6033-4ca3-846d-6718f68f1f6d-kube-api-access-h82r4\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qvnb8\" (UID: \"39bece64-6033-4ca3-846d-6718f68f1f6d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qvnb8" Nov 27 17:05:07 crc kubenswrapper[4954]: I1127 17:05:07.634391 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qvnb8" Nov 27 17:05:08 crc kubenswrapper[4954]: W1127 17:05:08.139416 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39bece64_6033_4ca3_846d_6718f68f1f6d.slice/crio-17978ec81e9a7b2374bfcb68e585cf1f75d6ccb38e91fc0f696e8c6f64190a93 WatchSource:0}: Error finding container 17978ec81e9a7b2374bfcb68e585cf1f75d6ccb38e91fc0f696e8c6f64190a93: Status 404 returned error can't find the container with id 17978ec81e9a7b2374bfcb68e585cf1f75d6ccb38e91fc0f696e8c6f64190a93 Nov 27 17:05:08 crc kubenswrapper[4954]: I1127 17:05:08.139548 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-qvnb8"] Nov 27 17:05:08 crc kubenswrapper[4954]: I1127 17:05:08.255439 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qvnb8" event={"ID":"39bece64-6033-4ca3-846d-6718f68f1f6d","Type":"ContainerStarted","Data":"17978ec81e9a7b2374bfcb68e585cf1f75d6ccb38e91fc0f696e8c6f64190a93"} Nov 27 17:05:09 crc kubenswrapper[4954]: I1127 17:05:09.265670 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qvnb8" event={"ID":"39bece64-6033-4ca3-846d-6718f68f1f6d","Type":"ContainerStarted","Data":"1310dbf433d622eccf4a32c09199ca54f22e90e2ea8c35589f1da4181b2c3e86"} Nov 27 17:05:09 crc kubenswrapper[4954]: I1127 17:05:09.287082 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qvnb8" podStartSLOduration=2.088913567 podStartE2EDuration="2.287058097s" podCreationTimestamp="2025-11-27 17:05:07 +0000 UTC" firstStartedPulling="2025-11-27 17:05:08.144190274 +0000 UTC m=+1620.161630574" lastFinishedPulling="2025-11-27 17:05:08.342334804 +0000 UTC m=+1620.359775104" observedRunningTime="2025-11-27 17:05:09.277504925 +0000 UTC m=+1621.294945235" watchObservedRunningTime="2025-11-27 17:05:09.287058097 +0000 UTC 
m=+1621.304498417" Nov 27 17:05:09 crc kubenswrapper[4954]: I1127 17:05:09.858452 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b2vzl" Nov 27 17:05:09 crc kubenswrapper[4954]: I1127 17:05:09.858521 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b2vzl" Nov 27 17:05:09 crc kubenswrapper[4954]: I1127 17:05:09.916211 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b2vzl" Nov 27 17:05:10 crc kubenswrapper[4954]: I1127 17:05:10.345689 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b2vzl" Nov 27 17:05:10 crc kubenswrapper[4954]: I1127 17:05:10.405403 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b2vzl"] Nov 27 17:05:11 crc kubenswrapper[4954]: I1127 17:05:11.288463 4954 generic.go:334] "Generic (PLEG): container finished" podID="39bece64-6033-4ca3-846d-6718f68f1f6d" containerID="1310dbf433d622eccf4a32c09199ca54f22e90e2ea8c35589f1da4181b2c3e86" exitCode=0 Nov 27 17:05:11 crc kubenswrapper[4954]: I1127 17:05:11.288562 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qvnb8" event={"ID":"39bece64-6033-4ca3-846d-6718f68f1f6d","Type":"ContainerDied","Data":"1310dbf433d622eccf4a32c09199ca54f22e90e2ea8c35589f1da4181b2c3e86"} Nov 27 17:05:12 crc kubenswrapper[4954]: I1127 17:05:12.297108 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-b2vzl" podUID="939a6c80-06c3-49eb-8224-f39cd7998055" containerName="registry-server" containerID="cri-o://324367c1442d486ed48233eee347ab0369f28b09cc78ae7afd9a65f8fd9b20c7" gracePeriod=2 Nov 27 17:05:12 crc kubenswrapper[4954]: I1127 17:05:12.300303 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5r7q5" Nov 27 17:05:12 crc kubenswrapper[4954]: I1127 17:05:12.300375 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5r7q5" Nov 27 17:05:12 crc kubenswrapper[4954]: I1127 17:05:12.354421 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5r7q5" Nov 27 17:05:12 crc kubenswrapper[4954]: I1127 17:05:12.738993 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qvnb8" Nov 27 17:05:12 crc kubenswrapper[4954]: I1127 17:05:12.746960 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b2vzl" Nov 27 17:05:12 crc kubenswrapper[4954]: I1127 17:05:12.867946 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39bece64-6033-4ca3-846d-6718f68f1f6d-inventory\") pod \"39bece64-6033-4ca3-846d-6718f68f1f6d\" (UID: \"39bece64-6033-4ca3-846d-6718f68f1f6d\") " Nov 27 17:05:12 crc kubenswrapper[4954]: I1127 17:05:12.868196 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/939a6c80-06c3-49eb-8224-f39cd7998055-catalog-content\") pod \"939a6c80-06c3-49eb-8224-f39cd7998055\" (UID: \"939a6c80-06c3-49eb-8224-f39cd7998055\") " Nov 27 17:05:12 crc kubenswrapper[4954]: I1127 17:05:12.868224 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/39bece64-6033-4ca3-846d-6718f68f1f6d-ssh-key\") pod \"39bece64-6033-4ca3-846d-6718f68f1f6d\" (UID: \"39bece64-6033-4ca3-846d-6718f68f1f6d\") " Nov 27 17:05:12 crc kubenswrapper[4954]: I1127 17:05:12.868293 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/939a6c80-06c3-49eb-8224-f39cd7998055-utilities\") pod \"939a6c80-06c3-49eb-8224-f39cd7998055\" (UID: \"939a6c80-06c3-49eb-8224-f39cd7998055\") " Nov 27 17:05:12 crc kubenswrapper[4954]: I1127 17:05:12.868318 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9p9n9\" (UniqueName: \"kubernetes.io/projected/939a6c80-06c3-49eb-8224-f39cd7998055-kube-api-access-9p9n9\") pod \"939a6c80-06c3-49eb-8224-f39cd7998055\" (UID: \"939a6c80-06c3-49eb-8224-f39cd7998055\") " Nov 27 17:05:12 crc kubenswrapper[4954]: I1127 17:05:12.868338 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h82r4\" (UniqueName: \"kubernetes.io/projected/39bece64-6033-4ca3-846d-6718f68f1f6d-kube-api-access-h82r4\") pod \"39bece64-6033-4ca3-846d-6718f68f1f6d\" (UID: \"39bece64-6033-4ca3-846d-6718f68f1f6d\") " Nov 27 17:05:12 crc kubenswrapper[4954]: I1127 17:05:12.869227 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/939a6c80-06c3-49eb-8224-f39cd7998055-utilities" (OuterVolumeSpecName: "utilities") pod "939a6c80-06c3-49eb-8224-f39cd7998055" (UID: "939a6c80-06c3-49eb-8224-f39cd7998055"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:05:12 crc kubenswrapper[4954]: I1127 17:05:12.874111 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/939a6c80-06c3-49eb-8224-f39cd7998055-kube-api-access-9p9n9" (OuterVolumeSpecName: "kube-api-access-9p9n9") pod "939a6c80-06c3-49eb-8224-f39cd7998055" (UID: "939a6c80-06c3-49eb-8224-f39cd7998055"). InnerVolumeSpecName "kube-api-access-9p9n9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:05:12 crc kubenswrapper[4954]: I1127 17:05:12.874158 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39bece64-6033-4ca3-846d-6718f68f1f6d-kube-api-access-h82r4" (OuterVolumeSpecName: "kube-api-access-h82r4") pod "39bece64-6033-4ca3-846d-6718f68f1f6d" (UID: "39bece64-6033-4ca3-846d-6718f68f1f6d"). InnerVolumeSpecName "kube-api-access-h82r4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:05:12 crc kubenswrapper[4954]: I1127 17:05:12.897421 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39bece64-6033-4ca3-846d-6718f68f1f6d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "39bece64-6033-4ca3-846d-6718f68f1f6d" (UID: "39bece64-6033-4ca3-846d-6718f68f1f6d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:05:12 crc kubenswrapper[4954]: I1127 17:05:12.898286 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39bece64-6033-4ca3-846d-6718f68f1f6d-inventory" (OuterVolumeSpecName: "inventory") pod "39bece64-6033-4ca3-846d-6718f68f1f6d" (UID: "39bece64-6033-4ca3-846d-6718f68f1f6d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:05:12 crc kubenswrapper[4954]: I1127 17:05:12.918615 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/939a6c80-06c3-49eb-8224-f39cd7998055-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "939a6c80-06c3-49eb-8224-f39cd7998055" (UID: "939a6c80-06c3-49eb-8224-f39cd7998055"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:05:12 crc kubenswrapper[4954]: I1127 17:05:12.971328 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/939a6c80-06c3-49eb-8224-f39cd7998055-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 17:05:12 crc kubenswrapper[4954]: I1127 17:05:12.971631 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/39bece64-6033-4ca3-846d-6718f68f1f6d-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 17:05:12 crc kubenswrapper[4954]: I1127 17:05:12.971692 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/939a6c80-06c3-49eb-8224-f39cd7998055-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 17:05:12 crc kubenswrapper[4954]: I1127 17:05:12.971755 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9p9n9\" (UniqueName: \"kubernetes.io/projected/939a6c80-06c3-49eb-8224-f39cd7998055-kube-api-access-9p9n9\") on node \"crc\" DevicePath \"\"" Nov 27 17:05:12 crc kubenswrapper[4954]: I1127 17:05:12.971809 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h82r4\" (UniqueName: \"kubernetes.io/projected/39bece64-6033-4ca3-846d-6718f68f1f6d-kube-api-access-h82r4\") on node \"crc\" DevicePath \"\"" Nov 27 17:05:12 crc kubenswrapper[4954]: I1127 17:05:12.971865 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39bece64-6033-4ca3-846d-6718f68f1f6d-inventory\") on node \"crc\" DevicePath \"\"" Nov 27 17:05:13 crc kubenswrapper[4954]: I1127 17:05:13.306654 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qvnb8" event={"ID":"39bece64-6033-4ca3-846d-6718f68f1f6d","Type":"ContainerDied","Data":"17978ec81e9a7b2374bfcb68e585cf1f75d6ccb38e91fc0f696e8c6f64190a93"} Nov 27 17:05:13 crc kubenswrapper[4954]: I1127 17:05:13.306701 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17978ec81e9a7b2374bfcb68e585cf1f75d6ccb38e91fc0f696e8c6f64190a93" Nov 27 17:05:13 crc kubenswrapper[4954]: I1127 17:05:13.306710 4954 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qvnb8" Nov 27 17:05:13 crc kubenswrapper[4954]: I1127 17:05:13.309729 4954 generic.go:334] "Generic (PLEG): container finished" podID="939a6c80-06c3-49eb-8224-f39cd7998055" containerID="324367c1442d486ed48233eee347ab0369f28b09cc78ae7afd9a65f8fd9b20c7" exitCode=0 Nov 27 17:05:13 crc kubenswrapper[4954]: I1127 17:05:13.309773 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b2vzl" event={"ID":"939a6c80-06c3-49eb-8224-f39cd7998055","Type":"ContainerDied","Data":"324367c1442d486ed48233eee347ab0369f28b09cc78ae7afd9a65f8fd9b20c7"} Nov 27 17:05:13 crc kubenswrapper[4954]: I1127 17:05:13.309815 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b2vzl" event={"ID":"939a6c80-06c3-49eb-8224-f39cd7998055","Type":"ContainerDied","Data":"47f821eb0674f595f3af1055c9bb1c3585fa8238a82d7417823347cfc9b5ccea"} Nov 27 17:05:13 crc kubenswrapper[4954]: I1127 17:05:13.309842 4954 scope.go:117] "RemoveContainer" containerID="324367c1442d486ed48233eee347ab0369f28b09cc78ae7afd9a65f8fd9b20c7" Nov 27 17:05:13 crc kubenswrapper[4954]: I1127 17:05:13.310032 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b2vzl" Nov 27 17:05:13 crc kubenswrapper[4954]: I1127 17:05:13.361152 4954 scope.go:117] "RemoveContainer" containerID="2cb2bab21709abeb5013bb6321ee6acf51adb13a819758e623dc3cc541dcdb1d" Nov 27 17:05:13 crc kubenswrapper[4954]: I1127 17:05:13.374280 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b2vzl"] Nov 27 17:05:13 crc kubenswrapper[4954]: I1127 17:05:13.375310 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5r7q5" Nov 27 17:05:13 crc kubenswrapper[4954]: I1127 17:05:13.388352 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-b2vzl"] Nov 27 17:05:13 crc kubenswrapper[4954]: I1127 17:05:13.401115 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bz8ns"] Nov 27 17:05:13 crc kubenswrapper[4954]: E1127 17:05:13.401478 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="939a6c80-06c3-49eb-8224-f39cd7998055" containerName="registry-server" Nov 27 17:05:13 crc kubenswrapper[4954]: I1127 17:05:13.401496 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="939a6c80-06c3-49eb-8224-f39cd7998055" containerName="registry-server" Nov 27 17:05:13 crc kubenswrapper[4954]: E1127 17:05:13.401509 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="939a6c80-06c3-49eb-8224-f39cd7998055" containerName="extract-content" Nov 27 17:05:13 crc kubenswrapper[4954]: I1127 17:05:13.401514 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="939a6c80-06c3-49eb-8224-f39cd7998055" containerName="extract-content" Nov 27 17:05:13 crc kubenswrapper[4954]: E1127 17:05:13.401539 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="939a6c80-06c3-49eb-8224-f39cd7998055" containerName="extract-utilities" Nov 27 17:05:13 crc kubenswrapper[4954]: I1127 17:05:13.401546 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="939a6c80-06c3-49eb-8224-f39cd7998055" containerName="extract-utilities" Nov 27 17:05:13 crc kubenswrapper[4954]: 
E1127 17:05:13.401562 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39bece64-6033-4ca3-846d-6718f68f1f6d" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Nov 27 17:05:13 crc kubenswrapper[4954]: I1127 17:05:13.401570 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="39bece64-6033-4ca3-846d-6718f68f1f6d" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Nov 27 17:05:13 crc kubenswrapper[4954]: I1127 17:05:13.401762 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="939a6c80-06c3-49eb-8224-f39cd7998055" containerName="registry-server" Nov 27 17:05:13 crc kubenswrapper[4954]: I1127 17:05:13.401784 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="39bece64-6033-4ca3-846d-6718f68f1f6d" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Nov 27 17:05:13 crc kubenswrapper[4954]: I1127 17:05:13.402467 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bz8ns" Nov 27 17:05:13 crc kubenswrapper[4954]: I1127 17:05:13.404390 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lnfbp" Nov 27 17:05:13 crc kubenswrapper[4954]: I1127 17:05:13.404792 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 17:05:13 crc kubenswrapper[4954]: I1127 17:05:13.404904 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 17:05:13 crc kubenswrapper[4954]: I1127 17:05:13.404972 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 17:05:13 crc kubenswrapper[4954]: I1127 17:05:13.410768 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bz8ns"] Nov 27 17:05:13 crc kubenswrapper[4954]: I1127 17:05:13.440380 4954 scope.go:117] "RemoveContainer" containerID="16de501602d2a7128e0b4b7c8c0964f110edbb2c5394124d289f72e2f80aa3a1" Nov 27 17:05:13 crc kubenswrapper[4954]: I1127 17:05:13.497812 4954 scope.go:117] "RemoveContainer" containerID="324367c1442d486ed48233eee347ab0369f28b09cc78ae7afd9a65f8fd9b20c7" Nov 27 17:05:13 crc kubenswrapper[4954]: E1127 17:05:13.498351 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"324367c1442d486ed48233eee347ab0369f28b09cc78ae7afd9a65f8fd9b20c7\": container with ID starting with 324367c1442d486ed48233eee347ab0369f28b09cc78ae7afd9a65f8fd9b20c7 not found: ID does not exist" containerID="324367c1442d486ed48233eee347ab0369f28b09cc78ae7afd9a65f8fd9b20c7" Nov 27 17:05:13 crc kubenswrapper[4954]: I1127 17:05:13.498387 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"324367c1442d486ed48233eee347ab0369f28b09cc78ae7afd9a65f8fd9b20c7"} err="failed to get container status \"324367c1442d486ed48233eee347ab0369f28b09cc78ae7afd9a65f8fd9b20c7\": rpc error: code = NotFound desc = could not find container \"324367c1442d486ed48233eee347ab0369f28b09cc78ae7afd9a65f8fd9b20c7\": container with ID starting with 324367c1442d486ed48233eee347ab0369f28b09cc78ae7afd9a65f8fd9b20c7 not found: ID does not exist" Nov 27 17:05:13 crc kubenswrapper[4954]: I1127 17:05:13.498412 4954 scope.go:117] "RemoveContainer" containerID="2cb2bab21709abeb5013bb6321ee6acf51adb13a819758e623dc3cc541dcdb1d" Nov 
27 17:05:13 crc kubenswrapper[4954]: E1127 17:05:13.499023 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cb2bab21709abeb5013bb6321ee6acf51adb13a819758e623dc3cc541dcdb1d\": container with ID starting with 2cb2bab21709abeb5013bb6321ee6acf51adb13a819758e623dc3cc541dcdb1d not found: ID does not exist" containerID="2cb2bab21709abeb5013bb6321ee6acf51adb13a819758e623dc3cc541dcdb1d" Nov 27 17:05:13 crc kubenswrapper[4954]: I1127 17:05:13.499049 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cb2bab21709abeb5013bb6321ee6acf51adb13a819758e623dc3cc541dcdb1d"} err="failed to get container status \"2cb2bab21709abeb5013bb6321ee6acf51adb13a819758e623dc3cc541dcdb1d\": rpc error: code = NotFound desc = could not find container \"2cb2bab21709abeb5013bb6321ee6acf51adb13a819758e623dc3cc541dcdb1d\": container with ID starting with 2cb2bab21709abeb5013bb6321ee6acf51adb13a819758e623dc3cc541dcdb1d not found: ID does not exist" Nov 27 17:05:13 crc kubenswrapper[4954]: I1127 17:05:13.499066 4954 scope.go:117] "RemoveContainer" containerID="16de501602d2a7128e0b4b7c8c0964f110edbb2c5394124d289f72e2f80aa3a1" Nov 27 17:05:13 crc kubenswrapper[4954]: E1127 17:05:13.499400 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16de501602d2a7128e0b4b7c8c0964f110edbb2c5394124d289f72e2f80aa3a1\": container with ID starting with 16de501602d2a7128e0b4b7c8c0964f110edbb2c5394124d289f72e2f80aa3a1 not found: ID does not exist" containerID="16de501602d2a7128e0b4b7c8c0964f110edbb2c5394124d289f72e2f80aa3a1" Nov 27 17:05:13 crc kubenswrapper[4954]: I1127 17:05:13.499443 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16de501602d2a7128e0b4b7c8c0964f110edbb2c5394124d289f72e2f80aa3a1"} err="failed to get container status \"16de501602d2a7128e0b4b7c8c0964f110edbb2c5394124d289f72e2f80aa3a1\": rpc error: code = NotFound desc = could not find container \"16de501602d2a7128e0b4b7c8c0964f110edbb2c5394124d289f72e2f80aa3a1\": container with ID starting with 16de501602d2a7128e0b4b7c8c0964f110edbb2c5394124d289f72e2f80aa3a1 not found: ID does not exist" Nov 27 17:05:13 crc kubenswrapper[4954]: I1127 17:05:13.582411 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98ad0395-6bb9-46b3-a81b-3f4b1c2dad04-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bz8ns\" (UID: \"98ad0395-6bb9-46b3-a81b-3f4b1c2dad04\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bz8ns" Nov 27 17:05:13 crc kubenswrapper[4954]: I1127 17:05:13.582448 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/98ad0395-6bb9-46b3-a81b-3f4b1c2dad04-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bz8ns\" (UID: \"98ad0395-6bb9-46b3-a81b-3f4b1c2dad04\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bz8ns" Nov 27 17:05:13 crc kubenswrapper[4954]: I1127 17:05:13.582488 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98ad0395-6bb9-46b3-a81b-3f4b1c2dad04-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bz8ns\" (UID: 
\"98ad0395-6bb9-46b3-a81b-3f4b1c2dad04\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bz8ns" Nov 27 17:05:13 crc kubenswrapper[4954]: I1127 17:05:13.582574 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cb6v\" (UniqueName: \"kubernetes.io/projected/98ad0395-6bb9-46b3-a81b-3f4b1c2dad04-kube-api-access-8cb6v\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bz8ns\" (UID: \"98ad0395-6bb9-46b3-a81b-3f4b1c2dad04\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bz8ns" Nov 27 17:05:13 crc kubenswrapper[4954]: I1127 17:05:13.683741 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98ad0395-6bb9-46b3-a81b-3f4b1c2dad04-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bz8ns\" (UID: \"98ad0395-6bb9-46b3-a81b-3f4b1c2dad04\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bz8ns" Nov 27 17:05:13 crc kubenswrapper[4954]: I1127 17:05:13.684119 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/98ad0395-6bb9-46b3-a81b-3f4b1c2dad04-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bz8ns\" (UID: \"98ad0395-6bb9-46b3-a81b-3f4b1c2dad04\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bz8ns" Nov 27 17:05:13 crc kubenswrapper[4954]: I1127 17:05:13.684162 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98ad0395-6bb9-46b3-a81b-3f4b1c2dad04-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bz8ns\" (UID: \"98ad0395-6bb9-46b3-a81b-3f4b1c2dad04\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bz8ns" Nov 27 17:05:13 crc kubenswrapper[4954]: I1127 17:05:13.684251 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cb6v\" (UniqueName: \"kubernetes.io/projected/98ad0395-6bb9-46b3-a81b-3f4b1c2dad04-kube-api-access-8cb6v\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bz8ns\" (UID: \"98ad0395-6bb9-46b3-a81b-3f4b1c2dad04\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bz8ns" Nov 27 17:05:13 crc kubenswrapper[4954]: I1127 17:05:13.688067 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98ad0395-6bb9-46b3-a81b-3f4b1c2dad04-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bz8ns\" (UID: \"98ad0395-6bb9-46b3-a81b-3f4b1c2dad04\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bz8ns" Nov 27 17:05:13 crc kubenswrapper[4954]: I1127 17:05:13.688175 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/98ad0395-6bb9-46b3-a81b-3f4b1c2dad04-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bz8ns\" (UID: \"98ad0395-6bb9-46b3-a81b-3f4b1c2dad04\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bz8ns" Nov 27 17:05:13 crc kubenswrapper[4954]: I1127 17:05:13.688621 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98ad0395-6bb9-46b3-a81b-3f4b1c2dad04-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bz8ns\" (UID: \"98ad0395-6bb9-46b3-a81b-3f4b1c2dad04\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bz8ns" Nov 27 17:05:13 crc kubenswrapper[4954]: I1127 17:05:13.704458 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cb6v\" (UniqueName: \"kubernetes.io/projected/98ad0395-6bb9-46b3-a81b-3f4b1c2dad04-kube-api-access-8cb6v\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bz8ns\" (UID: \"98ad0395-6bb9-46b3-a81b-3f4b1c2dad04\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bz8ns" Nov 27 17:05:13 crc kubenswrapper[4954]: I1127 17:05:13.820973 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bz8ns" Nov 27 17:05:14 crc kubenswrapper[4954]: I1127 17:05:14.319246 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bz8ns"] Nov 27 17:05:14 crc kubenswrapper[4954]: I1127 17:05:14.675526 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="939a6c80-06c3-49eb-8224-f39cd7998055" path="/var/lib/kubelet/pods/939a6c80-06c3-49eb-8224-f39cd7998055/volumes" Nov 27 17:05:15 crc kubenswrapper[4954]: I1127 17:05:15.331310 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bz8ns" event={"ID":"98ad0395-6bb9-46b3-a81b-3f4b1c2dad04","Type":"ContainerStarted","Data":"047bd7285343a55e06e6be6b14f1cd542a5595b8eeedc1afb780ad7adc0a0cd6"} Nov 27 17:05:15 crc kubenswrapper[4954]: I1127 17:05:15.331371 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bz8ns" event={"ID":"98ad0395-6bb9-46b3-a81b-3f4b1c2dad04","Type":"ContainerStarted","Data":"24416b92e5dd6f0a91ffd995ebdab4468463122b866d25062458b70339f06663"} Nov 27 17:05:15 crc kubenswrapper[4954]: I1127 17:05:15.749352 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bz8ns" podStartSLOduration=2.490381814 podStartE2EDuration="2.74932916s" podCreationTimestamp="2025-11-27 17:05:13 +0000 UTC" firstStartedPulling="2025-11-27 17:05:14.32321423 +0000 UTC m=+1626.340654530" lastFinishedPulling="2025-11-27 17:05:14.582161576 +0000 UTC m=+1626.599601876" observedRunningTime="2025-11-27 17:05:15.354892534 +0000 UTC m=+1627.372332834" watchObservedRunningTime="2025-11-27 17:05:15.74932916 +0000 UTC m=+1627.766769490" Nov 27 17:05:15 crc kubenswrapper[4954]: I1127 17:05:15.765986 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5r7q5"] Nov 27 17:05:15 crc kubenswrapper[4954]: I1127 17:05:15.766664 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5r7q5" podUID="d215184b-28f6-4404-83d3-c1ae2c78f789" containerName="registry-server" containerID="cri-o://b7251c08e7f13963283e0e97cff136a581062a08250df0fcf78b8a94988f13ff" gracePeriod=2 Nov 27 17:05:16 crc kubenswrapper[4954]: I1127 17:05:16.345640 4954 generic.go:334] "Generic (PLEG): container finished" podID="d215184b-28f6-4404-83d3-c1ae2c78f789" containerID="b7251c08e7f13963283e0e97cff136a581062a08250df0fcf78b8a94988f13ff" exitCode=0 Nov 27 17:05:16 crc kubenswrapper[4954]: I1127 17:05:16.346248 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5r7q5" 
event={"ID":"d215184b-28f6-4404-83d3-c1ae2c78f789","Type":"ContainerDied","Data":"b7251c08e7f13963283e0e97cff136a581062a08250df0fcf78b8a94988f13ff"} Nov 27 17:05:16 crc kubenswrapper[4954]: I1127 17:05:16.750883 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5r7q5" Nov 27 17:05:16 crc kubenswrapper[4954]: I1127 17:05:16.852468 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d215184b-28f6-4404-83d3-c1ae2c78f789-utilities\") pod \"d215184b-28f6-4404-83d3-c1ae2c78f789\" (UID: \"d215184b-28f6-4404-83d3-c1ae2c78f789\") " Nov 27 17:05:16 crc kubenswrapper[4954]: I1127 17:05:16.852681 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzq5p\" (UniqueName: \"kubernetes.io/projected/d215184b-28f6-4404-83d3-c1ae2c78f789-kube-api-access-vzq5p\") pod \"d215184b-28f6-4404-83d3-c1ae2c78f789\" (UID: \"d215184b-28f6-4404-83d3-c1ae2c78f789\") " Nov 27 17:05:16 crc kubenswrapper[4954]: I1127 17:05:16.852760 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d215184b-28f6-4404-83d3-c1ae2c78f789-catalog-content\") pod \"d215184b-28f6-4404-83d3-c1ae2c78f789\" (UID: \"d215184b-28f6-4404-83d3-c1ae2c78f789\") " Nov 27 17:05:16 crc kubenswrapper[4954]: I1127 17:05:16.853754 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d215184b-28f6-4404-83d3-c1ae2c78f789-utilities" (OuterVolumeSpecName: "utilities") pod "d215184b-28f6-4404-83d3-c1ae2c78f789" (UID: "d215184b-28f6-4404-83d3-c1ae2c78f789"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:05:16 crc kubenswrapper[4954]: I1127 17:05:16.864206 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d215184b-28f6-4404-83d3-c1ae2c78f789-kube-api-access-vzq5p" (OuterVolumeSpecName: "kube-api-access-vzq5p") pod "d215184b-28f6-4404-83d3-c1ae2c78f789" (UID: "d215184b-28f6-4404-83d3-c1ae2c78f789"). InnerVolumeSpecName "kube-api-access-vzq5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:05:16 crc kubenswrapper[4954]: I1127 17:05:16.914232 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d215184b-28f6-4404-83d3-c1ae2c78f789-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d215184b-28f6-4404-83d3-c1ae2c78f789" (UID: "d215184b-28f6-4404-83d3-c1ae2c78f789"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:05:16 crc kubenswrapper[4954]: I1127 17:05:16.955684 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d215184b-28f6-4404-83d3-c1ae2c78f789-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 17:05:16 crc kubenswrapper[4954]: I1127 17:05:16.955725 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d215184b-28f6-4404-83d3-c1ae2c78f789-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 17:05:16 crc kubenswrapper[4954]: I1127 17:05:16.955736 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzq5p\" (UniqueName: \"kubernetes.io/projected/d215184b-28f6-4404-83d3-c1ae2c78f789-kube-api-access-vzq5p\") on node \"crc\" DevicePath \"\"" Nov 27 17:05:17 crc kubenswrapper[4954]: I1127 17:05:17.356456 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5r7q5" event={"ID":"d215184b-28f6-4404-83d3-c1ae2c78f789","Type":"ContainerDied","Data":"295e3135fa044dc2b3e7efb68c8a2636412809f584b8bbdc6f73f9db2b347d7f"} Nov 27 17:05:17 crc kubenswrapper[4954]: I1127 17:05:17.356493 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5r7q5" Nov 27 17:05:17 crc kubenswrapper[4954]: I1127 17:05:17.356512 4954 scope.go:117] "RemoveContainer" containerID="b7251c08e7f13963283e0e97cff136a581062a08250df0fcf78b8a94988f13ff" Nov 27 17:05:17 crc kubenswrapper[4954]: I1127 17:05:17.393912 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5r7q5"] Nov 27 17:05:17 crc kubenswrapper[4954]: I1127 17:05:17.398700 4954 scope.go:117] "RemoveContainer" containerID="79ff2f66c253bb94c6da5d07289766bdfc43de026c3aaf939d6244a86ab9cf81" Nov 27 17:05:17 crc kubenswrapper[4954]: I1127 17:05:17.403195 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5r7q5"] Nov 27 17:05:17 crc kubenswrapper[4954]: I1127 17:05:17.428559 4954 scope.go:117] "RemoveContainer" containerID="6e7413aadb5285b731bf044659905ca287a9536bf3e56ccb8b996e9579620330" Nov 27 17:05:18 crc kubenswrapper[4954]: I1127 17:05:18.675264 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d215184b-28f6-4404-83d3-c1ae2c78f789" path="/var/lib/kubelet/pods/d215184b-28f6-4404-83d3-c1ae2c78f789/volumes" Nov 27 17:05:23 crc kubenswrapper[4954]: I1127 17:05:23.688049 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 17:05:23 crc kubenswrapper[4954]: I1127 17:05:23.688754 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 17:05:44 crc kubenswrapper[4954]: I1127 17:05:44.204313 4954 scope.go:117] "RemoveContainer" containerID="53c366354a5cb400bc91690989142c15151c801937a32af334adb3754f67e604" Nov 27 17:05:53 crc kubenswrapper[4954]: I1127 17:05:53.687185 4954 patch_prober.go:28] interesting 
pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 17:05:53 crc kubenswrapper[4954]: I1127 17:05:53.689387 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 17:05:53 crc kubenswrapper[4954]: I1127 17:05:53.689640 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-699qq" Nov 27 17:05:53 crc kubenswrapper[4954]: I1127 17:05:53.690939 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c634fc970f090ade11e9bb4461f26ec0209fb2640ae3e49bf1ab5c91d77dcc8f"} pod="openshift-machine-config-operator/machine-config-daemon-699qq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 27 17:05:53 crc kubenswrapper[4954]: I1127 17:05:53.691268 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" containerID="cri-o://c634fc970f090ade11e9bb4461f26ec0209fb2640ae3e49bf1ab5c91d77dcc8f" gracePeriod=600 Nov 27 17:05:53 crc kubenswrapper[4954]: E1127 17:05:53.825676 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:05:53 crc kubenswrapper[4954]: I1127 17:05:53.932489 4954 generic.go:334] "Generic (PLEG): container finished" podID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerID="c634fc970f090ade11e9bb4461f26ec0209fb2640ae3e49bf1ab5c91d77dcc8f" exitCode=0 Nov 27 17:05:53 crc kubenswrapper[4954]: I1127 17:05:53.932533 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" event={"ID":"33a80574-7c60-4f19-985b-3ee313cb7bcd","Type":"ContainerDied","Data":"c634fc970f090ade11e9bb4461f26ec0209fb2640ae3e49bf1ab5c91d77dcc8f"} Nov 27 17:05:53 crc kubenswrapper[4954]: I1127 17:05:53.932564 4954 scope.go:117] "RemoveContainer" containerID="98580182e2338285c15b00e549725c7d4113004bcbddaa6d1d4c9e028f47ac7f" Nov 27 17:05:53 crc kubenswrapper[4954]: I1127 17:05:53.933427 4954 scope.go:117] "RemoveContainer" containerID="c634fc970f090ade11e9bb4461f26ec0209fb2640ae3e49bf1ab5c91d77dcc8f" Nov 27 17:05:53 crc kubenswrapper[4954]: E1127 17:05:53.933823 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:06:05 crc kubenswrapper[4954]: I1127 17:06:05.662801 4954 scope.go:117] "RemoveContainer" containerID="c634fc970f090ade11e9bb4461f26ec0209fb2640ae3e49bf1ab5c91d77dcc8f" Nov 27 17:06:05 crc kubenswrapper[4954]: E1127 17:06:05.663889 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:06:17 crc kubenswrapper[4954]: I1127 17:06:17.661800 4954 scope.go:117] "RemoveContainer" containerID="c634fc970f090ade11e9bb4461f26ec0209fb2640ae3e49bf1ab5c91d77dcc8f" Nov 27 17:06:17 crc kubenswrapper[4954]: E1127 17:06:17.662636 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:06:29 crc kubenswrapper[4954]: I1127 17:06:29.663008 4954 scope.go:117] "RemoveContainer" containerID="c634fc970f090ade11e9bb4461f26ec0209fb2640ae3e49bf1ab5c91d77dcc8f" Nov 27 17:06:29 crc kubenswrapper[4954]: E1127 17:06:29.665554 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:06:43 crc kubenswrapper[4954]: I1127 17:06:43.662807 4954 scope.go:117] "RemoveContainer" containerID="c634fc970f090ade11e9bb4461f26ec0209fb2640ae3e49bf1ab5c91d77dcc8f" Nov 27 17:06:43 crc kubenswrapper[4954]: E1127 17:06:43.664933 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:06:44 crc kubenswrapper[4954]: I1127 17:06:44.316383 4954 scope.go:117] "RemoveContainer" containerID="2af38ba85189c1fc90987a2e280583686a0c7d3d391b7dc2d66189d93f055823" Nov 27 17:06:58 crc kubenswrapper[4954]: I1127 17:06:58.669019 4954 scope.go:117] "RemoveContainer" containerID="c634fc970f090ade11e9bb4461f26ec0209fb2640ae3e49bf1ab5c91d77dcc8f" Nov 27 17:06:58 crc kubenswrapper[4954]: E1127 17:06:58.669973 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:07:13 crc kubenswrapper[4954]: I1127 17:07:13.661383 4954 scope.go:117] "RemoveContainer" containerID="c634fc970f090ade11e9bb4461f26ec0209fb2640ae3e49bf1ab5c91d77dcc8f" Nov 27 17:07:13 crc kubenswrapper[4954]: E1127 17:07:13.662433 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:07:27 crc kubenswrapper[4954]: I1127 17:07:27.663042 4954 scope.go:117] "RemoveContainer" containerID="c634fc970f090ade11e9bb4461f26ec0209fb2640ae3e49bf1ab5c91d77dcc8f" Nov 27 17:07:27 crc kubenswrapper[4954]: E1127 17:07:27.664520 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:07:38 crc kubenswrapper[4954]: I1127 17:07:38.668669 4954 scope.go:117] "RemoveContainer" containerID="c634fc970f090ade11e9bb4461f26ec0209fb2640ae3e49bf1ab5c91d77dcc8f" Nov 27 17:07:38 crc kubenswrapper[4954]: E1127 17:07:38.669460 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:07:52 crc kubenswrapper[4954]: I1127 17:07:52.664717 4954 scope.go:117] "RemoveContainer" containerID="c634fc970f090ade11e9bb4461f26ec0209fb2640ae3e49bf1ab5c91d77dcc8f" Nov 27 17:07:52 crc kubenswrapper[4954]: E1127 17:07:52.665656 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:08:05 crc kubenswrapper[4954]: I1127 17:08:05.662065 4954 scope.go:117] "RemoveContainer" containerID="c634fc970f090ade11e9bb4461f26ec0209fb2640ae3e49bf1ab5c91d77dcc8f" Nov 27 17:08:05 crc kubenswrapper[4954]: E1127 17:08:05.662911 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:08:18 crc kubenswrapper[4954]: I1127 17:08:18.039912 4954 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-mmfbq"] Nov 27 17:08:18 crc kubenswrapper[4954]: I1127 17:08:18.051874 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-pk6xr"] Nov 27 17:08:18 crc kubenswrapper[4954]: I1127 17:08:18.061804 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-dqrlj"] Nov 27 17:08:18 crc kubenswrapper[4954]: I1127 17:08:18.070706 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-mmfbq"] Nov 27 17:08:18 crc kubenswrapper[4954]: I1127 17:08:18.080424 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-dqrlj"] Nov 27 17:08:18 crc kubenswrapper[4954]: I1127 17:08:18.088732 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-pk6xr"] Nov 27 17:08:18 crc kubenswrapper[4954]: I1127 17:08:18.669596 4954 scope.go:117] "RemoveContainer" containerID="c634fc970f090ade11e9bb4461f26ec0209fb2640ae3e49bf1ab5c91d77dcc8f" Nov 27 17:08:18 crc kubenswrapper[4954]: E1127 17:08:18.670105 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:08:18 crc kubenswrapper[4954]: I1127 17:08:18.673563 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="060ee5fd-88d7-4172-8196-ffeeb09be3b6" path="/var/lib/kubelet/pods/060ee5fd-88d7-4172-8196-ffeeb09be3b6/volumes" Nov 27 17:08:18 crc kubenswrapper[4954]: I1127 17:08:18.674287 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b5522ab-dc06-4c46-8a1b-fa7d94b058e1" path="/var/lib/kubelet/pods/4b5522ab-dc06-4c46-8a1b-fa7d94b058e1/volumes" Nov 27 17:08:18 crc kubenswrapper[4954]: I1127 17:08:18.674891 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdd20293-bf3f-44be-b18d-d6053638d393" path="/var/lib/kubelet/pods/fdd20293-bf3f-44be-b18d-d6053638d393/volumes" Nov 27 17:08:21 crc kubenswrapper[4954]: I1127 17:08:21.034839 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-f6f0-account-create-update-jlfdn"] Nov 27 17:08:21 crc kubenswrapper[4954]: I1127 17:08:21.052272 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-1e64-account-create-update-g7spt"] Nov 27 17:08:21 crc kubenswrapper[4954]: I1127 17:08:21.065117 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-ae31-account-create-update-xs8vj"] Nov 27 17:08:21 crc kubenswrapper[4954]: I1127 17:08:21.078020 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-1e64-account-create-update-g7spt"] Nov 27 17:08:21 crc kubenswrapper[4954]: I1127 17:08:21.089728 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-f6f0-account-create-update-jlfdn"] Nov 27 17:08:21 crc kubenswrapper[4954]: I1127 17:08:21.098942 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-ae31-account-create-update-xs8vj"] Nov 27 17:08:22 crc kubenswrapper[4954]: I1127 17:08:22.676470 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4f1d5b5-69ba-453c-90cc-85210e24e5d3" 
path="/var/lib/kubelet/pods/a4f1d5b5-69ba-453c-90cc-85210e24e5d3/volumes" Nov 27 17:08:22 crc kubenswrapper[4954]: I1127 17:08:22.677179 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf7e3231-2480-4075-80dc-0f44cc159964" path="/var/lib/kubelet/pods/bf7e3231-2480-4075-80dc-0f44cc159964/volumes" Nov 27 17:08:22 crc kubenswrapper[4954]: I1127 17:08:22.677716 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf895a7e-aada-4a88-814e-1a6b38ff6616" path="/var/lib/kubelet/pods/cf895a7e-aada-4a88-814e-1a6b38ff6616/volumes" Nov 27 17:08:33 crc kubenswrapper[4954]: I1127 17:08:33.662562 4954 scope.go:117] "RemoveContainer" containerID="c634fc970f090ade11e9bb4461f26ec0209fb2640ae3e49bf1ab5c91d77dcc8f" Nov 27 17:08:33 crc kubenswrapper[4954]: E1127 17:08:33.664454 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:08:44 crc kubenswrapper[4954]: I1127 17:08:44.411525 4954 scope.go:117] "RemoveContainer" containerID="e8566ffd34f602562600b8c86c0a06660adc9ddf8997a5f3ac7b3cd41941d16e" Nov 27 17:08:44 crc kubenswrapper[4954]: I1127 17:08:44.433380 4954 scope.go:117] "RemoveContainer" containerID="6a9403d61bcd8b3dc18e2526fbb06088d65d4e29e63d9c5274f8626513548ed0" Nov 27 17:08:44 crc kubenswrapper[4954]: I1127 17:08:44.452352 4954 scope.go:117] "RemoveContainer" containerID="9fab585b26cdd5fc6d56ebc649e3e35945e283fe38823ea96461b370ae5e58e2" Nov 27 17:08:44 crc kubenswrapper[4954]: I1127 17:08:44.474065 4954 scope.go:117] "RemoveContainer" containerID="270984d6e913433dd553a4e4f306d7220f48cb90b9e51da0c0e03535c2dfc081" Nov 27 17:08:44 crc kubenswrapper[4954]: I1127 17:08:44.526109 4954 scope.go:117] "RemoveContainer" containerID="393d06451177be61fcf0672ff85912ca2b9d32b28a5cac8fa99cfba9333a0d29" Nov 27 17:08:44 crc kubenswrapper[4954]: I1127 17:08:44.582164 4954 scope.go:117] "RemoveContainer" containerID="c8db0f80cc8e084c62ef2269a947511d008a0b24f0b041f05e7868bcba22bb72" Nov 27 17:08:44 crc kubenswrapper[4954]: I1127 17:08:44.615024 4954 scope.go:117] "RemoveContainer" containerID="1b6a8b850d7dfa55bc34e5fd934ca1b713624d490e0696e38242d855913c0504" Nov 27 17:08:44 crc kubenswrapper[4954]: I1127 17:08:44.639147 4954 scope.go:117] "RemoveContainer" containerID="b7434fc61404163fce7eeff0f752c65353a9cb203e6d49623258ac0b38d162c8" Nov 27 17:08:44 crc kubenswrapper[4954]: I1127 17:08:44.676675 4954 scope.go:117] "RemoveContainer" containerID="ef7cb6324e37e0f74cb1580c1addb8ed96937c4f8b62f85ec8198a04c2fc51d5" Nov 27 17:08:44 crc kubenswrapper[4954]: I1127 17:08:44.695560 4954 scope.go:117] "RemoveContainer" containerID="67adc7278a60297cd96e209646877a1cf4c2602d7783c4b6b4926f79ebb30f22" Nov 27 17:08:44 crc kubenswrapper[4954]: I1127 17:08:44.739650 4954 scope.go:117] "RemoveContainer" containerID="1c47d1e6187a7cc94e3378e5dca8ae078b8e60905b259c2e5e67ca6cb3f05fa3" Nov 27 17:08:44 crc kubenswrapper[4954]: I1127 17:08:44.767392 4954 scope.go:117] "RemoveContainer" containerID="54044b1877e09ff186089b7d3d45b6a8af05c37b117389c34d43a1899893c0d5" Nov 27 17:08:45 crc kubenswrapper[4954]: E1127 17:08:45.496174 4954 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" 
err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98ad0395_6bb9_46b3_a81b_3f4b1c2dad04.slice/crio-047bd7285343a55e06e6be6b14f1cd542a5595b8eeedc1afb780ad7adc0a0cd6.scope\": RecentStats: unable to find data in memory cache]" Nov 27 17:08:46 crc kubenswrapper[4954]: I1127 17:08:46.558162 4954 generic.go:334] "Generic (PLEG): container finished" podID="98ad0395-6bb9-46b3-a81b-3f4b1c2dad04" containerID="047bd7285343a55e06e6be6b14f1cd542a5595b8eeedc1afb780ad7adc0a0cd6" exitCode=0 Nov 27 17:08:46 crc kubenswrapper[4954]: I1127 17:08:46.558260 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bz8ns" event={"ID":"98ad0395-6bb9-46b3-a81b-3f4b1c2dad04","Type":"ContainerDied","Data":"047bd7285343a55e06e6be6b14f1cd542a5595b8eeedc1afb780ad7adc0a0cd6"} Nov 27 17:08:47 crc kubenswrapper[4954]: I1127 17:08:47.069498 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-4twzj"] Nov 27 17:08:47 crc kubenswrapper[4954]: I1127 17:08:47.082692 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-mhd4j"] Nov 27 17:08:47 crc kubenswrapper[4954]: I1127 17:08:47.093610 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-4twzj"] Nov 27 17:08:47 crc kubenswrapper[4954]: I1127 17:08:47.104503 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-mhd4j"] Nov 27 17:08:47 crc kubenswrapper[4954]: I1127 17:08:47.116860 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-924f-account-create-update-qs4ck"] Nov 27 17:08:47 crc kubenswrapper[4954]: I1127 17:08:47.129207 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-924f-account-create-update-qs4ck"] Nov 27 17:08:47 crc kubenswrapper[4954]: I1127 17:08:47.663489 4954 scope.go:117] "RemoveContainer" containerID="c634fc970f090ade11e9bb4461f26ec0209fb2640ae3e49bf1ab5c91d77dcc8f" Nov 27 17:08:47 crc kubenswrapper[4954]: E1127 17:08:47.663757 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:08:47 crc kubenswrapper[4954]: I1127 17:08:47.971157 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bz8ns" Nov 27 17:08:48 crc kubenswrapper[4954]: I1127 17:08:48.031315 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98ad0395-6bb9-46b3-a81b-3f4b1c2dad04-bootstrap-combined-ca-bundle\") pod \"98ad0395-6bb9-46b3-a81b-3f4b1c2dad04\" (UID: \"98ad0395-6bb9-46b3-a81b-3f4b1c2dad04\") " Nov 27 17:08:48 crc kubenswrapper[4954]: I1127 17:08:48.031447 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cb6v\" (UniqueName: \"kubernetes.io/projected/98ad0395-6bb9-46b3-a81b-3f4b1c2dad04-kube-api-access-8cb6v\") pod \"98ad0395-6bb9-46b3-a81b-3f4b1c2dad04\" (UID: \"98ad0395-6bb9-46b3-a81b-3f4b1c2dad04\") " Nov 27 17:08:48 crc kubenswrapper[4954]: I1127 17:08:48.031498 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/98ad0395-6bb9-46b3-a81b-3f4b1c2dad04-ssh-key\") pod \"98ad0395-6bb9-46b3-a81b-3f4b1c2dad04\" (UID: \"98ad0395-6bb9-46b3-a81b-3f4b1c2dad04\") " Nov 27 17:08:48 crc kubenswrapper[4954]: I1127 17:08:48.031603 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98ad0395-6bb9-46b3-a81b-3f4b1c2dad04-inventory\") pod \"98ad0395-6bb9-46b3-a81b-3f4b1c2dad04\" (UID: \"98ad0395-6bb9-46b3-a81b-3f4b1c2dad04\") " Nov 27 17:08:48 crc kubenswrapper[4954]: I1127 17:08:48.036869 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98ad0395-6bb9-46b3-a81b-3f4b1c2dad04-kube-api-access-8cb6v" (OuterVolumeSpecName: "kube-api-access-8cb6v") pod "98ad0395-6bb9-46b3-a81b-3f4b1c2dad04" (UID: "98ad0395-6bb9-46b3-a81b-3f4b1c2dad04"). InnerVolumeSpecName "kube-api-access-8cb6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:08:48 crc kubenswrapper[4954]: I1127 17:08:48.036960 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98ad0395-6bb9-46b3-a81b-3f4b1c2dad04-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "98ad0395-6bb9-46b3-a81b-3f4b1c2dad04" (UID: "98ad0395-6bb9-46b3-a81b-3f4b1c2dad04"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:08:48 crc kubenswrapper[4954]: I1127 17:08:48.057525 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98ad0395-6bb9-46b3-a81b-3f4b1c2dad04-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "98ad0395-6bb9-46b3-a81b-3f4b1c2dad04" (UID: "98ad0395-6bb9-46b3-a81b-3f4b1c2dad04"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:08:48 crc kubenswrapper[4954]: I1127 17:08:48.064528 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98ad0395-6bb9-46b3-a81b-3f4b1c2dad04-inventory" (OuterVolumeSpecName: "inventory") pod "98ad0395-6bb9-46b3-a81b-3f4b1c2dad04" (UID: "98ad0395-6bb9-46b3-a81b-3f4b1c2dad04"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:08:48 crc kubenswrapper[4954]: I1127 17:08:48.134149 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cb6v\" (UniqueName: \"kubernetes.io/projected/98ad0395-6bb9-46b3-a81b-3f4b1c2dad04-kube-api-access-8cb6v\") on node \"crc\" DevicePath \"\"" Nov 27 17:08:48 crc kubenswrapper[4954]: I1127 17:08:48.134383 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/98ad0395-6bb9-46b3-a81b-3f4b1c2dad04-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 17:08:48 crc kubenswrapper[4954]: I1127 17:08:48.134441 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98ad0395-6bb9-46b3-a81b-3f4b1c2dad04-inventory\") on node \"crc\" DevicePath \"\"" Nov 27 17:08:48 crc kubenswrapper[4954]: I1127 17:08:48.134493 4954 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98ad0395-6bb9-46b3-a81b-3f4b1c2dad04-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 17:08:48 crc kubenswrapper[4954]: I1127 17:08:48.576207 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bz8ns" event={"ID":"98ad0395-6bb9-46b3-a81b-3f4b1c2dad04","Type":"ContainerDied","Data":"24416b92e5dd6f0a91ffd995ebdab4468463122b866d25062458b70339f06663"} Nov 27 17:08:48 crc kubenswrapper[4954]: I1127 17:08:48.576248 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24416b92e5dd6f0a91ffd995ebdab4468463122b866d25062458b70339f06663" Nov 27 17:08:48 crc kubenswrapper[4954]: I1127 17:08:48.576319 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bz8ns" Nov 27 17:08:48 crc kubenswrapper[4954]: I1127 17:08:48.675231 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56abe05c-60fe-4797-9b81-0ba5fa342149" path="/var/lib/kubelet/pods/56abe05c-60fe-4797-9b81-0ba5fa342149/volumes" Nov 27 17:08:48 crc kubenswrapper[4954]: I1127 17:08:48.675874 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="958ac579-b5c6-47ae-9b39-13abfc4da1db" path="/var/lib/kubelet/pods/958ac579-b5c6-47ae-9b39-13abfc4da1db/volumes" Nov 27 17:08:48 crc kubenswrapper[4954]: I1127 17:08:48.676432 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe6f0251-00d1-460c-82fb-d86f5142c5f1" path="/var/lib/kubelet/pods/fe6f0251-00d1-460c-82fb-d86f5142c5f1/volumes" Nov 27 17:08:48 crc kubenswrapper[4954]: I1127 17:08:48.677057 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p9x6k"] Nov 27 17:08:48 crc kubenswrapper[4954]: E1127 17:08:48.677428 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98ad0395-6bb9-46b3-a81b-3f4b1c2dad04" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 27 17:08:48 crc kubenswrapper[4954]: I1127 17:08:48.677457 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="98ad0395-6bb9-46b3-a81b-3f4b1c2dad04" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 27 17:08:48 crc kubenswrapper[4954]: E1127 17:08:48.677488 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d215184b-28f6-4404-83d3-c1ae2c78f789" containerName="registry-server" Nov 27 17:08:48 crc kubenswrapper[4954]: I1127 17:08:48.677498 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="d215184b-28f6-4404-83d3-c1ae2c78f789" containerName="registry-server" Nov 27 17:08:48 crc kubenswrapper[4954]: E1127 17:08:48.677519 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d215184b-28f6-4404-83d3-c1ae2c78f789" containerName="extract-content" Nov 27 17:08:48 crc kubenswrapper[4954]: I1127 17:08:48.677528 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="d215184b-28f6-4404-83d3-c1ae2c78f789" containerName="extract-content" Nov 27 17:08:48 crc kubenswrapper[4954]: E1127 17:08:48.677619 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d215184b-28f6-4404-83d3-c1ae2c78f789" containerName="extract-utilities" Nov 27 17:08:48 crc kubenswrapper[4954]: I1127 17:08:48.677634 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="d215184b-28f6-4404-83d3-c1ae2c78f789" containerName="extract-utilities" Nov 27 17:08:48 crc kubenswrapper[4954]: I1127 17:08:48.677863 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="d215184b-28f6-4404-83d3-c1ae2c78f789" containerName="registry-server" Nov 27 17:08:48 crc kubenswrapper[4954]: I1127 17:08:48.677910 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="98ad0395-6bb9-46b3-a81b-3f4b1c2dad04" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 27 17:08:48 crc kubenswrapper[4954]: I1127 17:08:48.679490 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p9x6k" Nov 27 17:08:48 crc kubenswrapper[4954]: I1127 17:08:48.680244 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p9x6k"] Nov 27 17:08:48 crc kubenswrapper[4954]: I1127 17:08:48.684677 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 17:08:48 crc kubenswrapper[4954]: I1127 17:08:48.684865 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lnfbp" Nov 27 17:08:48 crc kubenswrapper[4954]: I1127 17:08:48.685017 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 17:08:48 crc kubenswrapper[4954]: I1127 17:08:48.685180 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 17:08:48 crc kubenswrapper[4954]: I1127 17:08:48.847349 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59b766b5-12a6-4e9c-b627-3d7705a04afc-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-p9x6k\" (UID: \"59b766b5-12a6-4e9c-b627-3d7705a04afc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p9x6k" Nov 27 17:08:48 crc kubenswrapper[4954]: I1127 17:08:48.847812 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf64g\" (UniqueName: \"kubernetes.io/projected/59b766b5-12a6-4e9c-b627-3d7705a04afc-kube-api-access-nf64g\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-p9x6k\" (UID: \"59b766b5-12a6-4e9c-b627-3d7705a04afc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p9x6k" Nov 27 17:08:48 crc kubenswrapper[4954]: I1127 17:08:48.847909 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59b766b5-12a6-4e9c-b627-3d7705a04afc-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-p9x6k\" (UID: \"59b766b5-12a6-4e9c-b627-3d7705a04afc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p9x6k" Nov 27 17:08:48 crc kubenswrapper[4954]: I1127 17:08:48.950509 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59b766b5-12a6-4e9c-b627-3d7705a04afc-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-p9x6k\" (UID: \"59b766b5-12a6-4e9c-b627-3d7705a04afc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p9x6k" Nov 27 17:08:48 crc kubenswrapper[4954]: I1127 17:08:48.950595 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nf64g\" (UniqueName: \"kubernetes.io/projected/59b766b5-12a6-4e9c-b627-3d7705a04afc-kube-api-access-nf64g\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-p9x6k\" (UID: \"59b766b5-12a6-4e9c-b627-3d7705a04afc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p9x6k" Nov 27 17:08:48 crc kubenswrapper[4954]: I1127 17:08:48.950640 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59b766b5-12a6-4e9c-b627-3d7705a04afc-ssh-key\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-p9x6k\" (UID: \"59b766b5-12a6-4e9c-b627-3d7705a04afc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p9x6k" Nov 27 17:08:48 crc kubenswrapper[4954]: I1127 17:08:48.956976 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59b766b5-12a6-4e9c-b627-3d7705a04afc-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-p9x6k\" (UID: \"59b766b5-12a6-4e9c-b627-3d7705a04afc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p9x6k" Nov 27 17:08:48 crc kubenswrapper[4954]: I1127 17:08:48.958966 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59b766b5-12a6-4e9c-b627-3d7705a04afc-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-p9x6k\" (UID: \"59b766b5-12a6-4e9c-b627-3d7705a04afc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p9x6k" Nov 27 17:08:48 crc kubenswrapper[4954]: I1127 17:08:48.970819 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf64g\" (UniqueName: \"kubernetes.io/projected/59b766b5-12a6-4e9c-b627-3d7705a04afc-kube-api-access-nf64g\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-p9x6k\" (UID: \"59b766b5-12a6-4e9c-b627-3d7705a04afc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p9x6k" Nov 27 17:08:49 crc kubenswrapper[4954]: I1127 17:08:49.009545 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p9x6k" Nov 27 17:08:49 crc kubenswrapper[4954]: I1127 17:08:49.540278 4954 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 27 17:08:49 crc kubenswrapper[4954]: I1127 17:08:49.542219 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p9x6k"] Nov 27 17:08:49 crc kubenswrapper[4954]: I1127 17:08:49.586545 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p9x6k" event={"ID":"59b766b5-12a6-4e9c-b627-3d7705a04afc","Type":"ContainerStarted","Data":"8b01f96e68188420746371c957841cacb1582f76ab3a93a7a7238f3cfa174bbe"} Nov 27 17:08:50 crc kubenswrapper[4954]: I1127 17:08:50.598773 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p9x6k" event={"ID":"59b766b5-12a6-4e9c-b627-3d7705a04afc","Type":"ContainerStarted","Data":"74d9b1601b72e831b8de80ffebfca4db6f3699b3a27fd60e8bfe2c29fc3082c4"} Nov 27 17:08:55 crc kubenswrapper[4954]: I1127 17:08:55.026266 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p9x6k" podStartSLOduration=6.501521301 podStartE2EDuration="7.026245104s" podCreationTimestamp="2025-11-27 17:08:48 +0000 UTC" firstStartedPulling="2025-11-27 17:08:49.539694763 +0000 UTC m=+1841.557135103" lastFinishedPulling="2025-11-27 17:08:50.064418566 +0000 UTC m=+1842.081858906" observedRunningTime="2025-11-27 17:08:50.61775129 +0000 UTC m=+1842.635191590" watchObservedRunningTime="2025-11-27 17:08:55.026245104 +0000 UTC m=+1847.043685404" Nov 27 17:08:55 crc kubenswrapper[4954]: I1127 17:08:55.041995 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/neutron-606e-account-create-update-qqqpz"] Nov 27 17:08:55 crc kubenswrapper[4954]: I1127 17:08:55.059856 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-06e3-account-create-update-gx6gr"] Nov 27 17:08:55 crc kubenswrapper[4954]: I1127 17:08:55.072733 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-wlq9c"] Nov 27 17:08:55 crc kubenswrapper[4954]: I1127 17:08:55.080753 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-606e-account-create-update-qqqpz"] Nov 27 17:08:55 crc kubenswrapper[4954]: I1127 17:08:55.089388 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-g8ngm"] Nov 27 17:08:55 crc kubenswrapper[4954]: I1127 17:08:55.097069 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-06e3-account-create-update-gx6gr"] Nov 27 17:08:55 crc kubenswrapper[4954]: I1127 17:08:55.104388 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-g8ngm"] Nov 27 17:08:55 crc kubenswrapper[4954]: I1127 17:08:55.112313 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-wlq9c"] Nov 27 17:08:56 crc kubenswrapper[4954]: I1127 17:08:56.673731 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a8278f5-f0bb-4b86-b187-c8b047a338e3" path="/var/lib/kubelet/pods/0a8278f5-f0bb-4b86-b187-c8b047a338e3/volumes" Nov 27 17:08:56 crc kubenswrapper[4954]: I1127 17:08:56.674835 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10709af5-22d7-4aaf-963a-c7b1a67d61db" path="/var/lib/kubelet/pods/10709af5-22d7-4aaf-963a-c7b1a67d61db/volumes" Nov 27 17:08:56 crc kubenswrapper[4954]: I1127 17:08:56.675563 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f849c06-6adb-4c74-b851-b261c6797f6b" path="/var/lib/kubelet/pods/2f849c06-6adb-4c74-b851-b261c6797f6b/volumes" Nov 27 17:08:56 crc kubenswrapper[4954]: I1127 17:08:56.676328 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="604569c5-bcdb-49ba-8fad-546903367900" path="/var/lib/kubelet/pods/604569c5-bcdb-49ba-8fad-546903367900/volumes" Nov 27 17:08:59 crc kubenswrapper[4954]: I1127 17:08:59.662460 4954 scope.go:117] "RemoveContainer" containerID="c634fc970f090ade11e9bb4461f26ec0209fb2640ae3e49bf1ab5c91d77dcc8f" Nov 27 17:08:59 crc kubenswrapper[4954]: E1127 17:08:59.663315 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:09:02 crc kubenswrapper[4954]: I1127 17:09:02.062432 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-c97tg"] Nov 27 17:09:02 crc kubenswrapper[4954]: I1127 17:09:02.075702 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-c97tg"] Nov 27 17:09:02 crc kubenswrapper[4954]: I1127 17:09:02.673789 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="243dbf8f-7ced-4de5-8c00-b205546b0db2" path="/var/lib/kubelet/pods/243dbf8f-7ced-4de5-8c00-b205546b0db2/volumes" Nov 27 17:09:11 crc kubenswrapper[4954]: I1127 17:09:11.663149 4954 scope.go:117] "RemoveContainer" 
containerID="c634fc970f090ade11e9bb4461f26ec0209fb2640ae3e49bf1ab5c91d77dcc8f" Nov 27 17:09:11 crc kubenswrapper[4954]: E1127 17:09:11.663873 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:09:22 crc kubenswrapper[4954]: I1127 17:09:22.663052 4954 scope.go:117] "RemoveContainer" containerID="c634fc970f090ade11e9bb4461f26ec0209fb2640ae3e49bf1ab5c91d77dcc8f" Nov 27 17:09:22 crc kubenswrapper[4954]: E1127 17:09:22.664340 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:09:34 crc kubenswrapper[4954]: I1127 17:09:34.663454 4954 scope.go:117] "RemoveContainer" containerID="c634fc970f090ade11e9bb4461f26ec0209fb2640ae3e49bf1ab5c91d77dcc8f" Nov 27 17:09:34 crc kubenswrapper[4954]: E1127 17:09:34.665523 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:09:44 crc kubenswrapper[4954]: I1127 17:09:44.038650 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-fcrnt"] Nov 27 17:09:44 crc kubenswrapper[4954]: I1127 17:09:44.046257 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-fcrnt"] Nov 27 17:09:44 crc kubenswrapper[4954]: I1127 17:09:44.675539 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50892b2e-4e6f-4794-bb8d-e649a9b223fc" path="/var/lib/kubelet/pods/50892b2e-4e6f-4794-bb8d-e649a9b223fc/volumes" Nov 27 17:09:45 crc kubenswrapper[4954]: I1127 17:09:45.024468 4954 scope.go:117] "RemoveContainer" containerID="04637028b8a8e2fa1452af4d1d1eb9b585777b52cae2187a2fde615588b12543" Nov 27 17:09:45 crc kubenswrapper[4954]: I1127 17:09:45.067384 4954 scope.go:117] "RemoveContainer" containerID="5a552d795c4a9f561604e4aa4659efec65258503d374da8b32c15c2f8f7c5d4b" Nov 27 17:09:45 crc kubenswrapper[4954]: I1127 17:09:45.114820 4954 scope.go:117] "RemoveContainer" containerID="7ddce387a5ede953ec464571c749c2df54c0bac4f4a37be7eea5b829fdaa5ffd" Nov 27 17:09:45 crc kubenswrapper[4954]: I1127 17:09:45.161104 4954 scope.go:117] "RemoveContainer" containerID="e92a08eee1b7abd3fa9259d5653545521d191dd9dfb8bba8cf3bdb2de482da9a" Nov 27 17:09:45 crc kubenswrapper[4954]: I1127 17:09:45.198888 4954 scope.go:117] "RemoveContainer" containerID="c4cd115b731192b87ccc81c1b09180efcd191973e1a160928b1d0aeeba85b8d6" Nov 27 17:09:45 crc kubenswrapper[4954]: I1127 17:09:45.244716 4954 scope.go:117] "RemoveContainer" containerID="033d11af1c2e36e7d71877588fbe192027603aa475cde4ca986817a0a319fb5b" 
Nov 27 17:09:45 crc kubenswrapper[4954]: I1127 17:09:45.284987 4954 scope.go:117] "RemoveContainer" containerID="0bfdf88c96edf13ce3404bb30d88ebf37431bbf997be6360e69e26ec3448ffc4" Nov 27 17:09:45 crc kubenswrapper[4954]: I1127 17:09:45.305443 4954 scope.go:117] "RemoveContainer" containerID="986a0c88c524f676cfbcdfdedd620d2f20280129d7e5f624a37729159fd248d6" Nov 27 17:09:45 crc kubenswrapper[4954]: I1127 17:09:45.327728 4954 scope.go:117] "RemoveContainer" containerID="993b586601c3b86b6ab6d17c37f96718cf6952b28e6d127c07e53c460d21cf9e" Nov 27 17:09:46 crc kubenswrapper[4954]: I1127 17:09:46.662103 4954 scope.go:117] "RemoveContainer" containerID="c634fc970f090ade11e9bb4461f26ec0209fb2640ae3e49bf1ab5c91d77dcc8f" Nov 27 17:09:46 crc kubenswrapper[4954]: E1127 17:09:46.662714 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:09:50 crc kubenswrapper[4954]: I1127 17:09:50.029301 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-cs55t"] Nov 27 17:09:50 crc kubenswrapper[4954]: I1127 17:09:50.036514 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-cs55t"] Nov 27 17:09:50 crc kubenswrapper[4954]: I1127 17:09:50.672795 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9758394-0bfc-487a-99b4-a3583a2c97b0" path="/var/lib/kubelet/pods/b9758394-0bfc-487a-99b4-a3583a2c97b0/volumes" Nov 27 17:09:57 crc kubenswrapper[4954]: I1127 17:09:57.662986 4954 scope.go:117] "RemoveContainer" containerID="c634fc970f090ade11e9bb4461f26ec0209fb2640ae3e49bf1ab5c91d77dcc8f" Nov 27 17:09:57 crc kubenswrapper[4954]: E1127 17:09:57.663830 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:10:04 crc kubenswrapper[4954]: I1127 17:10:04.055455 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-6vl85"] Nov 27 17:10:04 crc kubenswrapper[4954]: I1127 17:10:04.066245 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-6vl85"] Nov 27 17:10:04 crc kubenswrapper[4954]: I1127 17:10:04.678749 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0123682b-b80c-436f-bf07-6252dc3df9bc" path="/var/lib/kubelet/pods/0123682b-b80c-436f-bf07-6252dc3df9bc/volumes" Nov 27 17:10:05 crc kubenswrapper[4954]: I1127 17:10:05.039814 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-x4n64"] Nov 27 17:10:05 crc kubenswrapper[4954]: I1127 17:10:05.049562 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-x4n64"] Nov 27 17:10:06 crc kubenswrapper[4954]: I1127 17:10:06.042742 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-hwpt7"] Nov 27 17:10:06 crc kubenswrapper[4954]: I1127 17:10:06.055887 
4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-hwpt7"] Nov 27 17:10:06 crc kubenswrapper[4954]: I1127 17:10:06.673619 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bce3669-a584-4f00-8043-90be729c9fa7" path="/var/lib/kubelet/pods/1bce3669-a584-4f00-8043-90be729c9fa7/volumes" Nov 27 17:10:06 crc kubenswrapper[4954]: I1127 17:10:06.674261 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58c181b9-bc11-4747-84ad-5302f1265507" path="/var/lib/kubelet/pods/58c181b9-bc11-4747-84ad-5302f1265507/volumes" Nov 27 17:10:12 crc kubenswrapper[4954]: I1127 17:10:12.663113 4954 scope.go:117] "RemoveContainer" containerID="c634fc970f090ade11e9bb4461f26ec0209fb2640ae3e49bf1ab5c91d77dcc8f" Nov 27 17:10:12 crc kubenswrapper[4954]: E1127 17:10:12.666316 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:10:23 crc kubenswrapper[4954]: I1127 17:10:23.662868 4954 scope.go:117] "RemoveContainer" containerID="c634fc970f090ade11e9bb4461f26ec0209fb2640ae3e49bf1ab5c91d77dcc8f" Nov 27 17:10:23 crc kubenswrapper[4954]: E1127 17:10:23.663647 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:10:38 crc kubenswrapper[4954]: I1127 17:10:38.669595 4954 scope.go:117] "RemoveContainer" containerID="c634fc970f090ade11e9bb4461f26ec0209fb2640ae3e49bf1ab5c91d77dcc8f" Nov 27 17:10:38 crc kubenswrapper[4954]: E1127 17:10:38.670398 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:10:45 crc kubenswrapper[4954]: I1127 17:10:45.529206 4954 scope.go:117] "RemoveContainer" containerID="96ab67ada370f118a852be06f22b4780ef4f10b62ac840007ffbf097403f3c43" Nov 27 17:10:45 crc kubenswrapper[4954]: I1127 17:10:45.602455 4954 scope.go:117] "RemoveContainer" containerID="4dd8cb7ee521604965cffb6715b2ec94f9b2a1336df00b7533b148e731686fb0" Nov 27 17:10:45 crc kubenswrapper[4954]: I1127 17:10:45.629392 4954 scope.go:117] "RemoveContainer" containerID="901d635a5ed3dc985c4adf2144e2377826394d07154039a33c23b126755f620f" Nov 27 17:10:45 crc kubenswrapper[4954]: I1127 17:10:45.671976 4954 scope.go:117] "RemoveContainer" containerID="c7f389f6069feb0c78353dad9ae7b9a0245dfcd17c6f4f3ea3f1ab0fbba286e8" Nov 27 17:10:50 crc kubenswrapper[4954]: I1127 17:10:50.662975 4954 scope.go:117] "RemoveContainer" containerID="c634fc970f090ade11e9bb4461f26ec0209fb2640ae3e49bf1ab5c91d77dcc8f" Nov 27 17:10:50 crc kubenswrapper[4954]: E1127 
17:10:50.663771 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:11:05 crc kubenswrapper[4954]: I1127 17:11:05.662548 4954 scope.go:117] "RemoveContainer" containerID="c634fc970f090ade11e9bb4461f26ec0209fb2640ae3e49bf1ab5c91d77dcc8f" Nov 27 17:11:06 crc kubenswrapper[4954]: I1127 17:11:06.401568 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" event={"ID":"33a80574-7c60-4f19-985b-3ee313cb7bcd","Type":"ContainerStarted","Data":"3bca7cd4e28cd5886de60bf3081238598be0a5e41895389e224c4122b00d90d8"} Nov 27 17:11:08 crc kubenswrapper[4954]: I1127 17:11:08.056622 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-5dndx"] Nov 27 17:11:08 crc kubenswrapper[4954]: I1127 17:11:08.065944 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-5dndx"] Nov 27 17:11:08 crc kubenswrapper[4954]: I1127 17:11:08.678286 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c040041-36d3-4ba0-b7c4-5164dee45115" path="/var/lib/kubelet/pods/7c040041-36d3-4ba0-b7c4-5164dee45115/volumes" Nov 27 17:11:09 crc kubenswrapper[4954]: I1127 17:11:09.035620 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-4663-account-create-update-5ghgg"] Nov 27 17:11:09 crc kubenswrapper[4954]: I1127 17:11:09.043995 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-7dca-account-create-update-jwm8d"] Nov 27 17:11:09 crc kubenswrapper[4954]: I1127 17:11:09.051528 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-kn544"] Nov 27 17:11:09 crc kubenswrapper[4954]: I1127 17:11:09.060951 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-h4kmm"] Nov 27 17:11:09 crc kubenswrapper[4954]: I1127 17:11:09.085069 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-4663-account-create-update-5ghgg"] Nov 27 17:11:09 crc kubenswrapper[4954]: I1127 17:11:09.092840 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-kn544"] Nov 27 17:11:09 crc kubenswrapper[4954]: I1127 17:11:09.099302 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-7dca-account-create-update-jwm8d"] Nov 27 17:11:09 crc kubenswrapper[4954]: I1127 17:11:09.109491 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-h4kmm"] Nov 27 17:11:09 crc kubenswrapper[4954]: I1127 17:11:09.118133 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-5e54-account-create-update-x5fjh"] Nov 27 17:11:09 crc kubenswrapper[4954]: I1127 17:11:09.126678 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-5e54-account-create-update-x5fjh"] Nov 27 17:11:10 crc kubenswrapper[4954]: I1127 17:11:10.675170 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3caed139-7f27-4afa-b159-ba85dc64bd91" path="/var/lib/kubelet/pods/3caed139-7f27-4afa-b159-ba85dc64bd91/volumes" Nov 27 17:11:10 crc kubenswrapper[4954]: I1127 
17:11:10.676444 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53dbb9f3-5011-4342-a4df-bcfbe5991cbf" path="/var/lib/kubelet/pods/53dbb9f3-5011-4342-a4df-bcfbe5991cbf/volumes" Nov 27 17:11:10 crc kubenswrapper[4954]: I1127 17:11:10.677296 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="750c1d74-a850-4e62-9680-cd65e44a254c" path="/var/lib/kubelet/pods/750c1d74-a850-4e62-9680-cd65e44a254c/volumes" Nov 27 17:11:10 crc kubenswrapper[4954]: I1127 17:11:10.678020 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a8f7cd7-b71b-4fa5-a4fa-83a528b78177" path="/var/lib/kubelet/pods/7a8f7cd7-b71b-4fa5-a4fa-83a528b78177/volumes" Nov 27 17:11:10 crc kubenswrapper[4954]: I1127 17:11:10.679380 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81fb41af-f5ea-444d-aea7-a9b50124e2b4" path="/var/lib/kubelet/pods/81fb41af-f5ea-444d-aea7-a9b50124e2b4/volumes" Nov 27 17:11:28 crc kubenswrapper[4954]: I1127 17:11:28.596815 4954 generic.go:334] "Generic (PLEG): container finished" podID="59b766b5-12a6-4e9c-b627-3d7705a04afc" containerID="74d9b1601b72e831b8de80ffebfca4db6f3699b3a27fd60e8bfe2c29fc3082c4" exitCode=0 Nov 27 17:11:28 crc kubenswrapper[4954]: I1127 17:11:28.596900 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p9x6k" event={"ID":"59b766b5-12a6-4e9c-b627-3d7705a04afc","Type":"ContainerDied","Data":"74d9b1601b72e831b8de80ffebfca4db6f3699b3a27fd60e8bfe2c29fc3082c4"} Nov 27 17:11:30 crc kubenswrapper[4954]: I1127 17:11:30.089003 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p9x6k" Nov 27 17:11:30 crc kubenswrapper[4954]: I1127 17:11:30.195318 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nf64g\" (UniqueName: \"kubernetes.io/projected/59b766b5-12a6-4e9c-b627-3d7705a04afc-kube-api-access-nf64g\") pod \"59b766b5-12a6-4e9c-b627-3d7705a04afc\" (UID: \"59b766b5-12a6-4e9c-b627-3d7705a04afc\") " Nov 27 17:11:30 crc kubenswrapper[4954]: I1127 17:11:30.195632 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59b766b5-12a6-4e9c-b627-3d7705a04afc-inventory\") pod \"59b766b5-12a6-4e9c-b627-3d7705a04afc\" (UID: \"59b766b5-12a6-4e9c-b627-3d7705a04afc\") " Nov 27 17:11:30 crc kubenswrapper[4954]: I1127 17:11:30.195664 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59b766b5-12a6-4e9c-b627-3d7705a04afc-ssh-key\") pod \"59b766b5-12a6-4e9c-b627-3d7705a04afc\" (UID: \"59b766b5-12a6-4e9c-b627-3d7705a04afc\") " Nov 27 17:11:30 crc kubenswrapper[4954]: I1127 17:11:30.204440 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59b766b5-12a6-4e9c-b627-3d7705a04afc-kube-api-access-nf64g" (OuterVolumeSpecName: "kube-api-access-nf64g") pod "59b766b5-12a6-4e9c-b627-3d7705a04afc" (UID: "59b766b5-12a6-4e9c-b627-3d7705a04afc"). InnerVolumeSpecName "kube-api-access-nf64g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:11:30 crc kubenswrapper[4954]: I1127 17:11:30.231454 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59b766b5-12a6-4e9c-b627-3d7705a04afc-inventory" (OuterVolumeSpecName: "inventory") pod "59b766b5-12a6-4e9c-b627-3d7705a04afc" (UID: "59b766b5-12a6-4e9c-b627-3d7705a04afc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:11:30 crc kubenswrapper[4954]: I1127 17:11:30.232349 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59b766b5-12a6-4e9c-b627-3d7705a04afc-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "59b766b5-12a6-4e9c-b627-3d7705a04afc" (UID: "59b766b5-12a6-4e9c-b627-3d7705a04afc"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:11:30 crc kubenswrapper[4954]: I1127 17:11:30.298189 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nf64g\" (UniqueName: \"kubernetes.io/projected/59b766b5-12a6-4e9c-b627-3d7705a04afc-kube-api-access-nf64g\") on node \"crc\" DevicePath \"\"" Nov 27 17:11:30 crc kubenswrapper[4954]: I1127 17:11:30.298232 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59b766b5-12a6-4e9c-b627-3d7705a04afc-inventory\") on node \"crc\" DevicePath \"\"" Nov 27 17:11:30 crc kubenswrapper[4954]: I1127 17:11:30.298246 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59b766b5-12a6-4e9c-b627-3d7705a04afc-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 17:11:30 crc kubenswrapper[4954]: I1127 17:11:30.623721 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p9x6k" event={"ID":"59b766b5-12a6-4e9c-b627-3d7705a04afc","Type":"ContainerDied","Data":"8b01f96e68188420746371c957841cacb1582f76ab3a93a7a7238f3cfa174bbe"} Nov 27 17:11:30 crc kubenswrapper[4954]: I1127 17:11:30.623799 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b01f96e68188420746371c957841cacb1582f76ab3a93a7a7238f3cfa174bbe" Nov 27 17:11:30 crc kubenswrapper[4954]: I1127 17:11:30.623813 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p9x6k" Nov 27 17:11:30 crc kubenswrapper[4954]: I1127 17:11:30.722916 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-862k7"] Nov 27 17:11:30 crc kubenswrapper[4954]: E1127 17:11:30.723488 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59b766b5-12a6-4e9c-b627-3d7705a04afc" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Nov 27 17:11:30 crc kubenswrapper[4954]: I1127 17:11:30.723512 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="59b766b5-12a6-4e9c-b627-3d7705a04afc" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Nov 27 17:11:30 crc kubenswrapper[4954]: I1127 17:11:30.723933 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="59b766b5-12a6-4e9c-b627-3d7705a04afc" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Nov 27 17:11:30 crc kubenswrapper[4954]: I1127 17:11:30.724931 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-862k7" Nov 27 17:11:30 crc kubenswrapper[4954]: I1127 17:11:30.727598 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 17:11:30 crc kubenswrapper[4954]: I1127 17:11:30.728140 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 17:11:30 crc kubenswrapper[4954]: I1127 17:11:30.728175 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lnfbp" Nov 27 17:11:30 crc kubenswrapper[4954]: I1127 17:11:30.730524 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 17:11:30 crc kubenswrapper[4954]: I1127 17:11:30.738503 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-862k7"] Nov 27 17:11:30 crc kubenswrapper[4954]: I1127 17:11:30.807374 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/745fc0e0-ebc3-4a97-8858-148da2dbb20d-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-862k7\" (UID: \"745fc0e0-ebc3-4a97-8858-148da2dbb20d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-862k7" Nov 27 17:11:30 crc kubenswrapper[4954]: I1127 17:11:30.807557 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgtjs\" (UniqueName: \"kubernetes.io/projected/745fc0e0-ebc3-4a97-8858-148da2dbb20d-kube-api-access-fgtjs\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-862k7\" (UID: \"745fc0e0-ebc3-4a97-8858-148da2dbb20d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-862k7" Nov 27 17:11:30 crc kubenswrapper[4954]: I1127 17:11:30.808212 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/745fc0e0-ebc3-4a97-8858-148da2dbb20d-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-862k7\" (UID: \"745fc0e0-ebc3-4a97-8858-148da2dbb20d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-862k7" Nov 27 17:11:30 crc kubenswrapper[4954]: I1127 17:11:30.909538 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/745fc0e0-ebc3-4a97-8858-148da2dbb20d-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-862k7\" (UID: \"745fc0e0-ebc3-4a97-8858-148da2dbb20d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-862k7" Nov 27 17:11:30 crc kubenswrapper[4954]: I1127 17:11:30.909658 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgtjs\" (UniqueName: \"kubernetes.io/projected/745fc0e0-ebc3-4a97-8858-148da2dbb20d-kube-api-access-fgtjs\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-862k7\" (UID: \"745fc0e0-ebc3-4a97-8858-148da2dbb20d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-862k7" Nov 27 17:11:30 crc kubenswrapper[4954]: I1127 17:11:30.909746 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/745fc0e0-ebc3-4a97-8858-148da2dbb20d-inventory\") 
pod \"configure-network-edpm-deployment-openstack-edpm-ipam-862k7\" (UID: \"745fc0e0-ebc3-4a97-8858-148da2dbb20d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-862k7" Nov 27 17:11:30 crc kubenswrapper[4954]: I1127 17:11:30.915706 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/745fc0e0-ebc3-4a97-8858-148da2dbb20d-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-862k7\" (UID: \"745fc0e0-ebc3-4a97-8858-148da2dbb20d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-862k7" Nov 27 17:11:30 crc kubenswrapper[4954]: I1127 17:11:30.917558 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/745fc0e0-ebc3-4a97-8858-148da2dbb20d-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-862k7\" (UID: \"745fc0e0-ebc3-4a97-8858-148da2dbb20d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-862k7" Nov 27 17:11:30 crc kubenswrapper[4954]: I1127 17:11:30.939921 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgtjs\" (UniqueName: \"kubernetes.io/projected/745fc0e0-ebc3-4a97-8858-148da2dbb20d-kube-api-access-fgtjs\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-862k7\" (UID: \"745fc0e0-ebc3-4a97-8858-148da2dbb20d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-862k7" Nov 27 17:11:31 crc kubenswrapper[4954]: I1127 17:11:31.044617 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-862k7" Nov 27 17:11:31 crc kubenswrapper[4954]: I1127 17:11:31.665292 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-862k7"] Nov 27 17:11:31 crc kubenswrapper[4954]: W1127 17:11:31.676852 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod745fc0e0_ebc3_4a97_8858_148da2dbb20d.slice/crio-86e33bc2982f713455f7e03307b5150c739514e10558628d1730acabd6e98c1d WatchSource:0}: Error finding container 86e33bc2982f713455f7e03307b5150c739514e10558628d1730acabd6e98c1d: Status 404 returned error can't find the container with id 86e33bc2982f713455f7e03307b5150c739514e10558628d1730acabd6e98c1d Nov 27 17:11:32 crc kubenswrapper[4954]: I1127 17:11:32.641072 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-862k7" event={"ID":"745fc0e0-ebc3-4a97-8858-148da2dbb20d","Type":"ContainerStarted","Data":"5db9fcbcf2c1499d1c6486d64dce91bfe959fdf5fac508c8c638e19172dd5b0e"} Nov 27 17:11:32 crc kubenswrapper[4954]: I1127 17:11:32.641571 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-862k7" event={"ID":"745fc0e0-ebc3-4a97-8858-148da2dbb20d","Type":"ContainerStarted","Data":"86e33bc2982f713455f7e03307b5150c739514e10558628d1730acabd6e98c1d"} Nov 27 17:11:32 crc kubenswrapper[4954]: I1127 17:11:32.662425 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-862k7" podStartSLOduration=2.461777492 podStartE2EDuration="2.662242468s" podCreationTimestamp="2025-11-27 17:11:30 +0000 UTC" firstStartedPulling="2025-11-27 17:11:31.680128352 +0000 UTC 
m=+2003.697568652" lastFinishedPulling="2025-11-27 17:11:31.880593328 +0000 UTC m=+2003.898033628" observedRunningTime="2025-11-27 17:11:32.658009996 +0000 UTC m=+2004.675450296" watchObservedRunningTime="2025-11-27 17:11:32.662242468 +0000 UTC m=+2004.679682768" Nov 27 17:11:40 crc kubenswrapper[4954]: I1127 17:11:40.044352 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zsv66"] Nov 27 17:11:40 crc kubenswrapper[4954]: I1127 17:11:40.053666 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zsv66"] Nov 27 17:11:40 crc kubenswrapper[4954]: I1127 17:11:40.676474 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aef92b7c-4671-4ac0-9a4e-f76233eb4c8e" path="/var/lib/kubelet/pods/aef92b7c-4671-4ac0-9a4e-f76233eb4c8e/volumes" Nov 27 17:11:45 crc kubenswrapper[4954]: I1127 17:11:45.809475 4954 scope.go:117] "RemoveContainer" containerID="bd1d4a848e6006f105b3e13575985171c028af2dbffacf23bb139996cb1a193d" Nov 27 17:11:45 crc kubenswrapper[4954]: I1127 17:11:45.842274 4954 scope.go:117] "RemoveContainer" containerID="8c825753331d40abf0426ceb20fe0cb284f245f8271eac4c33f5ac20c4c710c9" Nov 27 17:11:45 crc kubenswrapper[4954]: I1127 17:11:45.880949 4954 scope.go:117] "RemoveContainer" containerID="93bdb24f947514315bcf3e987ce9bdc590ff9c36adf2e0f8088bb5039237603f" Nov 27 17:11:45 crc kubenswrapper[4954]: I1127 17:11:45.924026 4954 scope.go:117] "RemoveContainer" containerID="4ce23aac0304c81f40256141b44c41f821694dbae99e5ce9d8d082e60062581c" Nov 27 17:11:45 crc kubenswrapper[4954]: I1127 17:11:45.968442 4954 scope.go:117] "RemoveContainer" containerID="8b72a5b34aefc43205e6fe4ecb60c0b7db52ebe1820c50c915ac7d37e091f45b" Nov 27 17:11:46 crc kubenswrapper[4954]: I1127 17:11:46.015091 4954 scope.go:117] "RemoveContainer" containerID="8d8e3a9508023c2041d5b56d00f6ecb39ac3bfe4529a587f6bd42d04185d2f05" Nov 27 17:11:46 crc kubenswrapper[4954]: I1127 17:11:46.073061 4954 scope.go:117] "RemoveContainer" containerID="3160d2b7a178b69dc55df9577811ca0bcbb9832b7fce48c4efd35afc6c0e7dfa" Nov 27 17:12:04 crc kubenswrapper[4954]: I1127 17:12:04.061884 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-tw6zj"] Nov 27 17:12:04 crc kubenswrapper[4954]: I1127 17:12:04.076247 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-tw6zj"] Nov 27 17:12:04 crc kubenswrapper[4954]: I1127 17:12:04.701143 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec7940bb-124f-4c0f-b9fd-471a32e4c3ef" path="/var/lib/kubelet/pods/ec7940bb-124f-4c0f-b9fd-471a32e4c3ef/volumes" Nov 27 17:12:06 crc kubenswrapper[4954]: I1127 17:12:06.063191 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qxwbp"] Nov 27 17:12:06 crc kubenswrapper[4954]: I1127 17:12:06.079330 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qxwbp"] Nov 27 17:12:06 crc kubenswrapper[4954]: I1127 17:12:06.679936 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="167e2351-bc28-488d-86be-a3d038476c57" path="/var/lib/kubelet/pods/167e2351-bc28-488d-86be-a3d038476c57/volumes" Nov 27 17:12:46 crc kubenswrapper[4954]: I1127 17:12:46.209992 4954 scope.go:117] "RemoveContainer" containerID="ef3ce88a1727514fb33d40dec9dbd723fc71d9e5018cb2b7fd0e8aa3cc02eea1" Nov 27 17:12:46 crc kubenswrapper[4954]: I1127 17:12:46.280344 4954 scope.go:117] 
"RemoveContainer" containerID="778690843a6a8a382fe3b79b4f2d8c36249677f4882933416f90c8b59bed81bc" Nov 27 17:12:48 crc kubenswrapper[4954]: I1127 17:12:48.044799 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-5bh4g"] Nov 27 17:12:48 crc kubenswrapper[4954]: I1127 17:12:48.055894 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-5bh4g"] Nov 27 17:12:48 crc kubenswrapper[4954]: I1127 17:12:48.675613 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="396b3047-b624-43f0-9dc1-6c8ba6ffaf7b" path="/var/lib/kubelet/pods/396b3047-b624-43f0-9dc1-6c8ba6ffaf7b/volumes" Nov 27 17:12:57 crc kubenswrapper[4954]: I1127 17:12:57.501974 4954 generic.go:334] "Generic (PLEG): container finished" podID="745fc0e0-ebc3-4a97-8858-148da2dbb20d" containerID="5db9fcbcf2c1499d1c6486d64dce91bfe959fdf5fac508c8c638e19172dd5b0e" exitCode=0 Nov 27 17:12:57 crc kubenswrapper[4954]: I1127 17:12:57.502066 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-862k7" event={"ID":"745fc0e0-ebc3-4a97-8858-148da2dbb20d","Type":"ContainerDied","Data":"5db9fcbcf2c1499d1c6486d64dce91bfe959fdf5fac508c8c638e19172dd5b0e"} Nov 27 17:12:58 crc kubenswrapper[4954]: I1127 17:12:58.910686 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-862k7" Nov 27 17:12:59 crc kubenswrapper[4954]: I1127 17:12:59.021206 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/745fc0e0-ebc3-4a97-8858-148da2dbb20d-ssh-key\") pod \"745fc0e0-ebc3-4a97-8858-148da2dbb20d\" (UID: \"745fc0e0-ebc3-4a97-8858-148da2dbb20d\") " Nov 27 17:12:59 crc kubenswrapper[4954]: I1127 17:12:59.021372 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgtjs\" (UniqueName: \"kubernetes.io/projected/745fc0e0-ebc3-4a97-8858-148da2dbb20d-kube-api-access-fgtjs\") pod \"745fc0e0-ebc3-4a97-8858-148da2dbb20d\" (UID: \"745fc0e0-ebc3-4a97-8858-148da2dbb20d\") " Nov 27 17:12:59 crc kubenswrapper[4954]: I1127 17:12:59.021441 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/745fc0e0-ebc3-4a97-8858-148da2dbb20d-inventory\") pod \"745fc0e0-ebc3-4a97-8858-148da2dbb20d\" (UID: \"745fc0e0-ebc3-4a97-8858-148da2dbb20d\") " Nov 27 17:12:59 crc kubenswrapper[4954]: I1127 17:12:59.027630 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/745fc0e0-ebc3-4a97-8858-148da2dbb20d-kube-api-access-fgtjs" (OuterVolumeSpecName: "kube-api-access-fgtjs") pod "745fc0e0-ebc3-4a97-8858-148da2dbb20d" (UID: "745fc0e0-ebc3-4a97-8858-148da2dbb20d"). InnerVolumeSpecName "kube-api-access-fgtjs". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:12:59 crc kubenswrapper[4954]: I1127 17:12:59.049392 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/745fc0e0-ebc3-4a97-8858-148da2dbb20d-inventory" (OuterVolumeSpecName: "inventory") pod "745fc0e0-ebc3-4a97-8858-148da2dbb20d" (UID: "745fc0e0-ebc3-4a97-8858-148da2dbb20d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:12:59 crc kubenswrapper[4954]: I1127 17:12:59.052009 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/745fc0e0-ebc3-4a97-8858-148da2dbb20d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "745fc0e0-ebc3-4a97-8858-148da2dbb20d" (UID: "745fc0e0-ebc3-4a97-8858-148da2dbb20d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:12:59 crc kubenswrapper[4954]: I1127 17:12:59.123414 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/745fc0e0-ebc3-4a97-8858-148da2dbb20d-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 17:12:59 crc kubenswrapper[4954]: I1127 17:12:59.123451 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgtjs\" (UniqueName: \"kubernetes.io/projected/745fc0e0-ebc3-4a97-8858-148da2dbb20d-kube-api-access-fgtjs\") on node \"crc\" DevicePath \"\"" Nov 27 17:12:59 crc kubenswrapper[4954]: I1127 17:12:59.123462 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/745fc0e0-ebc3-4a97-8858-148da2dbb20d-inventory\") on node \"crc\" DevicePath \"\"" Nov 27 17:12:59 crc kubenswrapper[4954]: I1127 17:12:59.521294 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-862k7" event={"ID":"745fc0e0-ebc3-4a97-8858-148da2dbb20d","Type":"ContainerDied","Data":"86e33bc2982f713455f7e03307b5150c739514e10558628d1730acabd6e98c1d"} Nov 27 17:12:59 crc kubenswrapper[4954]: I1127 17:12:59.521720 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86e33bc2982f713455f7e03307b5150c739514e10558628d1730acabd6e98c1d" Nov 27 17:12:59 crc kubenswrapper[4954]: I1127 17:12:59.521360 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-862k7" Nov 27 17:12:59 crc kubenswrapper[4954]: I1127 17:12:59.628799 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b66qm"] Nov 27 17:12:59 crc kubenswrapper[4954]: E1127 17:12:59.629846 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="745fc0e0-ebc3-4a97-8858-148da2dbb20d" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 27 17:12:59 crc kubenswrapper[4954]: I1127 17:12:59.631967 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="745fc0e0-ebc3-4a97-8858-148da2dbb20d" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 27 17:12:59 crc kubenswrapper[4954]: I1127 17:12:59.632704 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="745fc0e0-ebc3-4a97-8858-148da2dbb20d" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 27 17:12:59 crc kubenswrapper[4954]: I1127 17:12:59.634427 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b66qm" Nov 27 17:12:59 crc kubenswrapper[4954]: I1127 17:12:59.637693 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 17:12:59 crc kubenswrapper[4954]: I1127 17:12:59.638160 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 17:12:59 crc kubenswrapper[4954]: I1127 17:12:59.638285 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 17:12:59 crc kubenswrapper[4954]: I1127 17:12:59.638697 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lnfbp" Nov 27 17:12:59 crc kubenswrapper[4954]: I1127 17:12:59.660422 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b66qm"] Nov 27 17:12:59 crc kubenswrapper[4954]: I1127 17:12:59.741841 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/655b8641-7aaf-4f45-b8a0-b23fbbfa3abd-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-b66qm\" (UID: \"655b8641-7aaf-4f45-b8a0-b23fbbfa3abd\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b66qm" Nov 27 17:12:59 crc kubenswrapper[4954]: I1127 17:12:59.742202 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcd89\" (UniqueName: \"kubernetes.io/projected/655b8641-7aaf-4f45-b8a0-b23fbbfa3abd-kube-api-access-wcd89\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-b66qm\" (UID: \"655b8641-7aaf-4f45-b8a0-b23fbbfa3abd\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b66qm" Nov 27 17:12:59 crc kubenswrapper[4954]: I1127 17:12:59.742254 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/655b8641-7aaf-4f45-b8a0-b23fbbfa3abd-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-b66qm\" (UID: \"655b8641-7aaf-4f45-b8a0-b23fbbfa3abd\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b66qm" Nov 27 17:12:59 crc kubenswrapper[4954]: I1127 17:12:59.844131 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcd89\" (UniqueName: \"kubernetes.io/projected/655b8641-7aaf-4f45-b8a0-b23fbbfa3abd-kube-api-access-wcd89\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-b66qm\" (UID: \"655b8641-7aaf-4f45-b8a0-b23fbbfa3abd\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b66qm" Nov 27 17:12:59 crc kubenswrapper[4954]: I1127 17:12:59.844179 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/655b8641-7aaf-4f45-b8a0-b23fbbfa3abd-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-b66qm\" (UID: \"655b8641-7aaf-4f45-b8a0-b23fbbfa3abd\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b66qm" Nov 27 17:12:59 crc kubenswrapper[4954]: I1127 17:12:59.844230 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/655b8641-7aaf-4f45-b8a0-b23fbbfa3abd-inventory\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-b66qm\" (UID: \"655b8641-7aaf-4f45-b8a0-b23fbbfa3abd\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b66qm" Nov 27 17:12:59 crc kubenswrapper[4954]: I1127 17:12:59.848666 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/655b8641-7aaf-4f45-b8a0-b23fbbfa3abd-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-b66qm\" (UID: \"655b8641-7aaf-4f45-b8a0-b23fbbfa3abd\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b66qm" Nov 27 17:12:59 crc kubenswrapper[4954]: I1127 17:12:59.859409 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/655b8641-7aaf-4f45-b8a0-b23fbbfa3abd-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-b66qm\" (UID: \"655b8641-7aaf-4f45-b8a0-b23fbbfa3abd\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b66qm" Nov 27 17:12:59 crc kubenswrapper[4954]: I1127 17:12:59.863861 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcd89\" (UniqueName: \"kubernetes.io/projected/655b8641-7aaf-4f45-b8a0-b23fbbfa3abd-kube-api-access-wcd89\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-b66qm\" (UID: \"655b8641-7aaf-4f45-b8a0-b23fbbfa3abd\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b66qm" Nov 27 17:12:59 crc kubenswrapper[4954]: I1127 17:12:59.954148 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b66qm" Nov 27 17:13:00 crc kubenswrapper[4954]: I1127 17:13:00.509709 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b66qm"] Nov 27 17:13:00 crc kubenswrapper[4954]: I1127 17:13:00.532390 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b66qm" event={"ID":"655b8641-7aaf-4f45-b8a0-b23fbbfa3abd","Type":"ContainerStarted","Data":"8e3a7c209eedd9a32463344899ec06daf010ca80d90cb9dfa3eb23a15e0a59b5"} Nov 27 17:13:01 crc kubenswrapper[4954]: I1127 17:13:01.544269 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b66qm" event={"ID":"655b8641-7aaf-4f45-b8a0-b23fbbfa3abd","Type":"ContainerStarted","Data":"708a85abeea323751f0f2debce72b37bd7710531cf02b0c2b304f1d330136313"} Nov 27 17:13:01 crc kubenswrapper[4954]: I1127 17:13:01.562868 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b66qm" podStartSLOduration=2.384415221 podStartE2EDuration="2.562855332s" podCreationTimestamp="2025-11-27 17:12:59 +0000 UTC" firstStartedPulling="2025-11-27 17:13:00.514422037 +0000 UTC m=+2092.531862337" lastFinishedPulling="2025-11-27 17:13:00.692862158 +0000 UTC m=+2092.710302448" observedRunningTime="2025-11-27 17:13:01.562291439 +0000 UTC m=+2093.579731739" watchObservedRunningTime="2025-11-27 17:13:01.562855332 +0000 UTC m=+2093.580295632" Nov 27 17:13:05 crc kubenswrapper[4954]: I1127 17:13:05.145145 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-c2rbr"] Nov 27 17:13:05 crc kubenswrapper[4954]: I1127 17:13:05.147951 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c2rbr" Nov 27 17:13:05 crc kubenswrapper[4954]: I1127 17:13:05.166046 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c2rbr"] Nov 27 17:13:05 crc kubenswrapper[4954]: I1127 17:13:05.254762 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f86bcf10-2e30-45ae-9b77-2bdedde3572e-catalog-content\") pod \"redhat-operators-c2rbr\" (UID: \"f86bcf10-2e30-45ae-9b77-2bdedde3572e\") " pod="openshift-marketplace/redhat-operators-c2rbr" Nov 27 17:13:05 crc kubenswrapper[4954]: I1127 17:13:05.255204 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq484\" (UniqueName: \"kubernetes.io/projected/f86bcf10-2e30-45ae-9b77-2bdedde3572e-kube-api-access-nq484\") pod \"redhat-operators-c2rbr\" (UID: \"f86bcf10-2e30-45ae-9b77-2bdedde3572e\") " pod="openshift-marketplace/redhat-operators-c2rbr" Nov 27 17:13:05 crc kubenswrapper[4954]: I1127 17:13:05.255482 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f86bcf10-2e30-45ae-9b77-2bdedde3572e-utilities\") pod \"redhat-operators-c2rbr\" (UID: \"f86bcf10-2e30-45ae-9b77-2bdedde3572e\") " pod="openshift-marketplace/redhat-operators-c2rbr" Nov 27 17:13:05 crc kubenswrapper[4954]: I1127 17:13:05.357375 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f86bcf10-2e30-45ae-9b77-2bdedde3572e-catalog-content\") pod \"redhat-operators-c2rbr\" (UID: \"f86bcf10-2e30-45ae-9b77-2bdedde3572e\") " pod="openshift-marketplace/redhat-operators-c2rbr" Nov 27 17:13:05 crc kubenswrapper[4954]: I1127 17:13:05.357440 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq484\" (UniqueName: \"kubernetes.io/projected/f86bcf10-2e30-45ae-9b77-2bdedde3572e-kube-api-access-nq484\") pod \"redhat-operators-c2rbr\" (UID: \"f86bcf10-2e30-45ae-9b77-2bdedde3572e\") " pod="openshift-marketplace/redhat-operators-c2rbr" Nov 27 17:13:05 crc kubenswrapper[4954]: I1127 17:13:05.357544 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f86bcf10-2e30-45ae-9b77-2bdedde3572e-utilities\") pod \"redhat-operators-c2rbr\" (UID: \"f86bcf10-2e30-45ae-9b77-2bdedde3572e\") " pod="openshift-marketplace/redhat-operators-c2rbr" Nov 27 17:13:05 crc kubenswrapper[4954]: I1127 17:13:05.357850 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f86bcf10-2e30-45ae-9b77-2bdedde3572e-catalog-content\") pod \"redhat-operators-c2rbr\" (UID: \"f86bcf10-2e30-45ae-9b77-2bdedde3572e\") " pod="openshift-marketplace/redhat-operators-c2rbr" Nov 27 17:13:05 crc kubenswrapper[4954]: I1127 17:13:05.358013 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f86bcf10-2e30-45ae-9b77-2bdedde3572e-utilities\") pod \"redhat-operators-c2rbr\" (UID: \"f86bcf10-2e30-45ae-9b77-2bdedde3572e\") " pod="openshift-marketplace/redhat-operators-c2rbr" Nov 27 17:13:05 crc kubenswrapper[4954]: I1127 17:13:05.379761 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-nq484\" (UniqueName: \"kubernetes.io/projected/f86bcf10-2e30-45ae-9b77-2bdedde3572e-kube-api-access-nq484\") pod \"redhat-operators-c2rbr\" (UID: \"f86bcf10-2e30-45ae-9b77-2bdedde3572e\") " pod="openshift-marketplace/redhat-operators-c2rbr" Nov 27 17:13:05 crc kubenswrapper[4954]: I1127 17:13:05.482217 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c2rbr" Nov 27 17:13:05 crc kubenswrapper[4954]: I1127 17:13:05.975746 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c2rbr"] Nov 27 17:13:06 crc kubenswrapper[4954]: I1127 17:13:06.593306 4954 generic.go:334] "Generic (PLEG): container finished" podID="f86bcf10-2e30-45ae-9b77-2bdedde3572e" containerID="bfd89e13809ab4c07aef2bfcc8ad6dbd9b1aab8c589229d802b829341617c60e" exitCode=0 Nov 27 17:13:06 crc kubenswrapper[4954]: I1127 17:13:06.593707 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2rbr" event={"ID":"f86bcf10-2e30-45ae-9b77-2bdedde3572e","Type":"ContainerDied","Data":"bfd89e13809ab4c07aef2bfcc8ad6dbd9b1aab8c589229d802b829341617c60e"} Nov 27 17:13:06 crc kubenswrapper[4954]: I1127 17:13:06.593735 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2rbr" event={"ID":"f86bcf10-2e30-45ae-9b77-2bdedde3572e","Type":"ContainerStarted","Data":"812b92189786f6b1b0866b9a146e4236664622efb548a63dc50067b3b0b804b9"} Nov 27 17:13:06 crc kubenswrapper[4954]: I1127 17:13:06.596888 4954 generic.go:334] "Generic (PLEG): container finished" podID="655b8641-7aaf-4f45-b8a0-b23fbbfa3abd" containerID="708a85abeea323751f0f2debce72b37bd7710531cf02b0c2b304f1d330136313" exitCode=0 Nov 27 17:13:06 crc kubenswrapper[4954]: I1127 17:13:06.596946 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b66qm" event={"ID":"655b8641-7aaf-4f45-b8a0-b23fbbfa3abd","Type":"ContainerDied","Data":"708a85abeea323751f0f2debce72b37bd7710531cf02b0c2b304f1d330136313"} Nov 27 17:13:07 crc kubenswrapper[4954]: I1127 17:13:07.610146 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2rbr" event={"ID":"f86bcf10-2e30-45ae-9b77-2bdedde3572e","Type":"ContainerStarted","Data":"70fe51a162d1558afbad8f702ae66a513872acead002ec5ee9a6f3005674d7af"} Nov 27 17:13:08 crc kubenswrapper[4954]: I1127 17:13:08.012500 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b66qm" Nov 27 17:13:08 crc kubenswrapper[4954]: I1127 17:13:08.117870 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/655b8641-7aaf-4f45-b8a0-b23fbbfa3abd-ssh-key\") pod \"655b8641-7aaf-4f45-b8a0-b23fbbfa3abd\" (UID: \"655b8641-7aaf-4f45-b8a0-b23fbbfa3abd\") " Nov 27 17:13:08 crc kubenswrapper[4954]: I1127 17:13:08.117990 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/655b8641-7aaf-4f45-b8a0-b23fbbfa3abd-inventory\") pod \"655b8641-7aaf-4f45-b8a0-b23fbbfa3abd\" (UID: \"655b8641-7aaf-4f45-b8a0-b23fbbfa3abd\") " Nov 27 17:13:08 crc kubenswrapper[4954]: I1127 17:13:08.118201 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcd89\" (UniqueName: \"kubernetes.io/projected/655b8641-7aaf-4f45-b8a0-b23fbbfa3abd-kube-api-access-wcd89\") pod \"655b8641-7aaf-4f45-b8a0-b23fbbfa3abd\" (UID: \"655b8641-7aaf-4f45-b8a0-b23fbbfa3abd\") " Nov 27 17:13:08 crc kubenswrapper[4954]: I1127 17:13:08.129231 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/655b8641-7aaf-4f45-b8a0-b23fbbfa3abd-kube-api-access-wcd89" (OuterVolumeSpecName: "kube-api-access-wcd89") pod "655b8641-7aaf-4f45-b8a0-b23fbbfa3abd" (UID: "655b8641-7aaf-4f45-b8a0-b23fbbfa3abd"). InnerVolumeSpecName "kube-api-access-wcd89". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:13:08 crc kubenswrapper[4954]: I1127 17:13:08.146933 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/655b8641-7aaf-4f45-b8a0-b23fbbfa3abd-inventory" (OuterVolumeSpecName: "inventory") pod "655b8641-7aaf-4f45-b8a0-b23fbbfa3abd" (UID: "655b8641-7aaf-4f45-b8a0-b23fbbfa3abd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:13:08 crc kubenswrapper[4954]: I1127 17:13:08.147471 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/655b8641-7aaf-4f45-b8a0-b23fbbfa3abd-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "655b8641-7aaf-4f45-b8a0-b23fbbfa3abd" (UID: "655b8641-7aaf-4f45-b8a0-b23fbbfa3abd"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:13:08 crc kubenswrapper[4954]: I1127 17:13:08.221300 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcd89\" (UniqueName: \"kubernetes.io/projected/655b8641-7aaf-4f45-b8a0-b23fbbfa3abd-kube-api-access-wcd89\") on node \"crc\" DevicePath \"\"" Nov 27 17:13:08 crc kubenswrapper[4954]: I1127 17:13:08.221349 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/655b8641-7aaf-4f45-b8a0-b23fbbfa3abd-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 17:13:08 crc kubenswrapper[4954]: I1127 17:13:08.221362 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/655b8641-7aaf-4f45-b8a0-b23fbbfa3abd-inventory\") on node \"crc\" DevicePath \"\"" Nov 27 17:13:08 crc kubenswrapper[4954]: I1127 17:13:08.620194 4954 generic.go:334] "Generic (PLEG): container finished" podID="f86bcf10-2e30-45ae-9b77-2bdedde3572e" containerID="70fe51a162d1558afbad8f702ae66a513872acead002ec5ee9a6f3005674d7af" exitCode=0 Nov 27 17:13:08 crc kubenswrapper[4954]: I1127 17:13:08.620250 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2rbr" event={"ID":"f86bcf10-2e30-45ae-9b77-2bdedde3572e","Type":"ContainerDied","Data":"70fe51a162d1558afbad8f702ae66a513872acead002ec5ee9a6f3005674d7af"} Nov 27 17:13:08 crc kubenswrapper[4954]: I1127 17:13:08.621990 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b66qm" event={"ID":"655b8641-7aaf-4f45-b8a0-b23fbbfa3abd","Type":"ContainerDied","Data":"8e3a7c209eedd9a32463344899ec06daf010ca80d90cb9dfa3eb23a15e0a59b5"} Nov 27 17:13:08 crc kubenswrapper[4954]: I1127 17:13:08.622024 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e3a7c209eedd9a32463344899ec06daf010ca80d90cb9dfa3eb23a15e0a59b5" Nov 27 17:13:08 crc kubenswrapper[4954]: I1127 17:13:08.622104 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b66qm" Nov 27 17:13:08 crc kubenswrapper[4954]: I1127 17:13:08.727937 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-5sdh6"] Nov 27 17:13:08 crc kubenswrapper[4954]: E1127 17:13:08.728752 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="655b8641-7aaf-4f45-b8a0-b23fbbfa3abd" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 27 17:13:08 crc kubenswrapper[4954]: I1127 17:13:08.728772 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="655b8641-7aaf-4f45-b8a0-b23fbbfa3abd" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 27 17:13:08 crc kubenswrapper[4954]: I1127 17:13:08.729056 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="655b8641-7aaf-4f45-b8a0-b23fbbfa3abd" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 27 17:13:08 crc kubenswrapper[4954]: I1127 17:13:08.730006 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5sdh6" Nov 27 17:13:08 crc kubenswrapper[4954]: I1127 17:13:08.734655 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 17:13:08 crc kubenswrapper[4954]: I1127 17:13:08.736369 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lnfbp" Nov 27 17:13:08 crc kubenswrapper[4954]: I1127 17:13:08.736443 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 17:13:08 crc kubenswrapper[4954]: I1127 17:13:08.736538 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 17:13:08 crc kubenswrapper[4954]: I1127 17:13:08.758640 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-5sdh6"] Nov 27 17:13:08 crc kubenswrapper[4954]: I1127 17:13:08.834224 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vhll\" (UniqueName: \"kubernetes.io/projected/d7832bff-0ac7-4654-8277-92b9d5c04aa0-kube-api-access-5vhll\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5sdh6\" (UID: \"d7832bff-0ac7-4654-8277-92b9d5c04aa0\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5sdh6" Nov 27 17:13:08 crc kubenswrapper[4954]: I1127 17:13:08.834315 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d7832bff-0ac7-4654-8277-92b9d5c04aa0-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5sdh6\" (UID: \"d7832bff-0ac7-4654-8277-92b9d5c04aa0\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5sdh6" Nov 27 17:13:08 crc kubenswrapper[4954]: I1127 17:13:08.834339 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7832bff-0ac7-4654-8277-92b9d5c04aa0-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5sdh6\" (UID: \"d7832bff-0ac7-4654-8277-92b9d5c04aa0\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5sdh6" Nov 27 17:13:08 crc kubenswrapper[4954]: I1127 17:13:08.937139 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vhll\" (UniqueName: \"kubernetes.io/projected/d7832bff-0ac7-4654-8277-92b9d5c04aa0-kube-api-access-5vhll\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5sdh6\" (UID: \"d7832bff-0ac7-4654-8277-92b9d5c04aa0\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5sdh6" Nov 27 17:13:08 crc kubenswrapper[4954]: I1127 17:13:08.937286 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d7832bff-0ac7-4654-8277-92b9d5c04aa0-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5sdh6\" (UID: \"d7832bff-0ac7-4654-8277-92b9d5c04aa0\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5sdh6" Nov 27 17:13:08 crc kubenswrapper[4954]: I1127 17:13:08.937317 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7832bff-0ac7-4654-8277-92b9d5c04aa0-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5sdh6\" (UID: 
\"d7832bff-0ac7-4654-8277-92b9d5c04aa0\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5sdh6" Nov 27 17:13:08 crc kubenswrapper[4954]: I1127 17:13:08.942869 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d7832bff-0ac7-4654-8277-92b9d5c04aa0-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5sdh6\" (UID: \"d7832bff-0ac7-4654-8277-92b9d5c04aa0\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5sdh6" Nov 27 17:13:08 crc kubenswrapper[4954]: I1127 17:13:08.950597 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7832bff-0ac7-4654-8277-92b9d5c04aa0-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5sdh6\" (UID: \"d7832bff-0ac7-4654-8277-92b9d5c04aa0\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5sdh6" Nov 27 17:13:08 crc kubenswrapper[4954]: I1127 17:13:08.954236 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vhll\" (UniqueName: \"kubernetes.io/projected/d7832bff-0ac7-4654-8277-92b9d5c04aa0-kube-api-access-5vhll\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5sdh6\" (UID: \"d7832bff-0ac7-4654-8277-92b9d5c04aa0\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5sdh6" Nov 27 17:13:09 crc kubenswrapper[4954]: I1127 17:13:09.062490 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5sdh6" Nov 27 17:13:09 crc kubenswrapper[4954]: I1127 17:13:09.601065 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-5sdh6"] Nov 27 17:13:09 crc kubenswrapper[4954]: W1127 17:13:09.602150 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7832bff_0ac7_4654_8277_92b9d5c04aa0.slice/crio-01da3513961a031cdbe3b87611914a47a8905312f59689b650510e9d49a97ab0 WatchSource:0}: Error finding container 01da3513961a031cdbe3b87611914a47a8905312f59689b650510e9d49a97ab0: Status 404 returned error can't find the container with id 01da3513961a031cdbe3b87611914a47a8905312f59689b650510e9d49a97ab0 Nov 27 17:13:09 crc kubenswrapper[4954]: I1127 17:13:09.636389 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2rbr" event={"ID":"f86bcf10-2e30-45ae-9b77-2bdedde3572e","Type":"ContainerStarted","Data":"75af0caf5c6eaeb48e95e6d87fc3c736600b7352020742b98445312ce7ec9f99"} Nov 27 17:13:09 crc kubenswrapper[4954]: I1127 17:13:09.638926 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5sdh6" event={"ID":"d7832bff-0ac7-4654-8277-92b9d5c04aa0","Type":"ContainerStarted","Data":"01da3513961a031cdbe3b87611914a47a8905312f59689b650510e9d49a97ab0"} Nov 27 17:13:09 crc kubenswrapper[4954]: I1127 17:13:09.660928 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-c2rbr" podStartSLOduration=2.044086179 podStartE2EDuration="4.660896306s" podCreationTimestamp="2025-11-27 17:13:05 +0000 UTC" firstStartedPulling="2025-11-27 17:13:06.595923696 +0000 UTC m=+2098.613363996" lastFinishedPulling="2025-11-27 17:13:09.212733833 +0000 UTC m=+2101.230174123" observedRunningTime="2025-11-27 17:13:09.655803063 +0000 UTC m=+2101.673243363" 
watchObservedRunningTime="2025-11-27 17:13:09.660896306 +0000 UTC m=+2101.678336616" Nov 27 17:13:10 crc kubenswrapper[4954]: I1127 17:13:10.649714 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5sdh6" event={"ID":"d7832bff-0ac7-4654-8277-92b9d5c04aa0","Type":"ContainerStarted","Data":"371cb812f1418d3773f6ae0ff6fc690a18aa913cdc7d4d63d90b76f8bff65dac"} Nov 27 17:13:10 crc kubenswrapper[4954]: I1127 17:13:10.668124 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5sdh6" podStartSLOduration=2.491005952 podStartE2EDuration="2.668091248s" podCreationTimestamp="2025-11-27 17:13:08 +0000 UTC" firstStartedPulling="2025-11-27 17:13:09.604679205 +0000 UTC m=+2101.622119505" lastFinishedPulling="2025-11-27 17:13:09.781764491 +0000 UTC m=+2101.799204801" observedRunningTime="2025-11-27 17:13:10.664231746 +0000 UTC m=+2102.681672046" watchObservedRunningTime="2025-11-27 17:13:10.668091248 +0000 UTC m=+2102.685531558" Nov 27 17:13:15 crc kubenswrapper[4954]: I1127 17:13:15.483091 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-c2rbr" Nov 27 17:13:15 crc kubenswrapper[4954]: I1127 17:13:15.483660 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-c2rbr" Nov 27 17:13:15 crc kubenswrapper[4954]: I1127 17:13:15.529674 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-c2rbr" Nov 27 17:13:15 crc kubenswrapper[4954]: I1127 17:13:15.761670 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-c2rbr" Nov 27 17:13:16 crc kubenswrapper[4954]: I1127 17:13:16.753441 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c2rbr"] Nov 27 17:13:17 crc kubenswrapper[4954]: I1127 17:13:17.722773 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-c2rbr" podUID="f86bcf10-2e30-45ae-9b77-2bdedde3572e" containerName="registry-server" containerID="cri-o://75af0caf5c6eaeb48e95e6d87fc3c736600b7352020742b98445312ce7ec9f99" gracePeriod=2 Nov 27 17:13:18 crc kubenswrapper[4954]: I1127 17:13:18.251569 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c2rbr" Nov 27 17:13:18 crc kubenswrapper[4954]: I1127 17:13:18.415135 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f86bcf10-2e30-45ae-9b77-2bdedde3572e-catalog-content\") pod \"f86bcf10-2e30-45ae-9b77-2bdedde3572e\" (UID: \"f86bcf10-2e30-45ae-9b77-2bdedde3572e\") " Nov 27 17:13:18 crc kubenswrapper[4954]: I1127 17:13:18.415217 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nq484\" (UniqueName: \"kubernetes.io/projected/f86bcf10-2e30-45ae-9b77-2bdedde3572e-kube-api-access-nq484\") pod \"f86bcf10-2e30-45ae-9b77-2bdedde3572e\" (UID: \"f86bcf10-2e30-45ae-9b77-2bdedde3572e\") " Nov 27 17:13:18 crc kubenswrapper[4954]: I1127 17:13:18.415404 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f86bcf10-2e30-45ae-9b77-2bdedde3572e-utilities\") pod \"f86bcf10-2e30-45ae-9b77-2bdedde3572e\" (UID: \"f86bcf10-2e30-45ae-9b77-2bdedde3572e\") " Nov 27 17:13:18 crc kubenswrapper[4954]: I1127 17:13:18.417084 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f86bcf10-2e30-45ae-9b77-2bdedde3572e-utilities" (OuterVolumeSpecName: "utilities") pod "f86bcf10-2e30-45ae-9b77-2bdedde3572e" (UID: "f86bcf10-2e30-45ae-9b77-2bdedde3572e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:13:18 crc kubenswrapper[4954]: I1127 17:13:18.421730 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f86bcf10-2e30-45ae-9b77-2bdedde3572e-kube-api-access-nq484" (OuterVolumeSpecName: "kube-api-access-nq484") pod "f86bcf10-2e30-45ae-9b77-2bdedde3572e" (UID: "f86bcf10-2e30-45ae-9b77-2bdedde3572e"). InnerVolumeSpecName "kube-api-access-nq484". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:13:18 crc kubenswrapper[4954]: I1127 17:13:18.518225 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f86bcf10-2e30-45ae-9b77-2bdedde3572e-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 17:13:18 crc kubenswrapper[4954]: I1127 17:13:18.518277 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nq484\" (UniqueName: \"kubernetes.io/projected/f86bcf10-2e30-45ae-9b77-2bdedde3572e-kube-api-access-nq484\") on node \"crc\" DevicePath \"\"" Nov 27 17:13:18 crc kubenswrapper[4954]: I1127 17:13:18.524464 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f86bcf10-2e30-45ae-9b77-2bdedde3572e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f86bcf10-2e30-45ae-9b77-2bdedde3572e" (UID: "f86bcf10-2e30-45ae-9b77-2bdedde3572e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:13:18 crc kubenswrapper[4954]: I1127 17:13:18.619969 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f86bcf10-2e30-45ae-9b77-2bdedde3572e-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 17:13:18 crc kubenswrapper[4954]: I1127 17:13:18.735953 4954 generic.go:334] "Generic (PLEG): container finished" podID="f86bcf10-2e30-45ae-9b77-2bdedde3572e" containerID="75af0caf5c6eaeb48e95e6d87fc3c736600b7352020742b98445312ce7ec9f99" exitCode=0 Nov 27 17:13:18 crc kubenswrapper[4954]: I1127 17:13:18.736129 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c2rbr" Nov 27 17:13:18 crc kubenswrapper[4954]: I1127 17:13:18.736335 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2rbr" event={"ID":"f86bcf10-2e30-45ae-9b77-2bdedde3572e","Type":"ContainerDied","Data":"75af0caf5c6eaeb48e95e6d87fc3c736600b7352020742b98445312ce7ec9f99"} Nov 27 17:13:18 crc kubenswrapper[4954]: I1127 17:13:18.736446 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2rbr" event={"ID":"f86bcf10-2e30-45ae-9b77-2bdedde3572e","Type":"ContainerDied","Data":"812b92189786f6b1b0866b9a146e4236664622efb548a63dc50067b3b0b804b9"} Nov 27 17:13:18 crc kubenswrapper[4954]: I1127 17:13:18.736572 4954 scope.go:117] "RemoveContainer" containerID="75af0caf5c6eaeb48e95e6d87fc3c736600b7352020742b98445312ce7ec9f99" Nov 27 17:13:18 crc kubenswrapper[4954]: I1127 17:13:18.772637 4954 scope.go:117] "RemoveContainer" containerID="70fe51a162d1558afbad8f702ae66a513872acead002ec5ee9a6f3005674d7af" Nov 27 17:13:18 crc kubenswrapper[4954]: I1127 17:13:18.776730 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c2rbr"] Nov 27 17:13:18 crc kubenswrapper[4954]: I1127 17:13:18.793679 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-c2rbr"] Nov 27 17:13:18 crc kubenswrapper[4954]: I1127 17:13:18.798761 4954 scope.go:117] "RemoveContainer" containerID="bfd89e13809ab4c07aef2bfcc8ad6dbd9b1aab8c589229d802b829341617c60e" Nov 27 17:13:18 crc kubenswrapper[4954]: I1127 17:13:18.870819 4954 scope.go:117] "RemoveContainer" containerID="75af0caf5c6eaeb48e95e6d87fc3c736600b7352020742b98445312ce7ec9f99" Nov 27 17:13:18 crc kubenswrapper[4954]: E1127 17:13:18.871649 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75af0caf5c6eaeb48e95e6d87fc3c736600b7352020742b98445312ce7ec9f99\": container with ID starting with 75af0caf5c6eaeb48e95e6d87fc3c736600b7352020742b98445312ce7ec9f99 not found: ID does not exist" containerID="75af0caf5c6eaeb48e95e6d87fc3c736600b7352020742b98445312ce7ec9f99" Nov 27 17:13:18 crc kubenswrapper[4954]: I1127 17:13:18.871702 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75af0caf5c6eaeb48e95e6d87fc3c736600b7352020742b98445312ce7ec9f99"} err="failed to get container status \"75af0caf5c6eaeb48e95e6d87fc3c736600b7352020742b98445312ce7ec9f99\": rpc error: code = NotFound desc = could not find container \"75af0caf5c6eaeb48e95e6d87fc3c736600b7352020742b98445312ce7ec9f99\": container with ID starting with 75af0caf5c6eaeb48e95e6d87fc3c736600b7352020742b98445312ce7ec9f99 not found: ID does not exist" Nov 27 17:13:18 crc 
kubenswrapper[4954]: I1127 17:13:18.871748 4954 scope.go:117] "RemoveContainer" containerID="70fe51a162d1558afbad8f702ae66a513872acead002ec5ee9a6f3005674d7af" Nov 27 17:13:18 crc kubenswrapper[4954]: E1127 17:13:18.872250 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70fe51a162d1558afbad8f702ae66a513872acead002ec5ee9a6f3005674d7af\": container with ID starting with 70fe51a162d1558afbad8f702ae66a513872acead002ec5ee9a6f3005674d7af not found: ID does not exist" containerID="70fe51a162d1558afbad8f702ae66a513872acead002ec5ee9a6f3005674d7af" Nov 27 17:13:18 crc kubenswrapper[4954]: I1127 17:13:18.872375 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70fe51a162d1558afbad8f702ae66a513872acead002ec5ee9a6f3005674d7af"} err="failed to get container status \"70fe51a162d1558afbad8f702ae66a513872acead002ec5ee9a6f3005674d7af\": rpc error: code = NotFound desc = could not find container \"70fe51a162d1558afbad8f702ae66a513872acead002ec5ee9a6f3005674d7af\": container with ID starting with 70fe51a162d1558afbad8f702ae66a513872acead002ec5ee9a6f3005674d7af not found: ID does not exist" Nov 27 17:13:18 crc kubenswrapper[4954]: I1127 17:13:18.872478 4954 scope.go:117] "RemoveContainer" containerID="bfd89e13809ab4c07aef2bfcc8ad6dbd9b1aab8c589229d802b829341617c60e" Nov 27 17:13:18 crc kubenswrapper[4954]: E1127 17:13:18.872917 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfd89e13809ab4c07aef2bfcc8ad6dbd9b1aab8c589229d802b829341617c60e\": container with ID starting with bfd89e13809ab4c07aef2bfcc8ad6dbd9b1aab8c589229d802b829341617c60e not found: ID does not exist" containerID="bfd89e13809ab4c07aef2bfcc8ad6dbd9b1aab8c589229d802b829341617c60e" Nov 27 17:13:18 crc kubenswrapper[4954]: I1127 17:13:18.872944 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfd89e13809ab4c07aef2bfcc8ad6dbd9b1aab8c589229d802b829341617c60e"} err="failed to get container status \"bfd89e13809ab4c07aef2bfcc8ad6dbd9b1aab8c589229d802b829341617c60e\": rpc error: code = NotFound desc = could not find container \"bfd89e13809ab4c07aef2bfcc8ad6dbd9b1aab8c589229d802b829341617c60e\": container with ID starting with bfd89e13809ab4c07aef2bfcc8ad6dbd9b1aab8c589229d802b829341617c60e not found: ID does not exist" Nov 27 17:13:20 crc kubenswrapper[4954]: I1127 17:13:20.673102 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f86bcf10-2e30-45ae-9b77-2bdedde3572e" path="/var/lib/kubelet/pods/f86bcf10-2e30-45ae-9b77-2bdedde3572e/volumes" Nov 27 17:13:23 crc kubenswrapper[4954]: I1127 17:13:23.687751 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 17:13:23 crc kubenswrapper[4954]: I1127 17:13:23.688313 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 17:13:46 crc kubenswrapper[4954]: I1127 17:13:46.426192 4954 scope.go:117] "RemoveContainer" 
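[Editor's note] The three E-level ContainerStatus/DeleteContainer failures above all carry code = NotFound for containers the kubelet had itself just removed. This looks like the usual benign race between repeated scope.go RemoveContainer calls and containers already gone from CRI-O, not a runtime fault. A filter sketch for separating that noise from other runtime errors; the E1127 severity marker is specific to these Nov 27 logs, and the pattern is an assumption based on the message text here:

```python
import re

# "code = NotFound ... container with ID starting with <hex>" as seen above.
NOT_FOUND_RE = re.compile(
    r"code = NotFound.*container with ID starting with ([0-9a-f]{12})")

def split_runtime_errors(lines):
    """Split E-level runtime-service errors into benign NotFound races vs the rest."""
    benign, other = [], []
    for line in lines:
        if " E1127 " not in line or "runtime service failed" not in line:
            continue
        (benign if NOT_FOUND_RE.search(line) else other).append(line)
    return benign, other
```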
containerID="f32f179ed7c02cdbc2c6dbb8c6b36316cf799405042c286785ca98bf2a4c25ba" Nov 27 17:13:49 crc kubenswrapper[4954]: I1127 17:13:49.007025 4954 generic.go:334] "Generic (PLEG): container finished" podID="d7832bff-0ac7-4654-8277-92b9d5c04aa0" containerID="371cb812f1418d3773f6ae0ff6fc690a18aa913cdc7d4d63d90b76f8bff65dac" exitCode=0 Nov 27 17:13:49 crc kubenswrapper[4954]: I1127 17:13:49.007226 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5sdh6" event={"ID":"d7832bff-0ac7-4654-8277-92b9d5c04aa0","Type":"ContainerDied","Data":"371cb812f1418d3773f6ae0ff6fc690a18aa913cdc7d4d63d90b76f8bff65dac"} Nov 27 17:13:50 crc kubenswrapper[4954]: I1127 17:13:50.464184 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5sdh6" Nov 27 17:13:50 crc kubenswrapper[4954]: I1127 17:13:50.631242 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7832bff-0ac7-4654-8277-92b9d5c04aa0-inventory\") pod \"d7832bff-0ac7-4654-8277-92b9d5c04aa0\" (UID: \"d7832bff-0ac7-4654-8277-92b9d5c04aa0\") " Nov 27 17:13:50 crc kubenswrapper[4954]: I1127 17:13:50.631417 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d7832bff-0ac7-4654-8277-92b9d5c04aa0-ssh-key\") pod \"d7832bff-0ac7-4654-8277-92b9d5c04aa0\" (UID: \"d7832bff-0ac7-4654-8277-92b9d5c04aa0\") " Nov 27 17:13:50 crc kubenswrapper[4954]: I1127 17:13:50.631466 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vhll\" (UniqueName: \"kubernetes.io/projected/d7832bff-0ac7-4654-8277-92b9d5c04aa0-kube-api-access-5vhll\") pod \"d7832bff-0ac7-4654-8277-92b9d5c04aa0\" (UID: \"d7832bff-0ac7-4654-8277-92b9d5c04aa0\") " Nov 27 17:13:50 crc kubenswrapper[4954]: I1127 17:13:50.644517 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7832bff-0ac7-4654-8277-92b9d5c04aa0-kube-api-access-5vhll" (OuterVolumeSpecName: "kube-api-access-5vhll") pod "d7832bff-0ac7-4654-8277-92b9d5c04aa0" (UID: "d7832bff-0ac7-4654-8277-92b9d5c04aa0"). InnerVolumeSpecName "kube-api-access-5vhll". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:13:50 crc kubenswrapper[4954]: I1127 17:13:50.664486 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7832bff-0ac7-4654-8277-92b9d5c04aa0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d7832bff-0ac7-4654-8277-92b9d5c04aa0" (UID: "d7832bff-0ac7-4654-8277-92b9d5c04aa0"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:13:50 crc kubenswrapper[4954]: I1127 17:13:50.671780 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7832bff-0ac7-4654-8277-92b9d5c04aa0-inventory" (OuterVolumeSpecName: "inventory") pod "d7832bff-0ac7-4654-8277-92b9d5c04aa0" (UID: "d7832bff-0ac7-4654-8277-92b9d5c04aa0"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:13:50 crc kubenswrapper[4954]: I1127 17:13:50.735623 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d7832bff-0ac7-4654-8277-92b9d5c04aa0-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 17:13:50 crc kubenswrapper[4954]: I1127 17:13:50.735660 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vhll\" (UniqueName: \"kubernetes.io/projected/d7832bff-0ac7-4654-8277-92b9d5c04aa0-kube-api-access-5vhll\") on node \"crc\" DevicePath \"\"" Nov 27 17:13:50 crc kubenswrapper[4954]: I1127 17:13:50.735679 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7832bff-0ac7-4654-8277-92b9d5c04aa0-inventory\") on node \"crc\" DevicePath \"\"" Nov 27 17:13:51 crc kubenswrapper[4954]: I1127 17:13:51.032886 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5sdh6" event={"ID":"d7832bff-0ac7-4654-8277-92b9d5c04aa0","Type":"ContainerDied","Data":"01da3513961a031cdbe3b87611914a47a8905312f59689b650510e9d49a97ab0"} Nov 27 17:13:51 crc kubenswrapper[4954]: I1127 17:13:51.033271 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01da3513961a031cdbe3b87611914a47a8905312f59689b650510e9d49a97ab0" Nov 27 17:13:51 crc kubenswrapper[4954]: I1127 17:13:51.033377 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5sdh6" Nov 27 17:13:51 crc kubenswrapper[4954]: I1127 17:13:51.110333 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-g8nkp"] Nov 27 17:13:51 crc kubenswrapper[4954]: E1127 17:13:51.110701 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f86bcf10-2e30-45ae-9b77-2bdedde3572e" containerName="extract-content" Nov 27 17:13:51 crc kubenswrapper[4954]: I1127 17:13:51.110720 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f86bcf10-2e30-45ae-9b77-2bdedde3572e" containerName="extract-content" Nov 27 17:13:51 crc kubenswrapper[4954]: E1127 17:13:51.110735 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f86bcf10-2e30-45ae-9b77-2bdedde3572e" containerName="registry-server" Nov 27 17:13:51 crc kubenswrapper[4954]: I1127 17:13:51.110742 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f86bcf10-2e30-45ae-9b77-2bdedde3572e" containerName="registry-server" Nov 27 17:13:51 crc kubenswrapper[4954]: E1127 17:13:51.110753 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f86bcf10-2e30-45ae-9b77-2bdedde3572e" containerName="extract-utilities" Nov 27 17:13:51 crc kubenswrapper[4954]: I1127 17:13:51.110759 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f86bcf10-2e30-45ae-9b77-2bdedde3572e" containerName="extract-utilities" Nov 27 17:13:51 crc kubenswrapper[4954]: E1127 17:13:51.110776 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7832bff-0ac7-4654-8277-92b9d5c04aa0" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 27 17:13:51 crc kubenswrapper[4954]: I1127 17:13:51.110785 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7832bff-0ac7-4654-8277-92b9d5c04aa0" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 27 17:13:51 crc kubenswrapper[4954]: I1127 17:13:51.110983 4954 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f86bcf10-2e30-45ae-9b77-2bdedde3572e" containerName="registry-server" Nov 27 17:13:51 crc kubenswrapper[4954]: I1127 17:13:51.111007 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7832bff-0ac7-4654-8277-92b9d5c04aa0" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 27 17:13:51 crc kubenswrapper[4954]: I1127 17:13:51.111759 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-g8nkp" Nov 27 17:13:51 crc kubenswrapper[4954]: I1127 17:13:51.113972 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 17:13:51 crc kubenswrapper[4954]: I1127 17:13:51.114100 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lnfbp" Nov 27 17:13:51 crc kubenswrapper[4954]: I1127 17:13:51.115174 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 17:13:51 crc kubenswrapper[4954]: I1127 17:13:51.115978 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 17:13:51 crc kubenswrapper[4954]: I1127 17:13:51.128446 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-g8nkp"] Nov 27 17:13:51 crc kubenswrapper[4954]: I1127 17:13:51.245463 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2vfv\" (UniqueName: \"kubernetes.io/projected/5e3f28f3-6e95-438e-ba6e-587578b29bf9-kube-api-access-t2vfv\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-g8nkp\" (UID: \"5e3f28f3-6e95-438e-ba6e-587578b29bf9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-g8nkp" Nov 27 17:13:51 crc kubenswrapper[4954]: I1127 17:13:51.245533 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e3f28f3-6e95-438e-ba6e-587578b29bf9-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-g8nkp\" (UID: \"5e3f28f3-6e95-438e-ba6e-587578b29bf9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-g8nkp" Nov 27 17:13:51 crc kubenswrapper[4954]: I1127 17:13:51.245570 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e3f28f3-6e95-438e-ba6e-587578b29bf9-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-g8nkp\" (UID: \"5e3f28f3-6e95-438e-ba6e-587578b29bf9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-g8nkp" Nov 27 17:13:51 crc kubenswrapper[4954]: I1127 17:13:51.347299 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2vfv\" (UniqueName: \"kubernetes.io/projected/5e3f28f3-6e95-438e-ba6e-587578b29bf9-kube-api-access-t2vfv\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-g8nkp\" (UID: \"5e3f28f3-6e95-438e-ba6e-587578b29bf9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-g8nkp" Nov 27 17:13:51 crc kubenswrapper[4954]: I1127 17:13:51.347357 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e3f28f3-6e95-438e-ba6e-587578b29bf9-inventory\") pod 
\"configure-os-edpm-deployment-openstack-edpm-ipam-g8nkp\" (UID: \"5e3f28f3-6e95-438e-ba6e-587578b29bf9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-g8nkp" Nov 27 17:13:51 crc kubenswrapper[4954]: I1127 17:13:51.347383 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e3f28f3-6e95-438e-ba6e-587578b29bf9-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-g8nkp\" (UID: \"5e3f28f3-6e95-438e-ba6e-587578b29bf9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-g8nkp" Nov 27 17:13:51 crc kubenswrapper[4954]: I1127 17:13:51.352948 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e3f28f3-6e95-438e-ba6e-587578b29bf9-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-g8nkp\" (UID: \"5e3f28f3-6e95-438e-ba6e-587578b29bf9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-g8nkp" Nov 27 17:13:51 crc kubenswrapper[4954]: I1127 17:13:51.364117 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e3f28f3-6e95-438e-ba6e-587578b29bf9-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-g8nkp\" (UID: \"5e3f28f3-6e95-438e-ba6e-587578b29bf9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-g8nkp" Nov 27 17:13:51 crc kubenswrapper[4954]: I1127 17:13:51.365496 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2vfv\" (UniqueName: \"kubernetes.io/projected/5e3f28f3-6e95-438e-ba6e-587578b29bf9-kube-api-access-t2vfv\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-g8nkp\" (UID: \"5e3f28f3-6e95-438e-ba6e-587578b29bf9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-g8nkp" Nov 27 17:13:51 crc kubenswrapper[4954]: I1127 17:13:51.444398 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-g8nkp" Nov 27 17:13:51 crc kubenswrapper[4954]: I1127 17:13:51.945209 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-g8nkp"] Nov 27 17:13:51 crc kubenswrapper[4954]: I1127 17:13:51.954343 4954 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 27 17:13:52 crc kubenswrapper[4954]: I1127 17:13:52.045710 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-g8nkp" event={"ID":"5e3f28f3-6e95-438e-ba6e-587578b29bf9","Type":"ContainerStarted","Data":"28daa8a3760fe2886fa61c236265eda2a3fe3f60079f45caf1a4ce04c380ceda"} Nov 27 17:13:53 crc kubenswrapper[4954]: I1127 17:13:53.058673 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-g8nkp" event={"ID":"5e3f28f3-6e95-438e-ba6e-587578b29bf9","Type":"ContainerStarted","Data":"90347f8c30a6d3ac19724e9881bf6525c2b0a8c4a2f9d522c95387aa27ff2f49"} Nov 27 17:13:53 crc kubenswrapper[4954]: I1127 17:13:53.085199 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-g8nkp" podStartSLOduration=1.927067595 podStartE2EDuration="2.085172926s" podCreationTimestamp="2025-11-27 17:13:51 +0000 UTC" firstStartedPulling="2025-11-27 17:13:51.954027684 +0000 UTC m=+2143.971467984" lastFinishedPulling="2025-11-27 17:13:52.112133015 +0000 UTC m=+2144.129573315" observedRunningTime="2025-11-27 17:13:53.076882287 +0000 UTC m=+2145.094322587" watchObservedRunningTime="2025-11-27 17:13:53.085172926 +0000 UTC m=+2145.102613226" Nov 27 17:13:53 crc kubenswrapper[4954]: I1127 17:13:53.687180 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 17:13:53 crc kubenswrapper[4954]: I1127 17:13:53.687778 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 17:14:23 crc kubenswrapper[4954]: I1127 17:14:23.687193 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 17:14:23 crc kubenswrapper[4954]: I1127 17:14:23.690023 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 17:14:23 crc kubenswrapper[4954]: I1127 17:14:23.690208 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-699qq" Nov 27 17:14:23 crc kubenswrapper[4954]: I1127 17:14:23.691670 4954 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3bca7cd4e28cd5886de60bf3081238598be0a5e41895389e224c4122b00d90d8"} pod="openshift-machine-config-operator/machine-config-daemon-699qq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 27 17:14:23 crc kubenswrapper[4954]: I1127 17:14:23.691868 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" containerID="cri-o://3bca7cd4e28cd5886de60bf3081238598be0a5e41895389e224c4122b00d90d8" gracePeriod=600 Nov 27 17:14:23 crc kubenswrapper[4954]: E1127 17:14:23.931810 4954 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33a80574_7c60_4f19_985b_3ee313cb7bcd.slice/crio-conmon-3bca7cd4e28cd5886de60bf3081238598be0a5e41895389e224c4122b00d90d8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33a80574_7c60_4f19_985b_3ee313cb7bcd.slice/crio-3bca7cd4e28cd5886de60bf3081238598be0a5e41895389e224c4122b00d90d8.scope\": RecentStats: unable to find data in memory cache]" Nov 27 17:14:24 crc kubenswrapper[4954]: I1127 17:14:24.401543 4954 generic.go:334] "Generic (PLEG): container finished" podID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerID="3bca7cd4e28cd5886de60bf3081238598be0a5e41895389e224c4122b00d90d8" exitCode=0 Nov 27 17:14:24 crc kubenswrapper[4954]: I1127 17:14:24.401615 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" event={"ID":"33a80574-7c60-4f19-985b-3ee313cb7bcd","Type":"ContainerDied","Data":"3bca7cd4e28cd5886de60bf3081238598be0a5e41895389e224c4122b00d90d8"} Nov 27 17:14:24 crc kubenswrapper[4954]: I1127 17:14:24.402104 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" event={"ID":"33a80574-7c60-4f19-985b-3ee313cb7bcd","Type":"ContainerStarted","Data":"b91da12f8fcb1407df50bd3be19fd43d848b6fc636f3c1096f65accf412d6bd6"} Nov 27 17:14:24 crc kubenswrapper[4954]: I1127 17:14:24.402135 4954 scope.go:117] "RemoveContainer" containerID="c634fc970f090ade11e9bb4461f26ec0209fb2640ae3e49bf1ab5c91d77dcc8f" Nov 27 17:14:46 crc kubenswrapper[4954]: I1127 17:14:46.656382 4954 generic.go:334] "Generic (PLEG): container finished" podID="5e3f28f3-6e95-438e-ba6e-587578b29bf9" containerID="90347f8c30a6d3ac19724e9881bf6525c2b0a8c4a2f9d522c95387aa27ff2f49" exitCode=0 Nov 27 17:14:46 crc kubenswrapper[4954]: I1127 17:14:46.656467 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-g8nkp" event={"ID":"5e3f28f3-6e95-438e-ba6e-587578b29bf9","Type":"ContainerDied","Data":"90347f8c30a6d3ac19724e9881bf6525c2b0a8c4a2f9d522c95387aa27ff2f49"} Nov 27 17:14:48 crc kubenswrapper[4954]: I1127 17:14:48.112271 4954 util.go:48] "No ready sandbox for pod can be found. 
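[Editor's note] The prober lines pin down the shape of the machine-config-daemon liveness probe, an HTTP GET against 127.0.0.1:8798/health, and the surrounding entries show the full consequence chain: failure at 17:14:23 -> "failed liveness probe, will be restarted" -> kill with gracePeriod=600 (suggesting the pod's terminationGracePeriodSeconds is 600) -> new container b91da12f -> RemoveContainer of the oldest dead instance (c634fc97), which also tells us this was not the first restart. An illustrative reconstruction of the probe as a plain mapping; only the probe type, host, path, and port are actually visible in the log, and the other two fields are inferences from timing, not logged values:

```python
# Reconstructed from the prober.go:107 entries above; NOT read from the PodSpec.
liveness_probe = {
    "httpGet": {"host": "127.0.0.1", "path": "/health", "port": 8798},
    "periodSeconds": 30,    # assumed: the logged failures are exactly 30 s apart
    "failureThreshold": 3,  # assumed: restart fires on the third visible failure
}
```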
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-g8nkp" Nov 27 17:14:48 crc kubenswrapper[4954]: I1127 17:14:48.193083 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e3f28f3-6e95-438e-ba6e-587578b29bf9-ssh-key\") pod \"5e3f28f3-6e95-438e-ba6e-587578b29bf9\" (UID: \"5e3f28f3-6e95-438e-ba6e-587578b29bf9\") " Nov 27 17:14:48 crc kubenswrapper[4954]: I1127 17:14:48.193277 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e3f28f3-6e95-438e-ba6e-587578b29bf9-inventory\") pod \"5e3f28f3-6e95-438e-ba6e-587578b29bf9\" (UID: \"5e3f28f3-6e95-438e-ba6e-587578b29bf9\") " Nov 27 17:14:48 crc kubenswrapper[4954]: I1127 17:14:48.193442 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2vfv\" (UniqueName: \"kubernetes.io/projected/5e3f28f3-6e95-438e-ba6e-587578b29bf9-kube-api-access-t2vfv\") pod \"5e3f28f3-6e95-438e-ba6e-587578b29bf9\" (UID: \"5e3f28f3-6e95-438e-ba6e-587578b29bf9\") " Nov 27 17:14:48 crc kubenswrapper[4954]: I1127 17:14:48.201134 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e3f28f3-6e95-438e-ba6e-587578b29bf9-kube-api-access-t2vfv" (OuterVolumeSpecName: "kube-api-access-t2vfv") pod "5e3f28f3-6e95-438e-ba6e-587578b29bf9" (UID: "5e3f28f3-6e95-438e-ba6e-587578b29bf9"). InnerVolumeSpecName "kube-api-access-t2vfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:14:48 crc kubenswrapper[4954]: I1127 17:14:48.224123 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e3f28f3-6e95-438e-ba6e-587578b29bf9-inventory" (OuterVolumeSpecName: "inventory") pod "5e3f28f3-6e95-438e-ba6e-587578b29bf9" (UID: "5e3f28f3-6e95-438e-ba6e-587578b29bf9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:14:48 crc kubenswrapper[4954]: I1127 17:14:48.225353 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e3f28f3-6e95-438e-ba6e-587578b29bf9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5e3f28f3-6e95-438e-ba6e-587578b29bf9" (UID: "5e3f28f3-6e95-438e-ba6e-587578b29bf9"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:14:48 crc kubenswrapper[4954]: I1127 17:14:48.296308 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e3f28f3-6e95-438e-ba6e-587578b29bf9-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 17:14:48 crc kubenswrapper[4954]: I1127 17:14:48.296895 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e3f28f3-6e95-438e-ba6e-587578b29bf9-inventory\") on node \"crc\" DevicePath \"\"" Nov 27 17:14:48 crc kubenswrapper[4954]: I1127 17:14:48.296959 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2vfv\" (UniqueName: \"kubernetes.io/projected/5e3f28f3-6e95-438e-ba6e-587578b29bf9-kube-api-access-t2vfv\") on node \"crc\" DevicePath \"\"" Nov 27 17:14:48 crc kubenswrapper[4954]: I1127 17:14:48.683694 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-g8nkp" event={"ID":"5e3f28f3-6e95-438e-ba6e-587578b29bf9","Type":"ContainerDied","Data":"28daa8a3760fe2886fa61c236265eda2a3fe3f60079f45caf1a4ce04c380ceda"} Nov 27 17:14:48 crc kubenswrapper[4954]: I1127 17:14:48.683768 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28daa8a3760fe2886fa61c236265eda2a3fe3f60079f45caf1a4ce04c380ceda" Nov 27 17:14:48 crc kubenswrapper[4954]: I1127 17:14:48.683844 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-g8nkp" Nov 27 17:14:48 crc kubenswrapper[4954]: I1127 17:14:48.802253 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-xgjv5"] Nov 27 17:14:48 crc kubenswrapper[4954]: E1127 17:14:48.802753 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e3f28f3-6e95-438e-ba6e-587578b29bf9" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 27 17:14:48 crc kubenswrapper[4954]: I1127 17:14:48.802783 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e3f28f3-6e95-438e-ba6e-587578b29bf9" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 27 17:14:48 crc kubenswrapper[4954]: I1127 17:14:48.803041 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e3f28f3-6e95-438e-ba6e-587578b29bf9" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 27 17:14:48 crc kubenswrapper[4954]: I1127 17:14:48.803696 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-xgjv5" Nov 27 17:14:48 crc kubenswrapper[4954]: I1127 17:14:48.806258 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lnfbp" Nov 27 17:14:48 crc kubenswrapper[4954]: I1127 17:14:48.806478 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 17:14:48 crc kubenswrapper[4954]: I1127 17:14:48.806508 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 17:14:48 crc kubenswrapper[4954]: I1127 17:14:48.810852 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 17:14:48 crc kubenswrapper[4954]: I1127 17:14:48.814284 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-xgjv5"] Nov 27 17:14:48 crc kubenswrapper[4954]: I1127 17:14:48.911897 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8rrz\" (UniqueName: \"kubernetes.io/projected/694335d5-113f-4c2b-ab58-22fc7b866e46-kube-api-access-x8rrz\") pod \"ssh-known-hosts-edpm-deployment-xgjv5\" (UID: \"694335d5-113f-4c2b-ab58-22fc7b866e46\") " pod="openstack/ssh-known-hosts-edpm-deployment-xgjv5" Nov 27 17:14:48 crc kubenswrapper[4954]: I1127 17:14:48.912308 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/694335d5-113f-4c2b-ab58-22fc7b866e46-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-xgjv5\" (UID: \"694335d5-113f-4c2b-ab58-22fc7b866e46\") " pod="openstack/ssh-known-hosts-edpm-deployment-xgjv5" Nov 27 17:14:48 crc kubenswrapper[4954]: I1127 17:14:48.912567 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/694335d5-113f-4c2b-ab58-22fc7b866e46-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-xgjv5\" (UID: \"694335d5-113f-4c2b-ab58-22fc7b866e46\") " pod="openstack/ssh-known-hosts-edpm-deployment-xgjv5" Nov 27 17:14:49 crc kubenswrapper[4954]: I1127 17:14:49.013883 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8rrz\" (UniqueName: \"kubernetes.io/projected/694335d5-113f-4c2b-ab58-22fc7b866e46-kube-api-access-x8rrz\") pod \"ssh-known-hosts-edpm-deployment-xgjv5\" (UID: \"694335d5-113f-4c2b-ab58-22fc7b866e46\") " pod="openstack/ssh-known-hosts-edpm-deployment-xgjv5" Nov 27 17:14:49 crc kubenswrapper[4954]: I1127 17:14:49.013988 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/694335d5-113f-4c2b-ab58-22fc7b866e46-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-xgjv5\" (UID: \"694335d5-113f-4c2b-ab58-22fc7b866e46\") " pod="openstack/ssh-known-hosts-edpm-deployment-xgjv5" Nov 27 17:14:49 crc kubenswrapper[4954]: I1127 17:14:49.014045 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/694335d5-113f-4c2b-ab58-22fc7b866e46-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-xgjv5\" (UID: \"694335d5-113f-4c2b-ab58-22fc7b866e46\") " pod="openstack/ssh-known-hosts-edpm-deployment-xgjv5" Nov 27 17:14:49 crc 
kubenswrapper[4954]: I1127 17:14:49.018809 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/694335d5-113f-4c2b-ab58-22fc7b866e46-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-xgjv5\" (UID: \"694335d5-113f-4c2b-ab58-22fc7b866e46\") " pod="openstack/ssh-known-hosts-edpm-deployment-xgjv5" Nov 27 17:14:49 crc kubenswrapper[4954]: I1127 17:14:49.023177 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/694335d5-113f-4c2b-ab58-22fc7b866e46-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-xgjv5\" (UID: \"694335d5-113f-4c2b-ab58-22fc7b866e46\") " pod="openstack/ssh-known-hosts-edpm-deployment-xgjv5" Nov 27 17:14:49 crc kubenswrapper[4954]: I1127 17:14:49.039322 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8rrz\" (UniqueName: \"kubernetes.io/projected/694335d5-113f-4c2b-ab58-22fc7b866e46-kube-api-access-x8rrz\") pod \"ssh-known-hosts-edpm-deployment-xgjv5\" (UID: \"694335d5-113f-4c2b-ab58-22fc7b866e46\") " pod="openstack/ssh-known-hosts-edpm-deployment-xgjv5" Nov 27 17:14:49 crc kubenswrapper[4954]: I1127 17:14:49.123396 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-xgjv5" Nov 27 17:14:49 crc kubenswrapper[4954]: I1127 17:14:49.697663 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-xgjv5"] Nov 27 17:14:50 crc kubenswrapper[4954]: I1127 17:14:50.716572 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-xgjv5" event={"ID":"694335d5-113f-4c2b-ab58-22fc7b866e46","Type":"ContainerStarted","Data":"1616d9f218b7f8780eacf5438df7874b189f54c02908758786ae08c1ee7c7b82"} Nov 27 17:14:50 crc kubenswrapper[4954]: I1127 17:14:50.717749 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-xgjv5" event={"ID":"694335d5-113f-4c2b-ab58-22fc7b866e46","Type":"ContainerStarted","Data":"8fe146622ce21c41069810e4acbede3726834b3c365ed2244bb6815220a535cd"} Nov 27 17:14:57 crc kubenswrapper[4954]: I1127 17:14:57.815396 4954 generic.go:334] "Generic (PLEG): container finished" podID="694335d5-113f-4c2b-ab58-22fc7b866e46" containerID="1616d9f218b7f8780eacf5438df7874b189f54c02908758786ae08c1ee7c7b82" exitCode=0 Nov 27 17:14:57 crc kubenswrapper[4954]: I1127 17:14:57.815494 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-xgjv5" event={"ID":"694335d5-113f-4c2b-ab58-22fc7b866e46","Type":"ContainerDied","Data":"1616d9f218b7f8780eacf5438df7874b189f54c02908758786ae08c1ee7c7b82"} Nov 27 17:14:59 crc kubenswrapper[4954]: I1127 17:14:59.296389 4954 util.go:48] "No ready sandbox for pod can be found. 
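[Editor's note] By this point the journal has shown the OpenStack dataplane jobs executing strictly one after another: validate-network, install-os, configure-os, ssh-known-hosts, and (below) run-os, each "SyncLoop ADD" arriving only after the previous job's sandbox was torn down. A sketch that recovers that ordering from the ADD entries alone, under the same saved-journal assumption as the earlier snippets:

```python
import re

# kubelet.go:2421 entries, e.g.
#   "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-..."]
ADD_RE = re.compile(r'"SyncLoop ADD" source="api" pods=\["(?P<pod>[^"]+)"\]')

def job_sequence(log_path: str, namespace: str = "openstack"):
    """Recover the order in which pods were scheduled onto this node."""
    order = []
    with open(log_path) as f:
        for line in f:
            m = ADD_RE.search(line)
            if m and m.group("pod").startswith(namespace + "/"):
                order.append(m.group("pod"))
    return order

# Against this journal the openstack entries come back in dependency order:
# validate-network -> install-os -> configure-os -> ssh-known-hosts -> run-os.
```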
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-xgjv5" Nov 27 17:14:59 crc kubenswrapper[4954]: I1127 17:14:59.359365 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8rrz\" (UniqueName: \"kubernetes.io/projected/694335d5-113f-4c2b-ab58-22fc7b866e46-kube-api-access-x8rrz\") pod \"694335d5-113f-4c2b-ab58-22fc7b866e46\" (UID: \"694335d5-113f-4c2b-ab58-22fc7b866e46\") " Nov 27 17:14:59 crc kubenswrapper[4954]: I1127 17:14:59.360626 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/694335d5-113f-4c2b-ab58-22fc7b866e46-inventory-0\") pod \"694335d5-113f-4c2b-ab58-22fc7b866e46\" (UID: \"694335d5-113f-4c2b-ab58-22fc7b866e46\") " Nov 27 17:14:59 crc kubenswrapper[4954]: I1127 17:14:59.360779 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/694335d5-113f-4c2b-ab58-22fc7b866e46-ssh-key-openstack-edpm-ipam\") pod \"694335d5-113f-4c2b-ab58-22fc7b866e46\" (UID: \"694335d5-113f-4c2b-ab58-22fc7b866e46\") " Nov 27 17:14:59 crc kubenswrapper[4954]: I1127 17:14:59.367772 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/694335d5-113f-4c2b-ab58-22fc7b866e46-kube-api-access-x8rrz" (OuterVolumeSpecName: "kube-api-access-x8rrz") pod "694335d5-113f-4c2b-ab58-22fc7b866e46" (UID: "694335d5-113f-4c2b-ab58-22fc7b866e46"). InnerVolumeSpecName "kube-api-access-x8rrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:14:59 crc kubenswrapper[4954]: I1127 17:14:59.393524 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/694335d5-113f-4c2b-ab58-22fc7b866e46-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "694335d5-113f-4c2b-ab58-22fc7b866e46" (UID: "694335d5-113f-4c2b-ab58-22fc7b866e46"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:14:59 crc kubenswrapper[4954]: I1127 17:14:59.396957 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/694335d5-113f-4c2b-ab58-22fc7b866e46-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "694335d5-113f-4c2b-ab58-22fc7b866e46" (UID: "694335d5-113f-4c2b-ab58-22fc7b866e46"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:14:59 crc kubenswrapper[4954]: I1127 17:14:59.463955 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8rrz\" (UniqueName: \"kubernetes.io/projected/694335d5-113f-4c2b-ab58-22fc7b866e46-kube-api-access-x8rrz\") on node \"crc\" DevicePath \"\"" Nov 27 17:14:59 crc kubenswrapper[4954]: I1127 17:14:59.464018 4954 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/694335d5-113f-4c2b-ab58-22fc7b866e46-inventory-0\") on node \"crc\" DevicePath \"\"" Nov 27 17:14:59 crc kubenswrapper[4954]: I1127 17:14:59.464042 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/694335d5-113f-4c2b-ab58-22fc7b866e46-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Nov 27 17:14:59 crc kubenswrapper[4954]: I1127 17:14:59.840309 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-xgjv5" event={"ID":"694335d5-113f-4c2b-ab58-22fc7b866e46","Type":"ContainerDied","Data":"8fe146622ce21c41069810e4acbede3726834b3c365ed2244bb6815220a535cd"} Nov 27 17:14:59 crc kubenswrapper[4954]: I1127 17:14:59.840391 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8fe146622ce21c41069810e4acbede3726834b3c365ed2244bb6815220a535cd" Nov 27 17:14:59 crc kubenswrapper[4954]: I1127 17:14:59.840428 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-xgjv5" Nov 27 17:14:59 crc kubenswrapper[4954]: I1127 17:14:59.951959 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-6l884"] Nov 27 17:14:59 crc kubenswrapper[4954]: E1127 17:14:59.952494 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="694335d5-113f-4c2b-ab58-22fc7b866e46" containerName="ssh-known-hosts-edpm-deployment" Nov 27 17:14:59 crc kubenswrapper[4954]: I1127 17:14:59.952522 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="694335d5-113f-4c2b-ab58-22fc7b866e46" containerName="ssh-known-hosts-edpm-deployment" Nov 27 17:14:59 crc kubenswrapper[4954]: I1127 17:14:59.952796 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="694335d5-113f-4c2b-ab58-22fc7b866e46" containerName="ssh-known-hosts-edpm-deployment" Nov 27 17:14:59 crc kubenswrapper[4954]: I1127 17:14:59.953621 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6l884" Nov 27 17:14:59 crc kubenswrapper[4954]: I1127 17:14:59.956271 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 17:14:59 crc kubenswrapper[4954]: I1127 17:14:59.956406 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 17:14:59 crc kubenswrapper[4954]: I1127 17:14:59.956645 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 17:14:59 crc kubenswrapper[4954]: I1127 17:14:59.956964 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lnfbp" Nov 27 17:14:59 crc kubenswrapper[4954]: I1127 17:14:59.986753 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-6l884"] Nov 27 17:15:00 crc kubenswrapper[4954]: I1127 17:15:00.080769 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cfb3cf23-1ad0-47ac-af59-8b8ae7e79678-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6l884\" (UID: \"cfb3cf23-1ad0-47ac-af59-8b8ae7e79678\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6l884" Nov 27 17:15:00 crc kubenswrapper[4954]: I1127 17:15:00.080911 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cfb3cf23-1ad0-47ac-af59-8b8ae7e79678-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6l884\" (UID: \"cfb3cf23-1ad0-47ac-af59-8b8ae7e79678\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6l884" Nov 27 17:15:00 crc kubenswrapper[4954]: I1127 17:15:00.081202 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrmr4\" (UniqueName: \"kubernetes.io/projected/cfb3cf23-1ad0-47ac-af59-8b8ae7e79678-kube-api-access-lrmr4\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6l884\" (UID: \"cfb3cf23-1ad0-47ac-af59-8b8ae7e79678\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6l884" Nov 27 17:15:00 crc kubenswrapper[4954]: I1127 17:15:00.142759 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404395-cm5sh"] Nov 27 17:15:00 crc kubenswrapper[4954]: I1127 17:15:00.144572 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404395-cm5sh" Nov 27 17:15:00 crc kubenswrapper[4954]: I1127 17:15:00.149444 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 27 17:15:00 crc kubenswrapper[4954]: I1127 17:15:00.151089 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 27 17:15:00 crc kubenswrapper[4954]: I1127 17:15:00.160597 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404395-cm5sh"] Nov 27 17:15:00 crc kubenswrapper[4954]: I1127 17:15:00.183620 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrmr4\" (UniqueName: \"kubernetes.io/projected/cfb3cf23-1ad0-47ac-af59-8b8ae7e79678-kube-api-access-lrmr4\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6l884\" (UID: \"cfb3cf23-1ad0-47ac-af59-8b8ae7e79678\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6l884" Nov 27 17:15:00 crc kubenswrapper[4954]: I1127 17:15:00.183723 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cfb3cf23-1ad0-47ac-af59-8b8ae7e79678-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6l884\" (UID: \"cfb3cf23-1ad0-47ac-af59-8b8ae7e79678\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6l884" Nov 27 17:15:00 crc kubenswrapper[4954]: I1127 17:15:00.183775 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cfb3cf23-1ad0-47ac-af59-8b8ae7e79678-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6l884\" (UID: \"cfb3cf23-1ad0-47ac-af59-8b8ae7e79678\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6l884" Nov 27 17:15:00 crc kubenswrapper[4954]: I1127 17:15:00.187527 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cfb3cf23-1ad0-47ac-af59-8b8ae7e79678-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6l884\" (UID: \"cfb3cf23-1ad0-47ac-af59-8b8ae7e79678\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6l884" Nov 27 17:15:00 crc kubenswrapper[4954]: I1127 17:15:00.195840 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cfb3cf23-1ad0-47ac-af59-8b8ae7e79678-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6l884\" (UID: \"cfb3cf23-1ad0-47ac-af59-8b8ae7e79678\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6l884" Nov 27 17:15:00 crc kubenswrapper[4954]: I1127 17:15:00.207741 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrmr4\" (UniqueName: \"kubernetes.io/projected/cfb3cf23-1ad0-47ac-af59-8b8ae7e79678-kube-api-access-lrmr4\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6l884\" (UID: \"cfb3cf23-1ad0-47ac-af59-8b8ae7e79678\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6l884" Nov 27 17:15:00 crc kubenswrapper[4954]: I1127 17:15:00.284218 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6l884" Nov 27 17:15:00 crc kubenswrapper[4954]: I1127 17:15:00.288087 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f9f6edf8-5c9b-4630-903f-4fc8f11ebdad-config-volume\") pod \"collect-profiles-29404395-cm5sh\" (UID: \"f9f6edf8-5c9b-4630-903f-4fc8f11ebdad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404395-cm5sh" Nov 27 17:15:00 crc kubenswrapper[4954]: I1127 17:15:00.288313 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcf45\" (UniqueName: \"kubernetes.io/projected/f9f6edf8-5c9b-4630-903f-4fc8f11ebdad-kube-api-access-lcf45\") pod \"collect-profiles-29404395-cm5sh\" (UID: \"f9f6edf8-5c9b-4630-903f-4fc8f11ebdad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404395-cm5sh" Nov 27 17:15:00 crc kubenswrapper[4954]: I1127 17:15:00.288521 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f9f6edf8-5c9b-4630-903f-4fc8f11ebdad-secret-volume\") pod \"collect-profiles-29404395-cm5sh\" (UID: \"f9f6edf8-5c9b-4630-903f-4fc8f11ebdad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404395-cm5sh" Nov 27 17:15:00 crc kubenswrapper[4954]: I1127 17:15:00.390063 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f9f6edf8-5c9b-4630-903f-4fc8f11ebdad-secret-volume\") pod \"collect-profiles-29404395-cm5sh\" (UID: \"f9f6edf8-5c9b-4630-903f-4fc8f11ebdad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404395-cm5sh" Nov 27 17:15:00 crc kubenswrapper[4954]: I1127 17:15:00.390608 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f9f6edf8-5c9b-4630-903f-4fc8f11ebdad-config-volume\") pod \"collect-profiles-29404395-cm5sh\" (UID: \"f9f6edf8-5c9b-4630-903f-4fc8f11ebdad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404395-cm5sh" Nov 27 17:15:00 crc kubenswrapper[4954]: I1127 17:15:00.390678 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcf45\" (UniqueName: \"kubernetes.io/projected/f9f6edf8-5c9b-4630-903f-4fc8f11ebdad-kube-api-access-lcf45\") pod \"collect-profiles-29404395-cm5sh\" (UID: \"f9f6edf8-5c9b-4630-903f-4fc8f11ebdad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404395-cm5sh" Nov 27 17:15:00 crc kubenswrapper[4954]: I1127 17:15:00.391770 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f9f6edf8-5c9b-4630-903f-4fc8f11ebdad-config-volume\") pod \"collect-profiles-29404395-cm5sh\" (UID: \"f9f6edf8-5c9b-4630-903f-4fc8f11ebdad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404395-cm5sh" Nov 27 17:15:00 crc kubenswrapper[4954]: I1127 17:15:00.394803 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f9f6edf8-5c9b-4630-903f-4fc8f11ebdad-secret-volume\") pod \"collect-profiles-29404395-cm5sh\" (UID: \"f9f6edf8-5c9b-4630-903f-4fc8f11ebdad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404395-cm5sh" Nov 27 17:15:00 crc kubenswrapper[4954]: 
I1127 17:15:00.411189 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcf45\" (UniqueName: \"kubernetes.io/projected/f9f6edf8-5c9b-4630-903f-4fc8f11ebdad-kube-api-access-lcf45\") pod \"collect-profiles-29404395-cm5sh\" (UID: \"f9f6edf8-5c9b-4630-903f-4fc8f11ebdad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404395-cm5sh" Nov 27 17:15:00 crc kubenswrapper[4954]: I1127 17:15:00.480382 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404395-cm5sh" Nov 27 17:15:00 crc kubenswrapper[4954]: I1127 17:15:00.790797 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404395-cm5sh"] Nov 27 17:15:00 crc kubenswrapper[4954]: I1127 17:15:00.855903 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404395-cm5sh" event={"ID":"f9f6edf8-5c9b-4630-903f-4fc8f11ebdad","Type":"ContainerStarted","Data":"b4fd054ab55a772989a376bc0a6d93bc7b766e940ee37adbd508a3353a1ef615"} Nov 27 17:15:00 crc kubenswrapper[4954]: I1127 17:15:00.872661 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-6l884"] Nov 27 17:15:01 crc kubenswrapper[4954]: I1127 17:15:01.867181 4954 generic.go:334] "Generic (PLEG): container finished" podID="f9f6edf8-5c9b-4630-903f-4fc8f11ebdad" containerID="fe25ae3d3d24f6b8f0c9d03d0bbfba368b06f534d2f9965e58684af5adaa4be6" exitCode=0 Nov 27 17:15:01 crc kubenswrapper[4954]: I1127 17:15:01.867272 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404395-cm5sh" event={"ID":"f9f6edf8-5c9b-4630-903f-4fc8f11ebdad","Type":"ContainerDied","Data":"fe25ae3d3d24f6b8f0c9d03d0bbfba368b06f534d2f9965e58684af5adaa4be6"} Nov 27 17:15:01 crc kubenswrapper[4954]: I1127 17:15:01.869830 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6l884" event={"ID":"cfb3cf23-1ad0-47ac-af59-8b8ae7e79678","Type":"ContainerStarted","Data":"22ee0e51b7698b72217427d189767027780424546c006b879c1d9b16dee76236"} Nov 27 17:15:01 crc kubenswrapper[4954]: I1127 17:15:01.869883 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6l884" event={"ID":"cfb3cf23-1ad0-47ac-af59-8b8ae7e79678","Type":"ContainerStarted","Data":"d1eb4d949dfd1f5e3362b638417e1c53222a2099855debb6f9d12f17d1d10649"} Nov 27 17:15:01 crc kubenswrapper[4954]: I1127 17:15:01.913151 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6l884" podStartSLOduration=2.721358544 podStartE2EDuration="2.913131164s" podCreationTimestamp="2025-11-27 17:14:59 +0000 UTC" firstStartedPulling="2025-11-27 17:15:00.908710589 +0000 UTC m=+2212.926150899" lastFinishedPulling="2025-11-27 17:15:01.100483219 +0000 UTC m=+2213.117923519" observedRunningTime="2025-11-27 17:15:01.90798667 +0000 UTC m=+2213.925426970" watchObservedRunningTime="2025-11-27 17:15:01.913131164 +0000 UTC m=+2213.930571464" Nov 27 17:15:03 crc kubenswrapper[4954]: I1127 17:15:03.227241 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404395-cm5sh" Nov 27 17:15:03 crc kubenswrapper[4954]: I1127 17:15:03.364745 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcf45\" (UniqueName: \"kubernetes.io/projected/f9f6edf8-5c9b-4630-903f-4fc8f11ebdad-kube-api-access-lcf45\") pod \"f9f6edf8-5c9b-4630-903f-4fc8f11ebdad\" (UID: \"f9f6edf8-5c9b-4630-903f-4fc8f11ebdad\") " Nov 27 17:15:03 crc kubenswrapper[4954]: I1127 17:15:03.364816 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f9f6edf8-5c9b-4630-903f-4fc8f11ebdad-secret-volume\") pod \"f9f6edf8-5c9b-4630-903f-4fc8f11ebdad\" (UID: \"f9f6edf8-5c9b-4630-903f-4fc8f11ebdad\") " Nov 27 17:15:03 crc kubenswrapper[4954]: I1127 17:15:03.365087 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f9f6edf8-5c9b-4630-903f-4fc8f11ebdad-config-volume\") pod \"f9f6edf8-5c9b-4630-903f-4fc8f11ebdad\" (UID: \"f9f6edf8-5c9b-4630-903f-4fc8f11ebdad\") " Nov 27 17:15:03 crc kubenswrapper[4954]: I1127 17:15:03.366130 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9f6edf8-5c9b-4630-903f-4fc8f11ebdad-config-volume" (OuterVolumeSpecName: "config-volume") pod "f9f6edf8-5c9b-4630-903f-4fc8f11ebdad" (UID: "f9f6edf8-5c9b-4630-903f-4fc8f11ebdad"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:15:03 crc kubenswrapper[4954]: I1127 17:15:03.372314 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9f6edf8-5c9b-4630-903f-4fc8f11ebdad-kube-api-access-lcf45" (OuterVolumeSpecName: "kube-api-access-lcf45") pod "f9f6edf8-5c9b-4630-903f-4fc8f11ebdad" (UID: "f9f6edf8-5c9b-4630-903f-4fc8f11ebdad"). InnerVolumeSpecName "kube-api-access-lcf45". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:15:03 crc kubenswrapper[4954]: I1127 17:15:03.372898 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9f6edf8-5c9b-4630-903f-4fc8f11ebdad-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f9f6edf8-5c9b-4630-903f-4fc8f11ebdad" (UID: "f9f6edf8-5c9b-4630-903f-4fc8f11ebdad"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:15:03 crc kubenswrapper[4954]: I1127 17:15:03.468836 4954 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f9f6edf8-5c9b-4630-903f-4fc8f11ebdad-config-volume\") on node \"crc\" DevicePath \"\"" Nov 27 17:15:03 crc kubenswrapper[4954]: I1127 17:15:03.468917 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcf45\" (UniqueName: \"kubernetes.io/projected/f9f6edf8-5c9b-4630-903f-4fc8f11ebdad-kube-api-access-lcf45\") on node \"crc\" DevicePath \"\"" Nov 27 17:15:03 crc kubenswrapper[4954]: I1127 17:15:03.468942 4954 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f9f6edf8-5c9b-4630-903f-4fc8f11ebdad-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 27 17:15:03 crc kubenswrapper[4954]: I1127 17:15:03.888643 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404395-cm5sh" event={"ID":"f9f6edf8-5c9b-4630-903f-4fc8f11ebdad","Type":"ContainerDied","Data":"b4fd054ab55a772989a376bc0a6d93bc7b766e940ee37adbd508a3353a1ef615"} Nov 27 17:15:03 crc kubenswrapper[4954]: I1127 17:15:03.888700 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4fd054ab55a772989a376bc0a6d93bc7b766e940ee37adbd508a3353a1ef615" Nov 27 17:15:03 crc kubenswrapper[4954]: I1127 17:15:03.888726 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404395-cm5sh" Nov 27 17:15:04 crc kubenswrapper[4954]: I1127 17:15:04.302353 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404350-52bhz"] Nov 27 17:15:04 crc kubenswrapper[4954]: I1127 17:15:04.309703 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404350-52bhz"] Nov 27 17:15:04 crc kubenswrapper[4954]: I1127 17:15:04.679901 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa" path="/var/lib/kubelet/pods/d7d46e33-6ea4-4bcf-bd5f-2e70c3fcdeaa/volumes" Nov 27 17:15:11 crc kubenswrapper[4954]: I1127 17:15:11.968248 4954 generic.go:334] "Generic (PLEG): container finished" podID="cfb3cf23-1ad0-47ac-af59-8b8ae7e79678" containerID="22ee0e51b7698b72217427d189767027780424546c006b879c1d9b16dee76236" exitCode=0 Nov 27 17:15:11 crc kubenswrapper[4954]: I1127 17:15:11.968381 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6l884" event={"ID":"cfb3cf23-1ad0-47ac-af59-8b8ae7e79678","Type":"ContainerDied","Data":"22ee0e51b7698b72217427d189767027780424546c006b879c1d9b16dee76236"} Nov 27 17:15:13 crc kubenswrapper[4954]: I1127 17:15:13.394047 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6l884" Nov 27 17:15:13 crc kubenswrapper[4954]: I1127 17:15:13.492637 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrmr4\" (UniqueName: \"kubernetes.io/projected/cfb3cf23-1ad0-47ac-af59-8b8ae7e79678-kube-api-access-lrmr4\") pod \"cfb3cf23-1ad0-47ac-af59-8b8ae7e79678\" (UID: \"cfb3cf23-1ad0-47ac-af59-8b8ae7e79678\") " Nov 27 17:15:13 crc kubenswrapper[4954]: I1127 17:15:13.492768 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cfb3cf23-1ad0-47ac-af59-8b8ae7e79678-inventory\") pod \"cfb3cf23-1ad0-47ac-af59-8b8ae7e79678\" (UID: \"cfb3cf23-1ad0-47ac-af59-8b8ae7e79678\") " Nov 27 17:15:13 crc kubenswrapper[4954]: I1127 17:15:13.492806 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cfb3cf23-1ad0-47ac-af59-8b8ae7e79678-ssh-key\") pod \"cfb3cf23-1ad0-47ac-af59-8b8ae7e79678\" (UID: \"cfb3cf23-1ad0-47ac-af59-8b8ae7e79678\") " Nov 27 17:15:13 crc kubenswrapper[4954]: I1127 17:15:13.499851 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfb3cf23-1ad0-47ac-af59-8b8ae7e79678-kube-api-access-lrmr4" (OuterVolumeSpecName: "kube-api-access-lrmr4") pod "cfb3cf23-1ad0-47ac-af59-8b8ae7e79678" (UID: "cfb3cf23-1ad0-47ac-af59-8b8ae7e79678"). InnerVolumeSpecName "kube-api-access-lrmr4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:15:13 crc kubenswrapper[4954]: I1127 17:15:13.521962 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfb3cf23-1ad0-47ac-af59-8b8ae7e79678-inventory" (OuterVolumeSpecName: "inventory") pod "cfb3cf23-1ad0-47ac-af59-8b8ae7e79678" (UID: "cfb3cf23-1ad0-47ac-af59-8b8ae7e79678"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:15:13 crc kubenswrapper[4954]: I1127 17:15:13.526785 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfb3cf23-1ad0-47ac-af59-8b8ae7e79678-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "cfb3cf23-1ad0-47ac-af59-8b8ae7e79678" (UID: "cfb3cf23-1ad0-47ac-af59-8b8ae7e79678"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:15:13 crc kubenswrapper[4954]: I1127 17:15:13.596193 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cfb3cf23-1ad0-47ac-af59-8b8ae7e79678-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 17:15:13 crc kubenswrapper[4954]: I1127 17:15:13.596247 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrmr4\" (UniqueName: \"kubernetes.io/projected/cfb3cf23-1ad0-47ac-af59-8b8ae7e79678-kube-api-access-lrmr4\") on node \"crc\" DevicePath \"\"" Nov 27 17:15:13 crc kubenswrapper[4954]: I1127 17:15:13.596262 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cfb3cf23-1ad0-47ac-af59-8b8ae7e79678-inventory\") on node \"crc\" DevicePath \"\"" Nov 27 17:15:13 crc kubenswrapper[4954]: I1127 17:15:13.991386 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6l884" Nov 27 17:15:13 crc kubenswrapper[4954]: I1127 17:15:13.991278 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6l884" event={"ID":"cfb3cf23-1ad0-47ac-af59-8b8ae7e79678","Type":"ContainerDied","Data":"d1eb4d949dfd1f5e3362b638417e1c53222a2099855debb6f9d12f17d1d10649"} Nov 27 17:15:13 crc kubenswrapper[4954]: I1127 17:15:13.993881 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1eb4d949dfd1f5e3362b638417e1c53222a2099855debb6f9d12f17d1d10649" Nov 27 17:15:14 crc kubenswrapper[4954]: I1127 17:15:14.060220 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gpwcf"] Nov 27 17:15:14 crc kubenswrapper[4954]: E1127 17:15:14.060701 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfb3cf23-1ad0-47ac-af59-8b8ae7e79678" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Nov 27 17:15:14 crc kubenswrapper[4954]: I1127 17:15:14.060732 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfb3cf23-1ad0-47ac-af59-8b8ae7e79678" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Nov 27 17:15:14 crc kubenswrapper[4954]: E1127 17:15:14.060759 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9f6edf8-5c9b-4630-903f-4fc8f11ebdad" containerName="collect-profiles" Nov 27 17:15:14 crc kubenswrapper[4954]: I1127 17:15:14.060767 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9f6edf8-5c9b-4630-903f-4fc8f11ebdad" containerName="collect-profiles" Nov 27 17:15:14 crc kubenswrapper[4954]: I1127 17:15:14.060974 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfb3cf23-1ad0-47ac-af59-8b8ae7e79678" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Nov 27 17:15:14 crc kubenswrapper[4954]: I1127 17:15:14.061006 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9f6edf8-5c9b-4630-903f-4fc8f11ebdad" containerName="collect-profiles" Nov 27 17:15:14 crc kubenswrapper[4954]: I1127 17:15:14.061848 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gpwcf" Nov 27 17:15:14 crc kubenswrapper[4954]: I1127 17:15:14.064199 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 17:15:14 crc kubenswrapper[4954]: I1127 17:15:14.064987 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 17:15:14 crc kubenswrapper[4954]: I1127 17:15:14.065143 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 17:15:14 crc kubenswrapper[4954]: I1127 17:15:14.065337 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lnfbp" Nov 27 17:15:14 crc kubenswrapper[4954]: I1127 17:15:14.077356 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gpwcf"] Nov 27 17:15:14 crc kubenswrapper[4954]: I1127 17:15:14.207546 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szdh5\" (UniqueName: \"kubernetes.io/projected/6e2def23-1765-4015-b698-c2b8516a6f18-kube-api-access-szdh5\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gpwcf\" (UID: \"6e2def23-1765-4015-b698-c2b8516a6f18\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gpwcf" Nov 27 17:15:14 crc kubenswrapper[4954]: I1127 17:15:14.207685 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e2def23-1765-4015-b698-c2b8516a6f18-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gpwcf\" (UID: \"6e2def23-1765-4015-b698-c2b8516a6f18\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gpwcf" Nov 27 17:15:14 crc kubenswrapper[4954]: I1127 17:15:14.207760 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6e2def23-1765-4015-b698-c2b8516a6f18-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gpwcf\" (UID: \"6e2def23-1765-4015-b698-c2b8516a6f18\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gpwcf" Nov 27 17:15:14 crc kubenswrapper[4954]: I1127 17:15:14.309147 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szdh5\" (UniqueName: \"kubernetes.io/projected/6e2def23-1765-4015-b698-c2b8516a6f18-kube-api-access-szdh5\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gpwcf\" (UID: \"6e2def23-1765-4015-b698-c2b8516a6f18\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gpwcf" Nov 27 17:15:14 crc kubenswrapper[4954]: I1127 17:15:14.310390 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e2def23-1765-4015-b698-c2b8516a6f18-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gpwcf\" (UID: \"6e2def23-1765-4015-b698-c2b8516a6f18\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gpwcf" Nov 27 17:15:14 crc kubenswrapper[4954]: I1127 17:15:14.310517 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6e2def23-1765-4015-b698-c2b8516a6f18-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gpwcf\" (UID: 
\"6e2def23-1765-4015-b698-c2b8516a6f18\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gpwcf" Nov 27 17:15:14 crc kubenswrapper[4954]: I1127 17:15:14.314626 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e2def23-1765-4015-b698-c2b8516a6f18-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gpwcf\" (UID: \"6e2def23-1765-4015-b698-c2b8516a6f18\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gpwcf" Nov 27 17:15:14 crc kubenswrapper[4954]: I1127 17:15:14.315032 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6e2def23-1765-4015-b698-c2b8516a6f18-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gpwcf\" (UID: \"6e2def23-1765-4015-b698-c2b8516a6f18\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gpwcf" Nov 27 17:15:14 crc kubenswrapper[4954]: I1127 17:15:14.330564 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szdh5\" (UniqueName: \"kubernetes.io/projected/6e2def23-1765-4015-b698-c2b8516a6f18-kube-api-access-szdh5\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gpwcf\" (UID: \"6e2def23-1765-4015-b698-c2b8516a6f18\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gpwcf" Nov 27 17:15:14 crc kubenswrapper[4954]: I1127 17:15:14.383309 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gpwcf" Nov 27 17:15:15 crc kubenswrapper[4954]: I1127 17:15:15.118115 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gpwcf"] Nov 27 17:15:16 crc kubenswrapper[4954]: I1127 17:15:16.034260 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gpwcf" event={"ID":"6e2def23-1765-4015-b698-c2b8516a6f18","Type":"ContainerStarted","Data":"19ec569ac763a3f891e38b0377e6c3df0882739442d827916ced3c92f9b91975"} Nov 27 17:15:16 crc kubenswrapper[4954]: I1127 17:15:16.034924 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gpwcf" event={"ID":"6e2def23-1765-4015-b698-c2b8516a6f18","Type":"ContainerStarted","Data":"6655e779b97c5dfea65181d7a537130df2ddfd36dc65bff89a8383c1e64a03ce"} Nov 27 17:15:16 crc kubenswrapper[4954]: I1127 17:15:16.062695 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gpwcf" podStartSLOduration=1.8865169480000001 podStartE2EDuration="2.062672543s" podCreationTimestamp="2025-11-27 17:15:14 +0000 UTC" firstStartedPulling="2025-11-27 17:15:15.102357177 +0000 UTC m=+2227.119797477" lastFinishedPulling="2025-11-27 17:15:15.278512772 +0000 UTC m=+2227.295953072" observedRunningTime="2025-11-27 17:15:16.050138022 +0000 UTC m=+2228.067578322" watchObservedRunningTime="2025-11-27 17:15:16.062672543 +0000 UTC m=+2228.080112843" Nov 27 17:15:26 crc kubenswrapper[4954]: I1127 17:15:26.156748 4954 generic.go:334] "Generic (PLEG): container finished" podID="6e2def23-1765-4015-b698-c2b8516a6f18" containerID="19ec569ac763a3f891e38b0377e6c3df0882739442d827916ced3c92f9b91975" exitCode=0 Nov 27 17:15:26 crc kubenswrapper[4954]: I1127 17:15:26.156863 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gpwcf" 
event={"ID":"6e2def23-1765-4015-b698-c2b8516a6f18","Type":"ContainerDied","Data":"19ec569ac763a3f891e38b0377e6c3df0882739442d827916ced3c92f9b91975"} Nov 27 17:15:27 crc kubenswrapper[4954]: I1127 17:15:27.601538 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gpwcf" Nov 27 17:15:27 crc kubenswrapper[4954]: I1127 17:15:27.728218 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6e2def23-1765-4015-b698-c2b8516a6f18-ssh-key\") pod \"6e2def23-1765-4015-b698-c2b8516a6f18\" (UID: \"6e2def23-1765-4015-b698-c2b8516a6f18\") " Nov 27 17:15:27 crc kubenswrapper[4954]: I1127 17:15:27.728289 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e2def23-1765-4015-b698-c2b8516a6f18-inventory\") pod \"6e2def23-1765-4015-b698-c2b8516a6f18\" (UID: \"6e2def23-1765-4015-b698-c2b8516a6f18\") " Nov 27 17:15:27 crc kubenswrapper[4954]: I1127 17:15:27.728531 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szdh5\" (UniqueName: \"kubernetes.io/projected/6e2def23-1765-4015-b698-c2b8516a6f18-kube-api-access-szdh5\") pod \"6e2def23-1765-4015-b698-c2b8516a6f18\" (UID: \"6e2def23-1765-4015-b698-c2b8516a6f18\") " Nov 27 17:15:27 crc kubenswrapper[4954]: I1127 17:15:27.734853 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e2def23-1765-4015-b698-c2b8516a6f18-kube-api-access-szdh5" (OuterVolumeSpecName: "kube-api-access-szdh5") pod "6e2def23-1765-4015-b698-c2b8516a6f18" (UID: "6e2def23-1765-4015-b698-c2b8516a6f18"). InnerVolumeSpecName "kube-api-access-szdh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:15:27 crc kubenswrapper[4954]: I1127 17:15:27.764064 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e2def23-1765-4015-b698-c2b8516a6f18-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6e2def23-1765-4015-b698-c2b8516a6f18" (UID: "6e2def23-1765-4015-b698-c2b8516a6f18"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:15:27 crc kubenswrapper[4954]: I1127 17:15:27.767804 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e2def23-1765-4015-b698-c2b8516a6f18-inventory" (OuterVolumeSpecName: "inventory") pod "6e2def23-1765-4015-b698-c2b8516a6f18" (UID: "6e2def23-1765-4015-b698-c2b8516a6f18"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:15:27 crc kubenswrapper[4954]: I1127 17:15:27.834906 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szdh5\" (UniqueName: \"kubernetes.io/projected/6e2def23-1765-4015-b698-c2b8516a6f18-kube-api-access-szdh5\") on node \"crc\" DevicePath \"\"" Nov 27 17:15:27 crc kubenswrapper[4954]: I1127 17:15:27.835243 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6e2def23-1765-4015-b698-c2b8516a6f18-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 17:15:27 crc kubenswrapper[4954]: I1127 17:15:27.835381 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e2def23-1765-4015-b698-c2b8516a6f18-inventory\") on node \"crc\" DevicePath \"\"" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.189994 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gpwcf" event={"ID":"6e2def23-1765-4015-b698-c2b8516a6f18","Type":"ContainerDied","Data":"6655e779b97c5dfea65181d7a537130df2ddfd36dc65bff89a8383c1e64a03ce"} Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.190063 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6655e779b97c5dfea65181d7a537130df2ddfd36dc65bff89a8383c1e64a03ce" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.190864 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gpwcf" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.283425 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq"] Nov 27 17:15:28 crc kubenswrapper[4954]: E1127 17:15:28.283831 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e2def23-1765-4015-b698-c2b8516a6f18" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.283856 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e2def23-1765-4015-b698-c2b8516a6f18" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.284079 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e2def23-1765-4015-b698-c2b8516a6f18" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.284763 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.289178 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.289227 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.289184 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.289975 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.289988 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lnfbp" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.290889 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.294479 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.295371 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.324829 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq"] Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.344088 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.344151 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7cxn\" (UniqueName: \"kubernetes.io/projected/dbb8e909-5f3f-4076-b549-d489f37cd8e3-kube-api-access-t7cxn\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.344276 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/dbb8e909-5f3f-4076-b549-d489f37cd8e3-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.344310 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/dbb8e909-5f3f-4076-b549-d489f37cd8e3-openstack-edpm-ipam-telemetry-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.344444 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.344544 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.344595 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.344620 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.344700 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.344769 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/dbb8e909-5f3f-4076-b549-d489f37cd8e3-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.344801 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.344845 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.344881 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.344972 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/dbb8e909-5f3f-4076-b549-d489f37cd8e3-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.446893 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/dbb8e909-5f3f-4076-b549-d489f37cd8e3-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.447077 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.447122 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7cxn\" (UniqueName: \"kubernetes.io/projected/dbb8e909-5f3f-4076-b549-d489f37cd8e3-kube-api-access-t7cxn\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.447176 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/dbb8e909-5f3f-4076-b549-d489f37cd8e3-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.447212 4954 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/dbb8e909-5f3f-4076-b549-d489f37cd8e3-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.447266 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.447305 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.447337 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.447372 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.447419 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.447471 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/dbb8e909-5f3f-4076-b549-d489f37cd8e3-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.447508 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-ssh-key\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.447556 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.447639 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.452550 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.453767 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.453993 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/dbb8e909-5f3f-4076-b549-d489f37cd8e3-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.454336 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.455909 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.456224 4954 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.456278 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/dbb8e909-5f3f-4076-b549-d489f37cd8e3-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.456985 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/dbb8e909-5f3f-4076-b549-d489f37cd8e3-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.457340 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.457492 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.457837 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.458646 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.463251 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/dbb8e909-5f3f-4076-b549-d489f37cd8e3-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.468246 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7cxn\" (UniqueName: \"kubernetes.io/projected/dbb8e909-5f3f-4076-b549-d489f37cd8e3-kube-api-access-t7cxn\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" Nov 27 17:15:28 crc kubenswrapper[4954]: I1127 17:15:28.604959 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" Nov 27 17:15:29 crc kubenswrapper[4954]: I1127 17:15:29.171997 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq"] Nov 27 17:15:29 crc kubenswrapper[4954]: I1127 17:15:29.203366 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" event={"ID":"dbb8e909-5f3f-4076-b549-d489f37cd8e3","Type":"ContainerStarted","Data":"a1275a7580dff35d3de24a922419a980a21b8c8517019c2417505c38e8960aec"} Nov 27 17:15:30 crc kubenswrapper[4954]: I1127 17:15:30.218811 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" event={"ID":"dbb8e909-5f3f-4076-b549-d489f37cd8e3","Type":"ContainerStarted","Data":"5e2a93f6dff74421eb3771b6be37fe056aa9dd66d5c1cf3594a8bcf2e82b91f2"} Nov 27 17:15:30 crc kubenswrapper[4954]: I1127 17:15:30.243018 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" podStartSLOduration=2.088162272 podStartE2EDuration="2.242996294s" podCreationTimestamp="2025-11-27 17:15:28 +0000 UTC" firstStartedPulling="2025-11-27 17:15:29.176730601 +0000 UTC m=+2241.194170901" lastFinishedPulling="2025-11-27 17:15:29.331564623 +0000 UTC m=+2241.349004923" observedRunningTime="2025-11-27 17:15:30.237216135 +0000 UTC m=+2242.254656435" watchObservedRunningTime="2025-11-27 17:15:30.242996294 +0000 UTC m=+2242.260436594" Nov 27 17:15:30 crc kubenswrapper[4954]: I1127 17:15:30.265645 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-j56sc"] Nov 27 17:15:30 crc kubenswrapper[4954]: I1127 17:15:30.267723 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-j56sc" Nov 27 17:15:30 crc kubenswrapper[4954]: I1127 17:15:30.283947 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j56sc"] Nov 27 17:15:30 crc kubenswrapper[4954]: I1127 17:15:30.401604 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d1b3c60-ddc6-4616-9d5a-f77129477364-catalog-content\") pod \"certified-operators-j56sc\" (UID: \"7d1b3c60-ddc6-4616-9d5a-f77129477364\") " pod="openshift-marketplace/certified-operators-j56sc" Nov 27 17:15:30 crc kubenswrapper[4954]: I1127 17:15:30.401875 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d966\" (UniqueName: \"kubernetes.io/projected/7d1b3c60-ddc6-4616-9d5a-f77129477364-kube-api-access-5d966\") pod \"certified-operators-j56sc\" (UID: \"7d1b3c60-ddc6-4616-9d5a-f77129477364\") " pod="openshift-marketplace/certified-operators-j56sc" Nov 27 17:15:30 crc kubenswrapper[4954]: I1127 17:15:30.402083 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d1b3c60-ddc6-4616-9d5a-f77129477364-utilities\") pod \"certified-operators-j56sc\" (UID: \"7d1b3c60-ddc6-4616-9d5a-f77129477364\") " pod="openshift-marketplace/certified-operators-j56sc" Nov 27 17:15:30 crc kubenswrapper[4954]: I1127 17:15:30.504278 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d1b3c60-ddc6-4616-9d5a-f77129477364-utilities\") pod \"certified-operators-j56sc\" (UID: \"7d1b3c60-ddc6-4616-9d5a-f77129477364\") " pod="openshift-marketplace/certified-operators-j56sc" Nov 27 17:15:30 crc kubenswrapper[4954]: I1127 17:15:30.504345 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d1b3c60-ddc6-4616-9d5a-f77129477364-catalog-content\") pod \"certified-operators-j56sc\" (UID: \"7d1b3c60-ddc6-4616-9d5a-f77129477364\") " pod="openshift-marketplace/certified-operators-j56sc" Nov 27 17:15:30 crc kubenswrapper[4954]: I1127 17:15:30.504386 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5d966\" (UniqueName: \"kubernetes.io/projected/7d1b3c60-ddc6-4616-9d5a-f77129477364-kube-api-access-5d966\") pod \"certified-operators-j56sc\" (UID: \"7d1b3c60-ddc6-4616-9d5a-f77129477364\") " pod="openshift-marketplace/certified-operators-j56sc" Nov 27 17:15:30 crc kubenswrapper[4954]: I1127 17:15:30.505203 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d1b3c60-ddc6-4616-9d5a-f77129477364-catalog-content\") pod \"certified-operators-j56sc\" (UID: \"7d1b3c60-ddc6-4616-9d5a-f77129477364\") " pod="openshift-marketplace/certified-operators-j56sc" Nov 27 17:15:30 crc kubenswrapper[4954]: I1127 17:15:30.505496 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d1b3c60-ddc6-4616-9d5a-f77129477364-utilities\") pod \"certified-operators-j56sc\" (UID: \"7d1b3c60-ddc6-4616-9d5a-f77129477364\") " pod="openshift-marketplace/certified-operators-j56sc" Nov 27 17:15:30 crc kubenswrapper[4954]: I1127 17:15:30.529008 4954 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-5d966\" (UniqueName: \"kubernetes.io/projected/7d1b3c60-ddc6-4616-9d5a-f77129477364-kube-api-access-5d966\") pod \"certified-operators-j56sc\" (UID: \"7d1b3c60-ddc6-4616-9d5a-f77129477364\") " pod="openshift-marketplace/certified-operators-j56sc" Nov 27 17:15:30 crc kubenswrapper[4954]: I1127 17:15:30.603072 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j56sc" Nov 27 17:15:31 crc kubenswrapper[4954]: I1127 17:15:31.144156 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j56sc"] Nov 27 17:15:31 crc kubenswrapper[4954]: W1127 17:15:31.149750 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d1b3c60_ddc6_4616_9d5a_f77129477364.slice/crio-97213719603b2d15bd5059e43667690d7fa8748a19a529b19f40475e92ddb258 WatchSource:0}: Error finding container 97213719603b2d15bd5059e43667690d7fa8748a19a529b19f40475e92ddb258: Status 404 returned error can't find the container with id 97213719603b2d15bd5059e43667690d7fa8748a19a529b19f40475e92ddb258 Nov 27 17:15:31 crc kubenswrapper[4954]: I1127 17:15:31.229328 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j56sc" event={"ID":"7d1b3c60-ddc6-4616-9d5a-f77129477364","Type":"ContainerStarted","Data":"97213719603b2d15bd5059e43667690d7fa8748a19a529b19f40475e92ddb258"} Nov 27 17:15:32 crc kubenswrapper[4954]: I1127 17:15:32.243076 4954 generic.go:334] "Generic (PLEG): container finished" podID="7d1b3c60-ddc6-4616-9d5a-f77129477364" containerID="d8c9d007c289dc9fe2b6c6ff0a473a1d6ff10f13c13a79fabd9ef40c886a7d48" exitCode=0 Nov 27 17:15:32 crc kubenswrapper[4954]: I1127 17:15:32.243348 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j56sc" event={"ID":"7d1b3c60-ddc6-4616-9d5a-f77129477364","Type":"ContainerDied","Data":"d8c9d007c289dc9fe2b6c6ff0a473a1d6ff10f13c13a79fabd9ef40c886a7d48"} Nov 27 17:15:33 crc kubenswrapper[4954]: I1127 17:15:33.253955 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j56sc" event={"ID":"7d1b3c60-ddc6-4616-9d5a-f77129477364","Type":"ContainerStarted","Data":"a5cb87e88f40868f941754fbaacb981155cbcea6435980125ae3e8b6545aa6aa"} Nov 27 17:15:34 crc kubenswrapper[4954]: I1127 17:15:34.269408 4954 generic.go:334] "Generic (PLEG): container finished" podID="7d1b3c60-ddc6-4616-9d5a-f77129477364" containerID="a5cb87e88f40868f941754fbaacb981155cbcea6435980125ae3e8b6545aa6aa" exitCode=0 Nov 27 17:15:34 crc kubenswrapper[4954]: I1127 17:15:34.269823 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j56sc" event={"ID":"7d1b3c60-ddc6-4616-9d5a-f77129477364","Type":"ContainerDied","Data":"a5cb87e88f40868f941754fbaacb981155cbcea6435980125ae3e8b6545aa6aa"} Nov 27 17:15:35 crc kubenswrapper[4954]: I1127 17:15:35.279637 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j56sc" event={"ID":"7d1b3c60-ddc6-4616-9d5a-f77129477364","Type":"ContainerStarted","Data":"ab14d692049e8310012ab1aef666727dc8acf7341acb5c6f69ce15423384751f"} Nov 27 17:15:35 crc kubenswrapper[4954]: I1127 17:15:35.296967 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-j56sc" 
podStartSLOduration=2.786218162 podStartE2EDuration="5.296946619s" podCreationTimestamp="2025-11-27 17:15:30 +0000 UTC" firstStartedPulling="2025-11-27 17:15:32.24704425 +0000 UTC m=+2244.264484560" lastFinishedPulling="2025-11-27 17:15:34.757772717 +0000 UTC m=+2246.775213017" observedRunningTime="2025-11-27 17:15:35.293520967 +0000 UTC m=+2247.310961267" watchObservedRunningTime="2025-11-27 17:15:35.296946619 +0000 UTC m=+2247.314386929" Nov 27 17:15:37 crc kubenswrapper[4954]: I1127 17:15:37.029171 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8z6tq"] Nov 27 17:15:37 crc kubenswrapper[4954]: I1127 17:15:37.033409 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8z6tq" Nov 27 17:15:37 crc kubenswrapper[4954]: I1127 17:15:37.042801 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8z6tq"] Nov 27 17:15:37 crc kubenswrapper[4954]: I1127 17:15:37.147714 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58c388b6-f7c3-45dd-9394-634d626bc496-catalog-content\") pod \"community-operators-8z6tq\" (UID: \"58c388b6-f7c3-45dd-9394-634d626bc496\") " pod="openshift-marketplace/community-operators-8z6tq" Nov 27 17:15:37 crc kubenswrapper[4954]: I1127 17:15:37.147992 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58c388b6-f7c3-45dd-9394-634d626bc496-utilities\") pod \"community-operators-8z6tq\" (UID: \"58c388b6-f7c3-45dd-9394-634d626bc496\") " pod="openshift-marketplace/community-operators-8z6tq" Nov 27 17:15:37 crc kubenswrapper[4954]: I1127 17:15:37.148133 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx428\" (UniqueName: \"kubernetes.io/projected/58c388b6-f7c3-45dd-9394-634d626bc496-kube-api-access-mx428\") pod \"community-operators-8z6tq\" (UID: \"58c388b6-f7c3-45dd-9394-634d626bc496\") " pod="openshift-marketplace/community-operators-8z6tq" Nov 27 17:15:37 crc kubenswrapper[4954]: I1127 17:15:37.250341 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58c388b6-f7c3-45dd-9394-634d626bc496-utilities\") pod \"community-operators-8z6tq\" (UID: \"58c388b6-f7c3-45dd-9394-634d626bc496\") " pod="openshift-marketplace/community-operators-8z6tq" Nov 27 17:15:37 crc kubenswrapper[4954]: I1127 17:15:37.250457 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mx428\" (UniqueName: \"kubernetes.io/projected/58c388b6-f7c3-45dd-9394-634d626bc496-kube-api-access-mx428\") pod \"community-operators-8z6tq\" (UID: \"58c388b6-f7c3-45dd-9394-634d626bc496\") " pod="openshift-marketplace/community-operators-8z6tq" Nov 27 17:15:37 crc kubenswrapper[4954]: I1127 17:15:37.250560 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58c388b6-f7c3-45dd-9394-634d626bc496-catalog-content\") pod \"community-operators-8z6tq\" (UID: \"58c388b6-f7c3-45dd-9394-634d626bc496\") " pod="openshift-marketplace/community-operators-8z6tq" Nov 27 17:15:37 crc kubenswrapper[4954]: I1127 17:15:37.251015 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58c388b6-f7c3-45dd-9394-634d626bc496-utilities\") pod \"community-operators-8z6tq\" (UID: \"58c388b6-f7c3-45dd-9394-634d626bc496\") " pod="openshift-marketplace/community-operators-8z6tq" Nov 27 17:15:37 crc kubenswrapper[4954]: I1127 17:15:37.251058 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58c388b6-f7c3-45dd-9394-634d626bc496-catalog-content\") pod \"community-operators-8z6tq\" (UID: \"58c388b6-f7c3-45dd-9394-634d626bc496\") " pod="openshift-marketplace/community-operators-8z6tq" Nov 27 17:15:37 crc kubenswrapper[4954]: I1127 17:15:37.270562 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx428\" (UniqueName: \"kubernetes.io/projected/58c388b6-f7c3-45dd-9394-634d626bc496-kube-api-access-mx428\") pod \"community-operators-8z6tq\" (UID: \"58c388b6-f7c3-45dd-9394-634d626bc496\") " pod="openshift-marketplace/community-operators-8z6tq" Nov 27 17:15:37 crc kubenswrapper[4954]: I1127 17:15:37.366461 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8z6tq" Nov 27 17:15:37 crc kubenswrapper[4954]: I1127 17:15:37.938351 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8z6tq"] Nov 27 17:15:38 crc kubenswrapper[4954]: I1127 17:15:38.308708 4954 generic.go:334] "Generic (PLEG): container finished" podID="58c388b6-f7c3-45dd-9394-634d626bc496" containerID="aa10e3f92739e516993c04ca1f8b787417a2387ec1b1f0388b859ed5adbd7324" exitCode=0 Nov 27 17:15:38 crc kubenswrapper[4954]: I1127 17:15:38.308746 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8z6tq" event={"ID":"58c388b6-f7c3-45dd-9394-634d626bc496","Type":"ContainerDied","Data":"aa10e3f92739e516993c04ca1f8b787417a2387ec1b1f0388b859ed5adbd7324"} Nov 27 17:15:38 crc kubenswrapper[4954]: I1127 17:15:38.308854 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8z6tq" event={"ID":"58c388b6-f7c3-45dd-9394-634d626bc496","Type":"ContainerStarted","Data":"ab27e6613a5418ef2b2c433270ad42245784a4c06c9ab1f18870daac55721f80"} Nov 27 17:15:40 crc kubenswrapper[4954]: I1127 17:15:40.329464 4954 generic.go:334] "Generic (PLEG): container finished" podID="58c388b6-f7c3-45dd-9394-634d626bc496" containerID="befcf2a0ada959e5e2fe31f3c85e321480d22ee8ed0c536e5dc9b40508acb8a9" exitCode=0 Nov 27 17:15:40 crc kubenswrapper[4954]: I1127 17:15:40.329532 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8z6tq" event={"ID":"58c388b6-f7c3-45dd-9394-634d626bc496","Type":"ContainerDied","Data":"befcf2a0ada959e5e2fe31f3c85e321480d22ee8ed0c536e5dc9b40508acb8a9"} Nov 27 17:15:40 crc kubenswrapper[4954]: I1127 17:15:40.603700 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-j56sc" Nov 27 17:15:40 crc kubenswrapper[4954]: I1127 17:15:40.603854 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-j56sc" Nov 27 17:15:40 crc kubenswrapper[4954]: I1127 17:15:40.656276 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-j56sc" Nov 27 17:15:41 crc kubenswrapper[4954]: I1127 17:15:41.413191 4954 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-j56sc" Nov 27 17:15:41 crc kubenswrapper[4954]: I1127 17:15:41.819434 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j56sc"] Nov 27 17:15:43 crc kubenswrapper[4954]: I1127 17:15:43.376121 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8z6tq" event={"ID":"58c388b6-f7c3-45dd-9394-634d626bc496","Type":"ContainerStarted","Data":"f11a61a1cdd3f00a5e5bd5b7ea2e31092f954290ba8607ee4a6519bf1e101013"} Nov 27 17:15:43 crc kubenswrapper[4954]: I1127 17:15:43.376243 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-j56sc" podUID="7d1b3c60-ddc6-4616-9d5a-f77129477364" containerName="registry-server" containerID="cri-o://ab14d692049e8310012ab1aef666727dc8acf7341acb5c6f69ce15423384751f" gracePeriod=2 Nov 27 17:15:43 crc kubenswrapper[4954]: I1127 17:15:43.422716 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8z6tq" podStartSLOduration=2.499896457 podStartE2EDuration="6.42268973s" podCreationTimestamp="2025-11-27 17:15:37 +0000 UTC" firstStartedPulling="2025-11-27 17:15:38.311336414 +0000 UTC m=+2250.328776714" lastFinishedPulling="2025-11-27 17:15:42.234129687 +0000 UTC m=+2254.251569987" observedRunningTime="2025-11-27 17:15:43.406300775 +0000 UTC m=+2255.423741075" watchObservedRunningTime="2025-11-27 17:15:43.42268973 +0000 UTC m=+2255.440130030" Nov 27 17:15:44 crc kubenswrapper[4954]: I1127 17:15:44.350959 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j56sc" Nov 27 17:15:44 crc kubenswrapper[4954]: I1127 17:15:44.388153 4954 generic.go:334] "Generic (PLEG): container finished" podID="7d1b3c60-ddc6-4616-9d5a-f77129477364" containerID="ab14d692049e8310012ab1aef666727dc8acf7341acb5c6f69ce15423384751f" exitCode=0 Nov 27 17:15:44 crc kubenswrapper[4954]: I1127 17:15:44.388212 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-j56sc" Nov 27 17:15:44 crc kubenswrapper[4954]: I1127 17:15:44.388231 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j56sc" event={"ID":"7d1b3c60-ddc6-4616-9d5a-f77129477364","Type":"ContainerDied","Data":"ab14d692049e8310012ab1aef666727dc8acf7341acb5c6f69ce15423384751f"} Nov 27 17:15:44 crc kubenswrapper[4954]: I1127 17:15:44.390443 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j56sc" event={"ID":"7d1b3c60-ddc6-4616-9d5a-f77129477364","Type":"ContainerDied","Data":"97213719603b2d15bd5059e43667690d7fa8748a19a529b19f40475e92ddb258"} Nov 27 17:15:44 crc kubenswrapper[4954]: I1127 17:15:44.390537 4954 scope.go:117] "RemoveContainer" containerID="ab14d692049e8310012ab1aef666727dc8acf7341acb5c6f69ce15423384751f" Nov 27 17:15:44 crc kubenswrapper[4954]: I1127 17:15:44.420874 4954 scope.go:117] "RemoveContainer" containerID="a5cb87e88f40868f941754fbaacb981155cbcea6435980125ae3e8b6545aa6aa" Nov 27 17:15:44 crc kubenswrapper[4954]: I1127 17:15:44.441716 4954 scope.go:117] "RemoveContainer" containerID="d8c9d007c289dc9fe2b6c6ff0a473a1d6ff10f13c13a79fabd9ef40c886a7d48" Nov 27 17:15:44 crc kubenswrapper[4954]: I1127 17:15:44.481841 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d1b3c60-ddc6-4616-9d5a-f77129477364-catalog-content\") pod \"7d1b3c60-ddc6-4616-9d5a-f77129477364\" (UID: \"7d1b3c60-ddc6-4616-9d5a-f77129477364\") " Nov 27 17:15:44 crc kubenswrapper[4954]: I1127 17:15:44.481939 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d1b3c60-ddc6-4616-9d5a-f77129477364-utilities\") pod \"7d1b3c60-ddc6-4616-9d5a-f77129477364\" (UID: \"7d1b3c60-ddc6-4616-9d5a-f77129477364\") " Nov 27 17:15:44 crc kubenswrapper[4954]: I1127 17:15:44.482087 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5d966\" (UniqueName: \"kubernetes.io/projected/7d1b3c60-ddc6-4616-9d5a-f77129477364-kube-api-access-5d966\") pod \"7d1b3c60-ddc6-4616-9d5a-f77129477364\" (UID: \"7d1b3c60-ddc6-4616-9d5a-f77129477364\") " Nov 27 17:15:44 crc kubenswrapper[4954]: I1127 17:15:44.485677 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d1b3c60-ddc6-4616-9d5a-f77129477364-utilities" (OuterVolumeSpecName: "utilities") pod "7d1b3c60-ddc6-4616-9d5a-f77129477364" (UID: "7d1b3c60-ddc6-4616-9d5a-f77129477364"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:15:44 crc kubenswrapper[4954]: I1127 17:15:44.494768 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d1b3c60-ddc6-4616-9d5a-f77129477364-kube-api-access-5d966" (OuterVolumeSpecName: "kube-api-access-5d966") pod "7d1b3c60-ddc6-4616-9d5a-f77129477364" (UID: "7d1b3c60-ddc6-4616-9d5a-f77129477364"). InnerVolumeSpecName "kube-api-access-5d966". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:15:44 crc kubenswrapper[4954]: I1127 17:15:44.494808 4954 scope.go:117] "RemoveContainer" containerID="ab14d692049e8310012ab1aef666727dc8acf7341acb5c6f69ce15423384751f" Nov 27 17:15:44 crc kubenswrapper[4954]: E1127 17:15:44.495408 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab14d692049e8310012ab1aef666727dc8acf7341acb5c6f69ce15423384751f\": container with ID starting with ab14d692049e8310012ab1aef666727dc8acf7341acb5c6f69ce15423384751f not found: ID does not exist" containerID="ab14d692049e8310012ab1aef666727dc8acf7341acb5c6f69ce15423384751f" Nov 27 17:15:44 crc kubenswrapper[4954]: I1127 17:15:44.495569 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab14d692049e8310012ab1aef666727dc8acf7341acb5c6f69ce15423384751f"} err="failed to get container status \"ab14d692049e8310012ab1aef666727dc8acf7341acb5c6f69ce15423384751f\": rpc error: code = NotFound desc = could not find container \"ab14d692049e8310012ab1aef666727dc8acf7341acb5c6f69ce15423384751f\": container with ID starting with ab14d692049e8310012ab1aef666727dc8acf7341acb5c6f69ce15423384751f not found: ID does not exist" Nov 27 17:15:44 crc kubenswrapper[4954]: I1127 17:15:44.495626 4954 scope.go:117] "RemoveContainer" containerID="a5cb87e88f40868f941754fbaacb981155cbcea6435980125ae3e8b6545aa6aa" Nov 27 17:15:44 crc kubenswrapper[4954]: E1127 17:15:44.496389 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5cb87e88f40868f941754fbaacb981155cbcea6435980125ae3e8b6545aa6aa\": container with ID starting with a5cb87e88f40868f941754fbaacb981155cbcea6435980125ae3e8b6545aa6aa not found: ID does not exist" containerID="a5cb87e88f40868f941754fbaacb981155cbcea6435980125ae3e8b6545aa6aa" Nov 27 17:15:44 crc kubenswrapper[4954]: I1127 17:15:44.496418 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5cb87e88f40868f941754fbaacb981155cbcea6435980125ae3e8b6545aa6aa"} err="failed to get container status \"a5cb87e88f40868f941754fbaacb981155cbcea6435980125ae3e8b6545aa6aa\": rpc error: code = NotFound desc = could not find container \"a5cb87e88f40868f941754fbaacb981155cbcea6435980125ae3e8b6545aa6aa\": container with ID starting with a5cb87e88f40868f941754fbaacb981155cbcea6435980125ae3e8b6545aa6aa not found: ID does not exist" Nov 27 17:15:44 crc kubenswrapper[4954]: I1127 17:15:44.496436 4954 scope.go:117] "RemoveContainer" containerID="d8c9d007c289dc9fe2b6c6ff0a473a1d6ff10f13c13a79fabd9ef40c886a7d48" Nov 27 17:15:44 crc kubenswrapper[4954]: E1127 17:15:44.496930 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8c9d007c289dc9fe2b6c6ff0a473a1d6ff10f13c13a79fabd9ef40c886a7d48\": container with ID starting with d8c9d007c289dc9fe2b6c6ff0a473a1d6ff10f13c13a79fabd9ef40c886a7d48 not found: ID does not exist" containerID="d8c9d007c289dc9fe2b6c6ff0a473a1d6ff10f13c13a79fabd9ef40c886a7d48" Nov 27 17:15:44 crc kubenswrapper[4954]: I1127 17:15:44.496965 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8c9d007c289dc9fe2b6c6ff0a473a1d6ff10f13c13a79fabd9ef40c886a7d48"} err="failed to get container status \"d8c9d007c289dc9fe2b6c6ff0a473a1d6ff10f13c13a79fabd9ef40c886a7d48\": rpc error: code = NotFound desc = could not 
find container \"d8c9d007c289dc9fe2b6c6ff0a473a1d6ff10f13c13a79fabd9ef40c886a7d48\": container with ID starting with d8c9d007c289dc9fe2b6c6ff0a473a1d6ff10f13c13a79fabd9ef40c886a7d48 not found: ID does not exist" Nov 27 17:15:44 crc kubenswrapper[4954]: I1127 17:15:44.536334 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d1b3c60-ddc6-4616-9d5a-f77129477364-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7d1b3c60-ddc6-4616-9d5a-f77129477364" (UID: "7d1b3c60-ddc6-4616-9d5a-f77129477364"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:15:44 crc kubenswrapper[4954]: I1127 17:15:44.584861 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d1b3c60-ddc6-4616-9d5a-f77129477364-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 17:15:44 crc kubenswrapper[4954]: I1127 17:15:44.584970 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5d966\" (UniqueName: \"kubernetes.io/projected/7d1b3c60-ddc6-4616-9d5a-f77129477364-kube-api-access-5d966\") on node \"crc\" DevicePath \"\"" Nov 27 17:15:44 crc kubenswrapper[4954]: I1127 17:15:44.584984 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d1b3c60-ddc6-4616-9d5a-f77129477364-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 17:15:44 crc kubenswrapper[4954]: I1127 17:15:44.718674 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j56sc"] Nov 27 17:15:44 crc kubenswrapper[4954]: I1127 17:15:44.725687 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-j56sc"] Nov 27 17:15:46 crc kubenswrapper[4954]: I1127 17:15:46.540029 4954 scope.go:117] "RemoveContainer" containerID="9d834dbdd90a2ed8601aa0cf2877e09ac939740a736753205e09592479fa4681" Nov 27 17:15:46 crc kubenswrapper[4954]: I1127 17:15:46.675426 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d1b3c60-ddc6-4616-9d5a-f77129477364" path="/var/lib/kubelet/pods/7d1b3c60-ddc6-4616-9d5a-f77129477364/volumes" Nov 27 17:15:47 crc kubenswrapper[4954]: I1127 17:15:47.367156 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8z6tq" Nov 27 17:15:47 crc kubenswrapper[4954]: I1127 17:15:47.367231 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8z6tq" Nov 27 17:15:47 crc kubenswrapper[4954]: I1127 17:15:47.428244 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8z6tq" Nov 27 17:15:47 crc kubenswrapper[4954]: I1127 17:15:47.500718 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8z6tq" Nov 27 17:15:48 crc kubenswrapper[4954]: I1127 17:15:48.229214 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8z6tq"] Nov 27 17:15:49 crc kubenswrapper[4954]: I1127 17:15:49.467784 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8z6tq" podUID="58c388b6-f7c3-45dd-9394-634d626bc496" containerName="registry-server" containerID="cri-o://f11a61a1cdd3f00a5e5bd5b7ea2e31092f954290ba8607ee4a6519bf1e101013" gracePeriod=2 Nov 27 
17:15:50 crc kubenswrapper[4954]: I1127 17:15:50.014016 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8z6tq" Nov 27 17:15:50 crc kubenswrapper[4954]: I1127 17:15:50.121991 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58c388b6-f7c3-45dd-9394-634d626bc496-utilities\") pod \"58c388b6-f7c3-45dd-9394-634d626bc496\" (UID: \"58c388b6-f7c3-45dd-9394-634d626bc496\") " Nov 27 17:15:50 crc kubenswrapper[4954]: I1127 17:15:50.122080 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58c388b6-f7c3-45dd-9394-634d626bc496-catalog-content\") pod \"58c388b6-f7c3-45dd-9394-634d626bc496\" (UID: \"58c388b6-f7c3-45dd-9394-634d626bc496\") " Nov 27 17:15:50 crc kubenswrapper[4954]: I1127 17:15:50.122149 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mx428\" (UniqueName: \"kubernetes.io/projected/58c388b6-f7c3-45dd-9394-634d626bc496-kube-api-access-mx428\") pod \"58c388b6-f7c3-45dd-9394-634d626bc496\" (UID: \"58c388b6-f7c3-45dd-9394-634d626bc496\") " Nov 27 17:15:50 crc kubenswrapper[4954]: I1127 17:15:50.123732 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58c388b6-f7c3-45dd-9394-634d626bc496-utilities" (OuterVolumeSpecName: "utilities") pod "58c388b6-f7c3-45dd-9394-634d626bc496" (UID: "58c388b6-f7c3-45dd-9394-634d626bc496"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:15:50 crc kubenswrapper[4954]: I1127 17:15:50.132810 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58c388b6-f7c3-45dd-9394-634d626bc496-kube-api-access-mx428" (OuterVolumeSpecName: "kube-api-access-mx428") pod "58c388b6-f7c3-45dd-9394-634d626bc496" (UID: "58c388b6-f7c3-45dd-9394-634d626bc496"). InnerVolumeSpecName "kube-api-access-mx428". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:15:50 crc kubenswrapper[4954]: I1127 17:15:50.182469 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58c388b6-f7c3-45dd-9394-634d626bc496-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "58c388b6-f7c3-45dd-9394-634d626bc496" (UID: "58c388b6-f7c3-45dd-9394-634d626bc496"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:15:50 crc kubenswrapper[4954]: I1127 17:15:50.225330 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58c388b6-f7c3-45dd-9394-634d626bc496-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 17:15:50 crc kubenswrapper[4954]: I1127 17:15:50.225658 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58c388b6-f7c3-45dd-9394-634d626bc496-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 17:15:50 crc kubenswrapper[4954]: I1127 17:15:50.225729 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mx428\" (UniqueName: \"kubernetes.io/projected/58c388b6-f7c3-45dd-9394-634d626bc496-kube-api-access-mx428\") on node \"crc\" DevicePath \"\"" Nov 27 17:15:50 crc kubenswrapper[4954]: I1127 17:15:50.481630 4954 generic.go:334] "Generic (PLEG): container finished" podID="58c388b6-f7c3-45dd-9394-634d626bc496" containerID="f11a61a1cdd3f00a5e5bd5b7ea2e31092f954290ba8607ee4a6519bf1e101013" exitCode=0 Nov 27 17:15:50 crc kubenswrapper[4954]: I1127 17:15:50.481726 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8z6tq" event={"ID":"58c388b6-f7c3-45dd-9394-634d626bc496","Type":"ContainerDied","Data":"f11a61a1cdd3f00a5e5bd5b7ea2e31092f954290ba8607ee4a6519bf1e101013"} Nov 27 17:15:50 crc kubenswrapper[4954]: I1127 17:15:50.481813 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8z6tq" Nov 27 17:15:50 crc kubenswrapper[4954]: I1127 17:15:50.483239 4954 scope.go:117] "RemoveContainer" containerID="f11a61a1cdd3f00a5e5bd5b7ea2e31092f954290ba8607ee4a6519bf1e101013" Nov 27 17:15:50 crc kubenswrapper[4954]: I1127 17:15:50.483144 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8z6tq" event={"ID":"58c388b6-f7c3-45dd-9394-634d626bc496","Type":"ContainerDied","Data":"ab27e6613a5418ef2b2c433270ad42245784a4c06c9ab1f18870daac55721f80"} Nov 27 17:15:50 crc kubenswrapper[4954]: I1127 17:15:50.532466 4954 scope.go:117] "RemoveContainer" containerID="befcf2a0ada959e5e2fe31f3c85e321480d22ee8ed0c536e5dc9b40508acb8a9" Nov 27 17:15:50 crc kubenswrapper[4954]: I1127 17:15:50.580491 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8z6tq"] Nov 27 17:15:50 crc kubenswrapper[4954]: I1127 17:15:50.585037 4954 scope.go:117] "RemoveContainer" containerID="aa10e3f92739e516993c04ca1f8b787417a2387ec1b1f0388b859ed5adbd7324" Nov 27 17:15:50 crc kubenswrapper[4954]: I1127 17:15:50.596645 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8z6tq"] Nov 27 17:15:50 crc kubenswrapper[4954]: I1127 17:15:50.632843 4954 scope.go:117] "RemoveContainer" containerID="f11a61a1cdd3f00a5e5bd5b7ea2e31092f954290ba8607ee4a6519bf1e101013" Nov 27 17:15:50 crc kubenswrapper[4954]: E1127 17:15:50.633871 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f11a61a1cdd3f00a5e5bd5b7ea2e31092f954290ba8607ee4a6519bf1e101013\": container with ID starting with f11a61a1cdd3f00a5e5bd5b7ea2e31092f954290ba8607ee4a6519bf1e101013 not found: ID does not exist" containerID="f11a61a1cdd3f00a5e5bd5b7ea2e31092f954290ba8607ee4a6519bf1e101013" Nov 27 17:15:50 crc kubenswrapper[4954]: I1127 17:15:50.633912 
4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f11a61a1cdd3f00a5e5bd5b7ea2e31092f954290ba8607ee4a6519bf1e101013"} err="failed to get container status \"f11a61a1cdd3f00a5e5bd5b7ea2e31092f954290ba8607ee4a6519bf1e101013\": rpc error: code = NotFound desc = could not find container \"f11a61a1cdd3f00a5e5bd5b7ea2e31092f954290ba8607ee4a6519bf1e101013\": container with ID starting with f11a61a1cdd3f00a5e5bd5b7ea2e31092f954290ba8607ee4a6519bf1e101013 not found: ID does not exist" Nov 27 17:15:50 crc kubenswrapper[4954]: I1127 17:15:50.633942 4954 scope.go:117] "RemoveContainer" containerID="befcf2a0ada959e5e2fe31f3c85e321480d22ee8ed0c536e5dc9b40508acb8a9" Nov 27 17:15:50 crc kubenswrapper[4954]: E1127 17:15:50.634332 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"befcf2a0ada959e5e2fe31f3c85e321480d22ee8ed0c536e5dc9b40508acb8a9\": container with ID starting with befcf2a0ada959e5e2fe31f3c85e321480d22ee8ed0c536e5dc9b40508acb8a9 not found: ID does not exist" containerID="befcf2a0ada959e5e2fe31f3c85e321480d22ee8ed0c536e5dc9b40508acb8a9" Nov 27 17:15:50 crc kubenswrapper[4954]: I1127 17:15:50.634451 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"befcf2a0ada959e5e2fe31f3c85e321480d22ee8ed0c536e5dc9b40508acb8a9"} err="failed to get container status \"befcf2a0ada959e5e2fe31f3c85e321480d22ee8ed0c536e5dc9b40508acb8a9\": rpc error: code = NotFound desc = could not find container \"befcf2a0ada959e5e2fe31f3c85e321480d22ee8ed0c536e5dc9b40508acb8a9\": container with ID starting with befcf2a0ada959e5e2fe31f3c85e321480d22ee8ed0c536e5dc9b40508acb8a9 not found: ID does not exist" Nov 27 17:15:50 crc kubenswrapper[4954]: I1127 17:15:50.634554 4954 scope.go:117] "RemoveContainer" containerID="aa10e3f92739e516993c04ca1f8b787417a2387ec1b1f0388b859ed5adbd7324" Nov 27 17:15:50 crc kubenswrapper[4954]: E1127 17:15:50.634906 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa10e3f92739e516993c04ca1f8b787417a2387ec1b1f0388b859ed5adbd7324\": container with ID starting with aa10e3f92739e516993c04ca1f8b787417a2387ec1b1f0388b859ed5adbd7324 not found: ID does not exist" containerID="aa10e3f92739e516993c04ca1f8b787417a2387ec1b1f0388b859ed5adbd7324" Nov 27 17:15:50 crc kubenswrapper[4954]: I1127 17:15:50.634936 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa10e3f92739e516993c04ca1f8b787417a2387ec1b1f0388b859ed5adbd7324"} err="failed to get container status \"aa10e3f92739e516993c04ca1f8b787417a2387ec1b1f0388b859ed5adbd7324\": rpc error: code = NotFound desc = could not find container \"aa10e3f92739e516993c04ca1f8b787417a2387ec1b1f0388b859ed5adbd7324\": container with ID starting with aa10e3f92739e516993c04ca1f8b787417a2387ec1b1f0388b859ed5adbd7324 not found: ID does not exist" Nov 27 17:15:50 crc kubenswrapper[4954]: I1127 17:15:50.678285 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58c388b6-f7c3-45dd-9394-634d626bc496" path="/var/lib/kubelet/pods/58c388b6-f7c3-45dd-9394-634d626bc496/volumes" Nov 27 17:16:12 crc kubenswrapper[4954]: I1127 17:16:12.701170 4954 generic.go:334] "Generic (PLEG): container finished" podID="dbb8e909-5f3f-4076-b549-d489f37cd8e3" containerID="5e2a93f6dff74421eb3771b6be37fe056aa9dd66d5c1cf3594a8bcf2e82b91f2" exitCode=0 Nov 27 17:16:12 crc kubenswrapper[4954]: 
I1127 17:16:12.701259 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" event={"ID":"dbb8e909-5f3f-4076-b549-d489f37cd8e3","Type":"ContainerDied","Data":"5e2a93f6dff74421eb3771b6be37fe056aa9dd66d5c1cf3594a8bcf2e82b91f2"} Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.216847 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.279020 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-libvirt-combined-ca-bundle\") pod \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.279100 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-ovn-combined-ca-bundle\") pod \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.279125 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/dbb8e909-5f3f-4076-b549-d489f37cd8e3-openstack-edpm-ipam-ovn-default-certs-0\") pod \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.279223 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-nova-combined-ca-bundle\") pod \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.279266 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-inventory\") pod \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.279342 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-ssh-key\") pod \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.279384 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/dbb8e909-5f3f-4076-b549-d489f37cd8e3-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.279401 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7cxn\" (UniqueName: \"kubernetes.io/projected/dbb8e909-5f3f-4076-b549-d489f37cd8e3-kube-api-access-t7cxn\") pod \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.279462 
4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/dbb8e909-5f3f-4076-b549-d489f37cd8e3-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.279485 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-telemetry-combined-ca-bundle\") pod \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.279526 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/dbb8e909-5f3f-4076-b549-d489f37cd8e3-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.279562 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-bootstrap-combined-ca-bundle\") pod \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.279601 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-neutron-metadata-combined-ca-bundle\") pod \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.279620 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-repo-setup-combined-ca-bundle\") pod \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\" (UID: \"dbb8e909-5f3f-4076-b549-d489f37cd8e3\") " Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.287651 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbb8e909-5f3f-4076-b549-d489f37cd8e3-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "dbb8e909-5f3f-4076-b549-d489f37cd8e3" (UID: "dbb8e909-5f3f-4076-b549-d489f37cd8e3"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.287856 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "dbb8e909-5f3f-4076-b549-d489f37cd8e3" (UID: "dbb8e909-5f3f-4076-b549-d489f37cd8e3"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.287864 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "dbb8e909-5f3f-4076-b549-d489f37cd8e3" (UID: "dbb8e909-5f3f-4076-b549-d489f37cd8e3"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.288086 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "dbb8e909-5f3f-4076-b549-d489f37cd8e3" (UID: "dbb8e909-5f3f-4076-b549-d489f37cd8e3"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.288426 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbb8e909-5f3f-4076-b549-d489f37cd8e3-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "dbb8e909-5f3f-4076-b549-d489f37cd8e3" (UID: "dbb8e909-5f3f-4076-b549-d489f37cd8e3"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.289148 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "dbb8e909-5f3f-4076-b549-d489f37cd8e3" (UID: "dbb8e909-5f3f-4076-b549-d489f37cd8e3"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.289150 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbb8e909-5f3f-4076-b549-d489f37cd8e3-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "dbb8e909-5f3f-4076-b549-d489f37cd8e3" (UID: "dbb8e909-5f3f-4076-b549-d489f37cd8e3"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.289341 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "dbb8e909-5f3f-4076-b549-d489f37cd8e3" (UID: "dbb8e909-5f3f-4076-b549-d489f37cd8e3"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.295365 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbb8e909-5f3f-4076-b549-d489f37cd8e3-kube-api-access-t7cxn" (OuterVolumeSpecName: "kube-api-access-t7cxn") pod "dbb8e909-5f3f-4076-b549-d489f37cd8e3" (UID: "dbb8e909-5f3f-4076-b549-d489f37cd8e3"). InnerVolumeSpecName "kube-api-access-t7cxn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.297919 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "dbb8e909-5f3f-4076-b549-d489f37cd8e3" (UID: "dbb8e909-5f3f-4076-b549-d489f37cd8e3"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.297964 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "dbb8e909-5f3f-4076-b549-d489f37cd8e3" (UID: "dbb8e909-5f3f-4076-b549-d489f37cd8e3"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.298749 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbb8e909-5f3f-4076-b549-d489f37cd8e3-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "dbb8e909-5f3f-4076-b549-d489f37cd8e3" (UID: "dbb8e909-5f3f-4076-b549-d489f37cd8e3"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.321035 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "dbb8e909-5f3f-4076-b549-d489f37cd8e3" (UID: "dbb8e909-5f3f-4076-b549-d489f37cd8e3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.331840 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-inventory" (OuterVolumeSpecName: "inventory") pod "dbb8e909-5f3f-4076-b549-d489f37cd8e3" (UID: "dbb8e909-5f3f-4076-b549-d489f37cd8e3"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.381358 4954 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/dbb8e909-5f3f-4076-b549-d489f37cd8e3-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.381397 4954 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.381412 4954 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/dbb8e909-5f3f-4076-b549-d489f37cd8e3-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.381518 4954 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.381533 4954 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.381544 4954 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.381558 4954 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.381658 4954 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.381670 4954 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/dbb8e909-5f3f-4076-b549-d489f37cd8e3-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.381682 4954 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.381692 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-inventory\") on node \"crc\" DevicePath \"\"" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.381700 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/dbb8e909-5f3f-4076-b549-d489f37cd8e3-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.381710 4954 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/dbb8e909-5f3f-4076-b549-d489f37cd8e3-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.381718 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7cxn\" (UniqueName: \"kubernetes.io/projected/dbb8e909-5f3f-4076-b549-d489f37cd8e3-kube-api-access-t7cxn\") on node \"crc\" DevicePath \"\"" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.727844 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" event={"ID":"dbb8e909-5f3f-4076-b549-d489f37cd8e3","Type":"ContainerDied","Data":"a1275a7580dff35d3de24a922419a980a21b8c8517019c2417505c38e8960aec"} Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.727891 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1275a7580dff35d3de24a922419a980a21b8c8517019c2417505c38e8960aec" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.727909 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.819656 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-wzf94"] Nov 27 17:16:14 crc kubenswrapper[4954]: E1127 17:16:14.820115 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58c388b6-f7c3-45dd-9394-634d626bc496" containerName="extract-content" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.820144 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="58c388b6-f7c3-45dd-9394-634d626bc496" containerName="extract-content" Nov 27 17:16:14 crc kubenswrapper[4954]: E1127 17:16:14.820160 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d1b3c60-ddc6-4616-9d5a-f77129477364" containerName="extract-content" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.820169 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d1b3c60-ddc6-4616-9d5a-f77129477364" containerName="extract-content" Nov 27 17:16:14 crc kubenswrapper[4954]: E1127 17:16:14.820179 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbb8e909-5f3f-4076-b549-d489f37cd8e3" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.820188 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbb8e909-5f3f-4076-b549-d489f37cd8e3" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Nov 27 17:16:14 crc kubenswrapper[4954]: E1127 17:16:14.820205 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58c388b6-f7c3-45dd-9394-634d626bc496" containerName="extract-utilities" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.820211 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="58c388b6-f7c3-45dd-9394-634d626bc496" containerName="extract-utilities" Nov 27 17:16:14 crc kubenswrapper[4954]: E1127 17:16:14.820223 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d1b3c60-ddc6-4616-9d5a-f77129477364" containerName="registry-server" Nov 27 17:16:14 crc 
kubenswrapper[4954]: I1127 17:16:14.820230 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d1b3c60-ddc6-4616-9d5a-f77129477364" containerName="registry-server" Nov 27 17:16:14 crc kubenswrapper[4954]: E1127 17:16:14.820240 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d1b3c60-ddc6-4616-9d5a-f77129477364" containerName="extract-utilities" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.820245 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d1b3c60-ddc6-4616-9d5a-f77129477364" containerName="extract-utilities" Nov 27 17:16:14 crc kubenswrapper[4954]: E1127 17:16:14.820271 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58c388b6-f7c3-45dd-9394-634d626bc496" containerName="registry-server" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.820279 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="58c388b6-f7c3-45dd-9394-634d626bc496" containerName="registry-server" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.820471 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="58c388b6-f7c3-45dd-9394-634d626bc496" containerName="registry-server" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.820487 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbb8e909-5f3f-4076-b549-d489f37cd8e3" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.820506 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d1b3c60-ddc6-4616-9d5a-f77129477364" containerName="registry-server" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.821238 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wzf94" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.824074 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.826490 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.826785 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.827031 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lnfbp" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.828130 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.851404 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-wzf94"] Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.889414 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wzf94\" (UID: \"3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wzf94" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.889466 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wzf94\" (UID: \"3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wzf94" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.889604 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wzf94\" (UID: \"3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wzf94" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.889761 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wzf94\" (UID: \"3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wzf94" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.889802 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnlt6\" (UniqueName: \"kubernetes.io/projected/3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc-kube-api-access-xnlt6\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wzf94\" (UID: \"3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wzf94" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.991004 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wzf94\" (UID: \"3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wzf94" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.991446 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnlt6\" (UniqueName: \"kubernetes.io/projected/3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc-kube-api-access-xnlt6\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wzf94\" (UID: \"3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wzf94" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.991601 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wzf94\" (UID: \"3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wzf94" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.991641 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wzf94\" (UID: \"3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wzf94" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.991671 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc-ovncontroller-config-0\") 
pod \"ovn-edpm-deployment-openstack-edpm-ipam-wzf94\" (UID: \"3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wzf94" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.992498 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wzf94\" (UID: \"3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wzf94" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.997527 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wzf94\" (UID: \"3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wzf94" Nov 27 17:16:14 crc kubenswrapper[4954]: I1127 17:16:14.998954 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wzf94\" (UID: \"3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wzf94" Nov 27 17:16:15 crc kubenswrapper[4954]: I1127 17:16:15.000556 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wzf94\" (UID: \"3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wzf94" Nov 27 17:16:15 crc kubenswrapper[4954]: I1127 17:16:15.009409 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnlt6\" (UniqueName: \"kubernetes.io/projected/3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc-kube-api-access-xnlt6\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wzf94\" (UID: \"3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wzf94" Nov 27 17:16:15 crc kubenswrapper[4954]: I1127 17:16:15.150477 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wzf94" Nov 27 17:16:15 crc kubenswrapper[4954]: I1127 17:16:15.652873 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-wzf94"] Nov 27 17:16:15 crc kubenswrapper[4954]: I1127 17:16:15.737256 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wzf94" event={"ID":"3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc","Type":"ContainerStarted","Data":"8e4a400dd5b7d7b773a2e90c0bda55f4b39b5c09969c81a4dafc838e64425d68"} Nov 27 17:16:16 crc kubenswrapper[4954]: I1127 17:16:16.748980 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wzf94" event={"ID":"3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc","Type":"ContainerStarted","Data":"d0db803cf87bd9236deacb6991569d0069a1ffff364267a8eb5a4c34471126aa"} Nov 27 17:16:16 crc kubenswrapper[4954]: I1127 17:16:16.771540 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wzf94" podStartSLOduration=2.43718288 podStartE2EDuration="2.771516916s" podCreationTimestamp="2025-11-27 17:16:14 +0000 UTC" firstStartedPulling="2025-11-27 17:16:15.662928495 +0000 UTC m=+2287.680368805" lastFinishedPulling="2025-11-27 17:16:15.997262541 +0000 UTC m=+2288.014702841" observedRunningTime="2025-11-27 17:16:16.768302209 +0000 UTC m=+2288.785742519" watchObservedRunningTime="2025-11-27 17:16:16.771516916 +0000 UTC m=+2288.788957206" Nov 27 17:16:30 crc kubenswrapper[4954]: I1127 17:16:30.325908 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-87g2l"] Nov 27 17:16:30 crc kubenswrapper[4954]: I1127 17:16:30.329196 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-87g2l" Nov 27 17:16:30 crc kubenswrapper[4954]: I1127 17:16:30.359670 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnbbc\" (UniqueName: \"kubernetes.io/projected/040cd10c-41ee-4a46-b908-99e7297b81a0-kube-api-access-gnbbc\") pod \"redhat-marketplace-87g2l\" (UID: \"040cd10c-41ee-4a46-b908-99e7297b81a0\") " pod="openshift-marketplace/redhat-marketplace-87g2l" Nov 27 17:16:30 crc kubenswrapper[4954]: I1127 17:16:30.359759 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/040cd10c-41ee-4a46-b908-99e7297b81a0-catalog-content\") pod \"redhat-marketplace-87g2l\" (UID: \"040cd10c-41ee-4a46-b908-99e7297b81a0\") " pod="openshift-marketplace/redhat-marketplace-87g2l" Nov 27 17:16:30 crc kubenswrapper[4954]: I1127 17:16:30.360279 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/040cd10c-41ee-4a46-b908-99e7297b81a0-utilities\") pod \"redhat-marketplace-87g2l\" (UID: \"040cd10c-41ee-4a46-b908-99e7297b81a0\") " pod="openshift-marketplace/redhat-marketplace-87g2l" Nov 27 17:16:30 crc kubenswrapper[4954]: I1127 17:16:30.364700 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-87g2l"] Nov 27 17:16:30 crc kubenswrapper[4954]: I1127 17:16:30.467102 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/040cd10c-41ee-4a46-b908-99e7297b81a0-utilities\") pod \"redhat-marketplace-87g2l\" (UID: \"040cd10c-41ee-4a46-b908-99e7297b81a0\") " pod="openshift-marketplace/redhat-marketplace-87g2l" Nov 27 17:16:30 crc kubenswrapper[4954]: I1127 17:16:30.467193 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnbbc\" (UniqueName: \"kubernetes.io/projected/040cd10c-41ee-4a46-b908-99e7297b81a0-kube-api-access-gnbbc\") pod \"redhat-marketplace-87g2l\" (UID: \"040cd10c-41ee-4a46-b908-99e7297b81a0\") " pod="openshift-marketplace/redhat-marketplace-87g2l" Nov 27 17:16:30 crc kubenswrapper[4954]: I1127 17:16:30.467220 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/040cd10c-41ee-4a46-b908-99e7297b81a0-catalog-content\") pod \"redhat-marketplace-87g2l\" (UID: \"040cd10c-41ee-4a46-b908-99e7297b81a0\") " pod="openshift-marketplace/redhat-marketplace-87g2l" Nov 27 17:16:30 crc kubenswrapper[4954]: I1127 17:16:30.467977 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/040cd10c-41ee-4a46-b908-99e7297b81a0-utilities\") pod \"redhat-marketplace-87g2l\" (UID: \"040cd10c-41ee-4a46-b908-99e7297b81a0\") " pod="openshift-marketplace/redhat-marketplace-87g2l" Nov 27 17:16:30 crc kubenswrapper[4954]: I1127 17:16:30.468062 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/040cd10c-41ee-4a46-b908-99e7297b81a0-catalog-content\") pod \"redhat-marketplace-87g2l\" (UID: \"040cd10c-41ee-4a46-b908-99e7297b81a0\") " pod="openshift-marketplace/redhat-marketplace-87g2l" Nov 27 17:16:30 crc kubenswrapper[4954]: I1127 17:16:30.492287 4954 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-gnbbc\" (UniqueName: \"kubernetes.io/projected/040cd10c-41ee-4a46-b908-99e7297b81a0-kube-api-access-gnbbc\") pod \"redhat-marketplace-87g2l\" (UID: \"040cd10c-41ee-4a46-b908-99e7297b81a0\") " pod="openshift-marketplace/redhat-marketplace-87g2l" Nov 27 17:16:30 crc kubenswrapper[4954]: I1127 17:16:30.678339 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-87g2l" Nov 27 17:16:31 crc kubenswrapper[4954]: I1127 17:16:31.177547 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-87g2l"] Nov 27 17:16:31 crc kubenswrapper[4954]: I1127 17:16:31.924002 4954 generic.go:334] "Generic (PLEG): container finished" podID="040cd10c-41ee-4a46-b908-99e7297b81a0" containerID="51c0e2999c53a8ceec60371b1609b6824eceddbd96bce64dd0078b9227dce7bf" exitCode=0 Nov 27 17:16:31 crc kubenswrapper[4954]: I1127 17:16:31.924094 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-87g2l" event={"ID":"040cd10c-41ee-4a46-b908-99e7297b81a0","Type":"ContainerDied","Data":"51c0e2999c53a8ceec60371b1609b6824eceddbd96bce64dd0078b9227dce7bf"} Nov 27 17:16:31 crc kubenswrapper[4954]: I1127 17:16:31.924358 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-87g2l" event={"ID":"040cd10c-41ee-4a46-b908-99e7297b81a0","Type":"ContainerStarted","Data":"db51fc37471f2a5c5d211b1da8791277f918425f7c104f9d04c6da290692516f"} Nov 27 17:16:32 crc kubenswrapper[4954]: I1127 17:16:32.937170 4954 generic.go:334] "Generic (PLEG): container finished" podID="040cd10c-41ee-4a46-b908-99e7297b81a0" containerID="47c90745dac15b2b843d22463e175ea8b05630756ff3b901e33d359aa2faa80b" exitCode=0 Nov 27 17:16:32 crc kubenswrapper[4954]: I1127 17:16:32.937275 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-87g2l" event={"ID":"040cd10c-41ee-4a46-b908-99e7297b81a0","Type":"ContainerDied","Data":"47c90745dac15b2b843d22463e175ea8b05630756ff3b901e33d359aa2faa80b"} Nov 27 17:16:33 crc kubenswrapper[4954]: I1127 17:16:33.952417 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-87g2l" event={"ID":"040cd10c-41ee-4a46-b908-99e7297b81a0","Type":"ContainerStarted","Data":"d95219b302fb4d6b74e31c621e7ab68262bcdc161918cac3bdfd65d1e483dadd"} Nov 27 17:16:33 crc kubenswrapper[4954]: I1127 17:16:33.975610 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-87g2l" podStartSLOduration=2.377713975 podStartE2EDuration="3.975561836s" podCreationTimestamp="2025-11-27 17:16:30 +0000 UTC" firstStartedPulling="2025-11-27 17:16:31.926616869 +0000 UTC m=+2303.944057189" lastFinishedPulling="2025-11-27 17:16:33.52446475 +0000 UTC m=+2305.541905050" observedRunningTime="2025-11-27 17:16:33.970902865 +0000 UTC m=+2305.988343175" watchObservedRunningTime="2025-11-27 17:16:33.975561836 +0000 UTC m=+2305.993002156" Nov 27 17:16:40 crc kubenswrapper[4954]: I1127 17:16:40.679083 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-87g2l" Nov 27 17:16:40 crc kubenswrapper[4954]: I1127 17:16:40.681964 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-87g2l" Nov 27 17:16:40 crc kubenswrapper[4954]: I1127 17:16:40.734697 4954 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-87g2l" Nov 27 17:16:41 crc kubenswrapper[4954]: I1127 17:16:41.064439 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-87g2l" Nov 27 17:16:41 crc kubenswrapper[4954]: I1127 17:16:41.122728 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-87g2l"] Nov 27 17:16:43 crc kubenswrapper[4954]: I1127 17:16:43.030569 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-87g2l" podUID="040cd10c-41ee-4a46-b908-99e7297b81a0" containerName="registry-server" containerID="cri-o://d95219b302fb4d6b74e31c621e7ab68262bcdc161918cac3bdfd65d1e483dadd" gracePeriod=2 Nov 27 17:16:43 crc kubenswrapper[4954]: I1127 17:16:43.509002 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-87g2l" Nov 27 17:16:43 crc kubenswrapper[4954]: I1127 17:16:43.560968 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/040cd10c-41ee-4a46-b908-99e7297b81a0-utilities\") pod \"040cd10c-41ee-4a46-b908-99e7297b81a0\" (UID: \"040cd10c-41ee-4a46-b908-99e7297b81a0\") " Nov 27 17:16:43 crc kubenswrapper[4954]: I1127 17:16:43.561073 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnbbc\" (UniqueName: \"kubernetes.io/projected/040cd10c-41ee-4a46-b908-99e7297b81a0-kube-api-access-gnbbc\") pod \"040cd10c-41ee-4a46-b908-99e7297b81a0\" (UID: \"040cd10c-41ee-4a46-b908-99e7297b81a0\") " Nov 27 17:16:43 crc kubenswrapper[4954]: I1127 17:16:43.565686 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/040cd10c-41ee-4a46-b908-99e7297b81a0-catalog-content\") pod \"040cd10c-41ee-4a46-b908-99e7297b81a0\" (UID: \"040cd10c-41ee-4a46-b908-99e7297b81a0\") " Nov 27 17:16:43 crc kubenswrapper[4954]: I1127 17:16:43.566129 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/040cd10c-41ee-4a46-b908-99e7297b81a0-utilities" (OuterVolumeSpecName: "utilities") pod "040cd10c-41ee-4a46-b908-99e7297b81a0" (UID: "040cd10c-41ee-4a46-b908-99e7297b81a0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:16:43 crc kubenswrapper[4954]: I1127 17:16:43.567766 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/040cd10c-41ee-4a46-b908-99e7297b81a0-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 17:16:43 crc kubenswrapper[4954]: I1127 17:16:43.572237 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/040cd10c-41ee-4a46-b908-99e7297b81a0-kube-api-access-gnbbc" (OuterVolumeSpecName: "kube-api-access-gnbbc") pod "040cd10c-41ee-4a46-b908-99e7297b81a0" (UID: "040cd10c-41ee-4a46-b908-99e7297b81a0"). InnerVolumeSpecName "kube-api-access-gnbbc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:16:43 crc kubenswrapper[4954]: I1127 17:16:43.594972 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/040cd10c-41ee-4a46-b908-99e7297b81a0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "040cd10c-41ee-4a46-b908-99e7297b81a0" (UID: "040cd10c-41ee-4a46-b908-99e7297b81a0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:16:43 crc kubenswrapper[4954]: I1127 17:16:43.670223 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/040cd10c-41ee-4a46-b908-99e7297b81a0-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 17:16:43 crc kubenswrapper[4954]: I1127 17:16:43.670278 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnbbc\" (UniqueName: \"kubernetes.io/projected/040cd10c-41ee-4a46-b908-99e7297b81a0-kube-api-access-gnbbc\") on node \"crc\" DevicePath \"\"" Nov 27 17:16:44 crc kubenswrapper[4954]: I1127 17:16:44.041591 4954 generic.go:334] "Generic (PLEG): container finished" podID="040cd10c-41ee-4a46-b908-99e7297b81a0" containerID="d95219b302fb4d6b74e31c621e7ab68262bcdc161918cac3bdfd65d1e483dadd" exitCode=0 Nov 27 17:16:44 crc kubenswrapper[4954]: I1127 17:16:44.041632 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-87g2l" event={"ID":"040cd10c-41ee-4a46-b908-99e7297b81a0","Type":"ContainerDied","Data":"d95219b302fb4d6b74e31c621e7ab68262bcdc161918cac3bdfd65d1e483dadd"} Nov 27 17:16:44 crc kubenswrapper[4954]: I1127 17:16:44.041684 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-87g2l" event={"ID":"040cd10c-41ee-4a46-b908-99e7297b81a0","Type":"ContainerDied","Data":"db51fc37471f2a5c5d211b1da8791277f918425f7c104f9d04c6da290692516f"} Nov 27 17:16:44 crc kubenswrapper[4954]: I1127 17:16:44.041689 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-87g2l" Nov 27 17:16:44 crc kubenswrapper[4954]: I1127 17:16:44.041706 4954 scope.go:117] "RemoveContainer" containerID="d95219b302fb4d6b74e31c621e7ab68262bcdc161918cac3bdfd65d1e483dadd" Nov 27 17:16:44 crc kubenswrapper[4954]: I1127 17:16:44.067643 4954 scope.go:117] "RemoveContainer" containerID="47c90745dac15b2b843d22463e175ea8b05630756ff3b901e33d359aa2faa80b" Nov 27 17:16:44 crc kubenswrapper[4954]: I1127 17:16:44.085030 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-87g2l"] Nov 27 17:16:44 crc kubenswrapper[4954]: I1127 17:16:44.091541 4954 scope.go:117] "RemoveContainer" containerID="51c0e2999c53a8ceec60371b1609b6824eceddbd96bce64dd0078b9227dce7bf" Nov 27 17:16:44 crc kubenswrapper[4954]: I1127 17:16:44.096607 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-87g2l"] Nov 27 17:16:44 crc kubenswrapper[4954]: I1127 17:16:44.141044 4954 scope.go:117] "RemoveContainer" containerID="d95219b302fb4d6b74e31c621e7ab68262bcdc161918cac3bdfd65d1e483dadd" Nov 27 17:16:44 crc kubenswrapper[4954]: E1127 17:16:44.142037 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d95219b302fb4d6b74e31c621e7ab68262bcdc161918cac3bdfd65d1e483dadd\": container with ID starting with d95219b302fb4d6b74e31c621e7ab68262bcdc161918cac3bdfd65d1e483dadd not found: ID does not exist" containerID="d95219b302fb4d6b74e31c621e7ab68262bcdc161918cac3bdfd65d1e483dadd" Nov 27 17:16:44 crc kubenswrapper[4954]: I1127 17:16:44.142078 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d95219b302fb4d6b74e31c621e7ab68262bcdc161918cac3bdfd65d1e483dadd"} err="failed to get container status \"d95219b302fb4d6b74e31c621e7ab68262bcdc161918cac3bdfd65d1e483dadd\": rpc error: code = NotFound desc = could not find container \"d95219b302fb4d6b74e31c621e7ab68262bcdc161918cac3bdfd65d1e483dadd\": container with ID starting with d95219b302fb4d6b74e31c621e7ab68262bcdc161918cac3bdfd65d1e483dadd not found: ID does not exist" Nov 27 17:16:44 crc kubenswrapper[4954]: I1127 17:16:44.142107 4954 scope.go:117] "RemoveContainer" containerID="47c90745dac15b2b843d22463e175ea8b05630756ff3b901e33d359aa2faa80b" Nov 27 17:16:44 crc kubenswrapper[4954]: E1127 17:16:44.142642 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47c90745dac15b2b843d22463e175ea8b05630756ff3b901e33d359aa2faa80b\": container with ID starting with 47c90745dac15b2b843d22463e175ea8b05630756ff3b901e33d359aa2faa80b not found: ID does not exist" containerID="47c90745dac15b2b843d22463e175ea8b05630756ff3b901e33d359aa2faa80b" Nov 27 17:16:44 crc kubenswrapper[4954]: I1127 17:16:44.142714 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47c90745dac15b2b843d22463e175ea8b05630756ff3b901e33d359aa2faa80b"} err="failed to get container status \"47c90745dac15b2b843d22463e175ea8b05630756ff3b901e33d359aa2faa80b\": rpc error: code = NotFound desc = could not find container \"47c90745dac15b2b843d22463e175ea8b05630756ff3b901e33d359aa2faa80b\": container with ID starting with 47c90745dac15b2b843d22463e175ea8b05630756ff3b901e33d359aa2faa80b not found: ID does not exist" Nov 27 17:16:44 crc kubenswrapper[4954]: I1127 17:16:44.142730 4954 scope.go:117] "RemoveContainer" 
containerID="51c0e2999c53a8ceec60371b1609b6824eceddbd96bce64dd0078b9227dce7bf" Nov 27 17:16:44 crc kubenswrapper[4954]: E1127 17:16:44.143449 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51c0e2999c53a8ceec60371b1609b6824eceddbd96bce64dd0078b9227dce7bf\": container with ID starting with 51c0e2999c53a8ceec60371b1609b6824eceddbd96bce64dd0078b9227dce7bf not found: ID does not exist" containerID="51c0e2999c53a8ceec60371b1609b6824eceddbd96bce64dd0078b9227dce7bf" Nov 27 17:16:44 crc kubenswrapper[4954]: I1127 17:16:44.143491 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51c0e2999c53a8ceec60371b1609b6824eceddbd96bce64dd0078b9227dce7bf"} err="failed to get container status \"51c0e2999c53a8ceec60371b1609b6824eceddbd96bce64dd0078b9227dce7bf\": rpc error: code = NotFound desc = could not find container \"51c0e2999c53a8ceec60371b1609b6824eceddbd96bce64dd0078b9227dce7bf\": container with ID starting with 51c0e2999c53a8ceec60371b1609b6824eceddbd96bce64dd0078b9227dce7bf not found: ID does not exist" Nov 27 17:16:44 crc kubenswrapper[4954]: I1127 17:16:44.672373 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="040cd10c-41ee-4a46-b908-99e7297b81a0" path="/var/lib/kubelet/pods/040cd10c-41ee-4a46-b908-99e7297b81a0/volumes" Nov 27 17:16:53 crc kubenswrapper[4954]: I1127 17:16:53.687810 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 17:16:53 crc kubenswrapper[4954]: I1127 17:16:53.688871 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 17:17:22 crc kubenswrapper[4954]: I1127 17:17:22.370988 4954 generic.go:334] "Generic (PLEG): container finished" podID="3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc" containerID="d0db803cf87bd9236deacb6991569d0069a1ffff364267a8eb5a4c34471126aa" exitCode=0 Nov 27 17:17:22 crc kubenswrapper[4954]: I1127 17:17:22.371105 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wzf94" event={"ID":"3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc","Type":"ContainerDied","Data":"d0db803cf87bd9236deacb6991569d0069a1ffff364267a8eb5a4c34471126aa"} Nov 27 17:17:23 crc kubenswrapper[4954]: I1127 17:17:23.687270 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 17:17:23 crc kubenswrapper[4954]: I1127 17:17:23.687826 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 17:17:23 crc kubenswrapper[4954]: I1127 17:17:23.869903 4954 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wzf94" Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.044601 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc-ovn-combined-ca-bundle\") pod \"3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc\" (UID: \"3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc\") " Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.045088 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc-ssh-key\") pod \"3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc\" (UID: \"3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc\") " Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.045440 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc-inventory\") pod \"3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc\" (UID: \"3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc\") " Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.045464 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc-ovncontroller-config-0\") pod \"3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc\" (UID: \"3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc\") " Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.045498 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnlt6\" (UniqueName: \"kubernetes.io/projected/3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc-kube-api-access-xnlt6\") pod \"3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc\" (UID: \"3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc\") " Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.052421 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc-kube-api-access-xnlt6" (OuterVolumeSpecName: "kube-api-access-xnlt6") pod "3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc" (UID: "3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc"). InnerVolumeSpecName "kube-api-access-xnlt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.052449 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc" (UID: "3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.077966 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc" (UID: "3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.112444 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc-inventory" (OuterVolumeSpecName: "inventory") pod "3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc" (UID: "3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.119790 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc" (UID: "3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.147911 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc-inventory\") on node \"crc\" DevicePath \"\"" Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.147955 4954 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.147971 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnlt6\" (UniqueName: \"kubernetes.io/projected/3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc-kube-api-access-xnlt6\") on node \"crc\" DevicePath \"\"" Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.147983 4954 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.147997 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.395225 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wzf94" event={"ID":"3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc","Type":"ContainerDied","Data":"8e4a400dd5b7d7b773a2e90c0bda55f4b39b5c09969c81a4dafc838e64425d68"} Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.395275 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e4a400dd5b7d7b773a2e90c0bda55f4b39b5c09969c81a4dafc838e64425d68" Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.395287 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wzf94" Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.526079 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp"] Nov 27 17:17:24 crc kubenswrapper[4954]: E1127 17:17:24.526459 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.526473 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Nov 27 17:17:24 crc kubenswrapper[4954]: E1127 17:17:24.526485 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="040cd10c-41ee-4a46-b908-99e7297b81a0" containerName="extract-content" Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.526492 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="040cd10c-41ee-4a46-b908-99e7297b81a0" containerName="extract-content" Nov 27 17:17:24 crc kubenswrapper[4954]: E1127 17:17:24.526507 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="040cd10c-41ee-4a46-b908-99e7297b81a0" containerName="extract-utilities" Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.526513 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="040cd10c-41ee-4a46-b908-99e7297b81a0" containerName="extract-utilities" Nov 27 17:17:24 crc kubenswrapper[4954]: E1127 17:17:24.526523 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="040cd10c-41ee-4a46-b908-99e7297b81a0" containerName="registry-server" Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.526529 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="040cd10c-41ee-4a46-b908-99e7297b81a0" containerName="registry-server" Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.526722 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.526734 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="040cd10c-41ee-4a46-b908-99e7297b81a0" containerName="registry-server" Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.527350 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp" Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.530198 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.530693 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.530928 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.531227 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.531424 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.537719 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lnfbp" Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.538570 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp"] Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.560464 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjclv\" (UniqueName: \"kubernetes.io/projected/5ea501ba-5c0c-4392-a64b-695c832dbb89-kube-api-access-rjclv\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp\" (UID: \"5ea501ba-5c0c-4392-a64b-695c832dbb89\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp" Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.560673 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5ea501ba-5c0c-4392-a64b-695c832dbb89-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp\" (UID: \"5ea501ba-5c0c-4392-a64b-695c832dbb89\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp" Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.560907 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5ea501ba-5c0c-4392-a64b-695c832dbb89-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp\" (UID: \"5ea501ba-5c0c-4392-a64b-695c832dbb89\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp" Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.561000 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ea501ba-5c0c-4392-a64b-695c832dbb89-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp\" (UID: \"5ea501ba-5c0c-4392-a64b-695c832dbb89\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp" Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.561055 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/5ea501ba-5c0c-4392-a64b-695c832dbb89-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp\" (UID: \"5ea501ba-5c0c-4392-a64b-695c832dbb89\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp" Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.561121 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ea501ba-5c0c-4392-a64b-695c832dbb89-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp\" (UID: \"5ea501ba-5c0c-4392-a64b-695c832dbb89\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp" Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.661900 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5ea501ba-5c0c-4392-a64b-695c832dbb89-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp\" (UID: \"5ea501ba-5c0c-4392-a64b-695c832dbb89\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp" Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.661965 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ea501ba-5c0c-4392-a64b-695c832dbb89-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp\" (UID: \"5ea501ba-5c0c-4392-a64b-695c832dbb89\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp" Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.661997 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5ea501ba-5c0c-4392-a64b-695c832dbb89-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp\" (UID: \"5ea501ba-5c0c-4392-a64b-695c832dbb89\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp" Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.662028 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ea501ba-5c0c-4392-a64b-695c832dbb89-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp\" (UID: \"5ea501ba-5c0c-4392-a64b-695c832dbb89\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp" Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.662057 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjclv\" (UniqueName: \"kubernetes.io/projected/5ea501ba-5c0c-4392-a64b-695c832dbb89-kube-api-access-rjclv\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp\" (UID: \"5ea501ba-5c0c-4392-a64b-695c832dbb89\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp" Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.662110 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5ea501ba-5c0c-4392-a64b-695c832dbb89-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp\" (UID: \"5ea501ba-5c0c-4392-a64b-695c832dbb89\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp" Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.665614 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ea501ba-5c0c-4392-a64b-695c832dbb89-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp\" (UID: \"5ea501ba-5c0c-4392-a64b-695c832dbb89\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp" Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.665971 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ea501ba-5c0c-4392-a64b-695c832dbb89-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp\" (UID: \"5ea501ba-5c0c-4392-a64b-695c832dbb89\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp" Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.665978 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5ea501ba-5c0c-4392-a64b-695c832dbb89-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp\" (UID: \"5ea501ba-5c0c-4392-a64b-695c832dbb89\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp" Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.666077 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5ea501ba-5c0c-4392-a64b-695c832dbb89-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp\" (UID: \"5ea501ba-5c0c-4392-a64b-695c832dbb89\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp" Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.666765 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5ea501ba-5c0c-4392-a64b-695c832dbb89-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp\" (UID: \"5ea501ba-5c0c-4392-a64b-695c832dbb89\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp" Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.678834 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjclv\" (UniqueName: \"kubernetes.io/projected/5ea501ba-5c0c-4392-a64b-695c832dbb89-kube-api-access-rjclv\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp\" (UID: \"5ea501ba-5c0c-4392-a64b-695c832dbb89\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp" Nov 27 17:17:24 crc kubenswrapper[4954]: I1127 17:17:24.848365 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp" Nov 27 17:17:25 crc kubenswrapper[4954]: I1127 17:17:25.434718 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp"] Nov 27 17:17:26 crc kubenswrapper[4954]: I1127 17:17:26.420148 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp" event={"ID":"5ea501ba-5c0c-4392-a64b-695c832dbb89","Type":"ContainerStarted","Data":"3ea714dbcbac994ab665d4518e63736dbae95be5fb9e34971d0b62a540e08cee"} Nov 27 17:17:26 crc kubenswrapper[4954]: I1127 17:17:26.420490 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp" event={"ID":"5ea501ba-5c0c-4392-a64b-695c832dbb89","Type":"ContainerStarted","Data":"16bb8dca7390158bfceeb5e6802f65f5853113cd4b887ec192ba4d6005f1dff1"} Nov 27 17:17:26 crc kubenswrapper[4954]: I1127 17:17:26.445746 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp" podStartSLOduration=2.2591174880000002 podStartE2EDuration="2.445722512s" podCreationTimestamp="2025-11-27 17:17:24 +0000 UTC" firstStartedPulling="2025-11-27 17:17:25.442121649 +0000 UTC m=+2357.459561959" lastFinishedPulling="2025-11-27 17:17:25.628726683 +0000 UTC m=+2357.646166983" observedRunningTime="2025-11-27 17:17:26.440087067 +0000 UTC m=+2358.457527367" watchObservedRunningTime="2025-11-27 17:17:26.445722512 +0000 UTC m=+2358.463162812" Nov 27 17:17:53 crc kubenswrapper[4954]: I1127 17:17:53.687958 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 17:17:53 crc kubenswrapper[4954]: I1127 17:17:53.688456 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 17:17:53 crc kubenswrapper[4954]: I1127 17:17:53.688510 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-699qq" Nov 27 17:17:53 crc kubenswrapper[4954]: I1127 17:17:53.689282 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b91da12f8fcb1407df50bd3be19fd43d848b6fc636f3c1096f65accf412d6bd6"} pod="openshift-machine-config-operator/machine-config-daemon-699qq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 27 17:17:53 crc kubenswrapper[4954]: I1127 17:17:53.689340 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" containerID="cri-o://b91da12f8fcb1407df50bd3be19fd43d848b6fc636f3c1096f65accf412d6bd6" gracePeriod=600 Nov 27 17:17:53 crc kubenswrapper[4954]: E1127 17:17:53.808775 4954 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:17:54 crc kubenswrapper[4954]: I1127 17:17:54.659839 4954 generic.go:334] "Generic (PLEG): container finished" podID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerID="b91da12f8fcb1407df50bd3be19fd43d848b6fc636f3c1096f65accf412d6bd6" exitCode=0 Nov 27 17:17:54 crc kubenswrapper[4954]: I1127 17:17:54.659895 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" event={"ID":"33a80574-7c60-4f19-985b-3ee313cb7bcd","Type":"ContainerDied","Data":"b91da12f8fcb1407df50bd3be19fd43d848b6fc636f3c1096f65accf412d6bd6"} Nov 27 17:17:54 crc kubenswrapper[4954]: I1127 17:17:54.659939 4954 scope.go:117] "RemoveContainer" containerID="3bca7cd4e28cd5886de60bf3081238598be0a5e41895389e224c4122b00d90d8" Nov 27 17:17:54 crc kubenswrapper[4954]: I1127 17:17:54.661047 4954 scope.go:117] "RemoveContainer" containerID="b91da12f8fcb1407df50bd3be19fd43d848b6fc636f3c1096f65accf412d6bd6" Nov 27 17:17:54 crc kubenswrapper[4954]: E1127 17:17:54.661618 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:18:09 crc kubenswrapper[4954]: I1127 17:18:09.662044 4954 scope.go:117] "RemoveContainer" containerID="b91da12f8fcb1407df50bd3be19fd43d848b6fc636f3c1096f65accf412d6bd6" Nov 27 17:18:09 crc kubenswrapper[4954]: E1127 17:18:09.662723 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:18:16 crc kubenswrapper[4954]: I1127 17:18:16.881268 4954 generic.go:334] "Generic (PLEG): container finished" podID="5ea501ba-5c0c-4392-a64b-695c832dbb89" containerID="3ea714dbcbac994ab665d4518e63736dbae95be5fb9e34971d0b62a540e08cee" exitCode=0 Nov 27 17:18:16 crc kubenswrapper[4954]: I1127 17:18:16.881421 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp" event={"ID":"5ea501ba-5c0c-4392-a64b-695c832dbb89","Type":"ContainerDied","Data":"3ea714dbcbac994ab665d4518e63736dbae95be5fb9e34971d0b62a540e08cee"} Nov 27 17:18:18 crc kubenswrapper[4954]: I1127 17:18:18.340716 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp" Nov 27 17:18:18 crc kubenswrapper[4954]: I1127 17:18:18.515275 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjclv\" (UniqueName: \"kubernetes.io/projected/5ea501ba-5c0c-4392-a64b-695c832dbb89-kube-api-access-rjclv\") pod \"5ea501ba-5c0c-4392-a64b-695c832dbb89\" (UID: \"5ea501ba-5c0c-4392-a64b-695c832dbb89\") " Nov 27 17:18:18 crc kubenswrapper[4954]: I1127 17:18:18.516164 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5ea501ba-5c0c-4392-a64b-695c832dbb89-nova-metadata-neutron-config-0\") pod \"5ea501ba-5c0c-4392-a64b-695c832dbb89\" (UID: \"5ea501ba-5c0c-4392-a64b-695c832dbb89\") " Nov 27 17:18:18 crc kubenswrapper[4954]: I1127 17:18:18.516317 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ea501ba-5c0c-4392-a64b-695c832dbb89-neutron-metadata-combined-ca-bundle\") pod \"5ea501ba-5c0c-4392-a64b-695c832dbb89\" (UID: \"5ea501ba-5c0c-4392-a64b-695c832dbb89\") " Nov 27 17:18:18 crc kubenswrapper[4954]: I1127 17:18:18.516436 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5ea501ba-5c0c-4392-a64b-695c832dbb89-ssh-key\") pod \"5ea501ba-5c0c-4392-a64b-695c832dbb89\" (UID: \"5ea501ba-5c0c-4392-a64b-695c832dbb89\") " Nov 27 17:18:18 crc kubenswrapper[4954]: I1127 17:18:18.516637 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ea501ba-5c0c-4392-a64b-695c832dbb89-inventory\") pod \"5ea501ba-5c0c-4392-a64b-695c832dbb89\" (UID: \"5ea501ba-5c0c-4392-a64b-695c832dbb89\") " Nov 27 17:18:18 crc kubenswrapper[4954]: I1127 17:18:18.517123 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5ea501ba-5c0c-4392-a64b-695c832dbb89-neutron-ovn-metadata-agent-neutron-config-0\") pod \"5ea501ba-5c0c-4392-a64b-695c832dbb89\" (UID: \"5ea501ba-5c0c-4392-a64b-695c832dbb89\") " Nov 27 17:18:18 crc kubenswrapper[4954]: I1127 17:18:18.521808 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ea501ba-5c0c-4392-a64b-695c832dbb89-kube-api-access-rjclv" (OuterVolumeSpecName: "kube-api-access-rjclv") pod "5ea501ba-5c0c-4392-a64b-695c832dbb89" (UID: "5ea501ba-5c0c-4392-a64b-695c832dbb89"). InnerVolumeSpecName "kube-api-access-rjclv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:18:18 crc kubenswrapper[4954]: I1127 17:18:18.522016 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ea501ba-5c0c-4392-a64b-695c832dbb89-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "5ea501ba-5c0c-4392-a64b-695c832dbb89" (UID: "5ea501ba-5c0c-4392-a64b-695c832dbb89"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:18:18 crc kubenswrapper[4954]: I1127 17:18:18.545225 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ea501ba-5c0c-4392-a64b-695c832dbb89-inventory" (OuterVolumeSpecName: "inventory") pod "5ea501ba-5c0c-4392-a64b-695c832dbb89" (UID: "5ea501ba-5c0c-4392-a64b-695c832dbb89"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:18:18 crc kubenswrapper[4954]: I1127 17:18:18.545742 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ea501ba-5c0c-4392-a64b-695c832dbb89-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "5ea501ba-5c0c-4392-a64b-695c832dbb89" (UID: "5ea501ba-5c0c-4392-a64b-695c832dbb89"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:18:18 crc kubenswrapper[4954]: I1127 17:18:18.550073 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ea501ba-5c0c-4392-a64b-695c832dbb89-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5ea501ba-5c0c-4392-a64b-695c832dbb89" (UID: "5ea501ba-5c0c-4392-a64b-695c832dbb89"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:18:18 crc kubenswrapper[4954]: I1127 17:18:18.551850 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ea501ba-5c0c-4392-a64b-695c832dbb89-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "5ea501ba-5c0c-4392-a64b-695c832dbb89" (UID: "5ea501ba-5c0c-4392-a64b-695c832dbb89"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:18:18 crc kubenswrapper[4954]: I1127 17:18:18.619491 4954 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ea501ba-5c0c-4392-a64b-695c832dbb89-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 17:18:18 crc kubenswrapper[4954]: I1127 17:18:18.619524 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5ea501ba-5c0c-4392-a64b-695c832dbb89-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 17:18:18 crc kubenswrapper[4954]: I1127 17:18:18.619534 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ea501ba-5c0c-4392-a64b-695c832dbb89-inventory\") on node \"crc\" DevicePath \"\"" Nov 27 17:18:18 crc kubenswrapper[4954]: I1127 17:18:18.619546 4954 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5ea501ba-5c0c-4392-a64b-695c832dbb89-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Nov 27 17:18:18 crc kubenswrapper[4954]: I1127 17:18:18.619557 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjclv\" (UniqueName: \"kubernetes.io/projected/5ea501ba-5c0c-4392-a64b-695c832dbb89-kube-api-access-rjclv\") on node \"crc\" DevicePath \"\"" Nov 27 17:18:18 crc kubenswrapper[4954]: I1127 17:18:18.619567 4954 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5ea501ba-5c0c-4392-a64b-695c832dbb89-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Nov 27 17:18:18 crc kubenswrapper[4954]: I1127 17:18:18.902898 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp" event={"ID":"5ea501ba-5c0c-4392-a64b-695c832dbb89","Type":"ContainerDied","Data":"16bb8dca7390158bfceeb5e6802f65f5853113cd4b887ec192ba4d6005f1dff1"} Nov 27 17:18:18 crc kubenswrapper[4954]: I1127 17:18:18.902961 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16bb8dca7390158bfceeb5e6802f65f5853113cd4b887ec192ba4d6005f1dff1" Nov 27 17:18:18 crc kubenswrapper[4954]: I1127 17:18:18.902969 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp" Nov 27 17:18:18 crc kubenswrapper[4954]: I1127 17:18:18.998915 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-flwcw"] Nov 27 17:18:18 crc kubenswrapper[4954]: E1127 17:18:18.999268 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ea501ba-5c0c-4392-a64b-695c832dbb89" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Nov 27 17:18:18 crc kubenswrapper[4954]: I1127 17:18:18.999286 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ea501ba-5c0c-4392-a64b-695c832dbb89" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Nov 27 17:18:18 crc kubenswrapper[4954]: I1127 17:18:18.999487 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ea501ba-5c0c-4392-a64b-695c832dbb89" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Nov 27 17:18:19 crc kubenswrapper[4954]: I1127 17:18:19.000314 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-flwcw" Nov 27 17:18:19 crc kubenswrapper[4954]: I1127 17:18:19.002258 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lnfbp" Nov 27 17:18:19 crc kubenswrapper[4954]: I1127 17:18:19.002356 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 17:18:19 crc kubenswrapper[4954]: I1127 17:18:19.008509 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 17:18:19 crc kubenswrapper[4954]: I1127 17:18:19.013630 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 17:18:19 crc kubenswrapper[4954]: I1127 17:18:19.013627 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Nov 27 17:18:19 crc kubenswrapper[4954]: I1127 17:18:19.017995 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-flwcw"] Nov 27 17:18:19 crc kubenswrapper[4954]: I1127 17:18:19.128730 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trmmv\" (UniqueName: \"kubernetes.io/projected/6d34dbe8-0864-4b92-bd50-5bdd57209a74-kube-api-access-trmmv\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-flwcw\" (UID: \"6d34dbe8-0864-4b92-bd50-5bdd57209a74\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-flwcw" Nov 27 17:18:19 crc kubenswrapper[4954]: I1127 17:18:19.128804 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d34dbe8-0864-4b92-bd50-5bdd57209a74-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-flwcw\" (UID: \"6d34dbe8-0864-4b92-bd50-5bdd57209a74\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-flwcw" Nov 27 17:18:19 crc kubenswrapper[4954]: I1127 17:18:19.128832 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6d34dbe8-0864-4b92-bd50-5bdd57209a74-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-flwcw\" (UID: \"6d34dbe8-0864-4b92-bd50-5bdd57209a74\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-flwcw" Nov 27 17:18:19 crc kubenswrapper[4954]: I1127 17:18:19.129178 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6d34dbe8-0864-4b92-bd50-5bdd57209a74-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-flwcw\" (UID: \"6d34dbe8-0864-4b92-bd50-5bdd57209a74\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-flwcw" Nov 27 17:18:19 crc kubenswrapper[4954]: I1127 17:18:19.129373 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d34dbe8-0864-4b92-bd50-5bdd57209a74-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-flwcw\" (UID: \"6d34dbe8-0864-4b92-bd50-5bdd57209a74\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-flwcw" Nov 27 17:18:19 crc kubenswrapper[4954]: I1127 17:18:19.231358 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/6d34dbe8-0864-4b92-bd50-5bdd57209a74-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-flwcw\" (UID: \"6d34dbe8-0864-4b92-bd50-5bdd57209a74\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-flwcw" Nov 27 17:18:19 crc kubenswrapper[4954]: I1127 17:18:19.231435 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6d34dbe8-0864-4b92-bd50-5bdd57209a74-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-flwcw\" (UID: \"6d34dbe8-0864-4b92-bd50-5bdd57209a74\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-flwcw" Nov 27 17:18:19 crc kubenswrapper[4954]: I1127 17:18:19.231612 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6d34dbe8-0864-4b92-bd50-5bdd57209a74-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-flwcw\" (UID: \"6d34dbe8-0864-4b92-bd50-5bdd57209a74\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-flwcw" Nov 27 17:18:19 crc kubenswrapper[4954]: I1127 17:18:19.231678 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d34dbe8-0864-4b92-bd50-5bdd57209a74-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-flwcw\" (UID: \"6d34dbe8-0864-4b92-bd50-5bdd57209a74\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-flwcw" Nov 27 17:18:19 crc kubenswrapper[4954]: I1127 17:18:19.231760 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trmmv\" (UniqueName: \"kubernetes.io/projected/6d34dbe8-0864-4b92-bd50-5bdd57209a74-kube-api-access-trmmv\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-flwcw\" (UID: \"6d34dbe8-0864-4b92-bd50-5bdd57209a74\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-flwcw" Nov 27 17:18:19 crc kubenswrapper[4954]: I1127 17:18:19.236668 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6d34dbe8-0864-4b92-bd50-5bdd57209a74-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-flwcw\" (UID: \"6d34dbe8-0864-4b92-bd50-5bdd57209a74\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-flwcw" Nov 27 17:18:19 crc kubenswrapper[4954]: I1127 17:18:19.244156 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6d34dbe8-0864-4b92-bd50-5bdd57209a74-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-flwcw\" (UID: \"6d34dbe8-0864-4b92-bd50-5bdd57209a74\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-flwcw" Nov 27 17:18:19 crc kubenswrapper[4954]: I1127 17:18:19.244155 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d34dbe8-0864-4b92-bd50-5bdd57209a74-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-flwcw\" (UID: \"6d34dbe8-0864-4b92-bd50-5bdd57209a74\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-flwcw" Nov 27 17:18:19 crc kubenswrapper[4954]: I1127 17:18:19.245298 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d34dbe8-0864-4b92-bd50-5bdd57209a74-libvirt-combined-ca-bundle\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-flwcw\" (UID: \"6d34dbe8-0864-4b92-bd50-5bdd57209a74\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-flwcw" Nov 27 17:18:19 crc kubenswrapper[4954]: I1127 17:18:19.252507 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trmmv\" (UniqueName: \"kubernetes.io/projected/6d34dbe8-0864-4b92-bd50-5bdd57209a74-kube-api-access-trmmv\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-flwcw\" (UID: \"6d34dbe8-0864-4b92-bd50-5bdd57209a74\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-flwcw" Nov 27 17:18:19 crc kubenswrapper[4954]: I1127 17:18:19.320042 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-flwcw" Nov 27 17:18:19 crc kubenswrapper[4954]: I1127 17:18:19.870857 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-flwcw"] Nov 27 17:18:19 crc kubenswrapper[4954]: I1127 17:18:19.917870 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-flwcw" event={"ID":"6d34dbe8-0864-4b92-bd50-5bdd57209a74","Type":"ContainerStarted","Data":"6875f5ab6e5f084c55ef626fbe6b4f376c57f459459399c4503ed61a6b2fbf2f"} Nov 27 17:18:20 crc kubenswrapper[4954]: I1127 17:18:20.928612 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-flwcw" event={"ID":"6d34dbe8-0864-4b92-bd50-5bdd57209a74","Type":"ContainerStarted","Data":"afd3b879edc5f302762f00b75e8358e913edb462d405e199b9032ed0eab17a0d"} Nov 27 17:18:20 crc kubenswrapper[4954]: I1127 17:18:20.945389 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-flwcw" podStartSLOduration=2.732714992 podStartE2EDuration="2.94536617s" podCreationTimestamp="2025-11-27 17:18:18 +0000 UTC" firstStartedPulling="2025-11-27 17:18:19.875339014 +0000 UTC m=+2411.892779314" lastFinishedPulling="2025-11-27 17:18:20.087990192 +0000 UTC m=+2412.105430492" observedRunningTime="2025-11-27 17:18:20.944035348 +0000 UTC m=+2412.961475668" watchObservedRunningTime="2025-11-27 17:18:20.94536617 +0000 UTC m=+2412.962806470" Nov 27 17:18:24 crc kubenswrapper[4954]: I1127 17:18:24.662168 4954 scope.go:117] "RemoveContainer" containerID="b91da12f8fcb1407df50bd3be19fd43d848b6fc636f3c1096f65accf412d6bd6" Nov 27 17:18:24 crc kubenswrapper[4954]: E1127 17:18:24.663033 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:18:39 crc kubenswrapper[4954]: I1127 17:18:39.662512 4954 scope.go:117] "RemoveContainer" containerID="b91da12f8fcb1407df50bd3be19fd43d848b6fc636f3c1096f65accf412d6bd6" Nov 27 17:18:39 crc kubenswrapper[4954]: E1127 17:18:39.663758 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:18:53 crc kubenswrapper[4954]: I1127 17:18:53.663818 4954 scope.go:117] "RemoveContainer" containerID="b91da12f8fcb1407df50bd3be19fd43d848b6fc636f3c1096f65accf412d6bd6" Nov 27 17:18:53 crc kubenswrapper[4954]: E1127 17:18:53.664778 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:19:05 crc kubenswrapper[4954]: I1127 17:19:05.664921 4954 scope.go:117] "RemoveContainer" containerID="b91da12f8fcb1407df50bd3be19fd43d848b6fc636f3c1096f65accf412d6bd6" Nov 27 17:19:05 crc kubenswrapper[4954]: E1127 17:19:05.665820 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:19:17 crc kubenswrapper[4954]: I1127 17:19:17.662440 4954 scope.go:117] "RemoveContainer" containerID="b91da12f8fcb1407df50bd3be19fd43d848b6fc636f3c1096f65accf412d6bd6" Nov 27 17:19:17 crc kubenswrapper[4954]: E1127 17:19:17.663387 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:19:30 crc kubenswrapper[4954]: I1127 17:19:30.662730 4954 scope.go:117] "RemoveContainer" containerID="b91da12f8fcb1407df50bd3be19fd43d848b6fc636f3c1096f65accf412d6bd6" Nov 27 17:19:30 crc kubenswrapper[4954]: E1127 17:19:30.663619 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:19:44 crc kubenswrapper[4954]: I1127 17:19:44.662540 4954 scope.go:117] "RemoveContainer" containerID="b91da12f8fcb1407df50bd3be19fd43d848b6fc636f3c1096f65accf412d6bd6" Nov 27 17:19:44 crc kubenswrapper[4954]: E1127 17:19:44.663327 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:19:56 crc kubenswrapper[4954]: I1127 17:19:56.662186 4954 
scope.go:117] "RemoveContainer" containerID="b91da12f8fcb1407df50bd3be19fd43d848b6fc636f3c1096f65accf412d6bd6" Nov 27 17:19:56 crc kubenswrapper[4954]: E1127 17:19:56.662920 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:20:07 crc kubenswrapper[4954]: I1127 17:20:07.662107 4954 scope.go:117] "RemoveContainer" containerID="b91da12f8fcb1407df50bd3be19fd43d848b6fc636f3c1096f65accf412d6bd6" Nov 27 17:20:07 crc kubenswrapper[4954]: E1127 17:20:07.662814 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:20:22 crc kubenswrapper[4954]: I1127 17:20:22.661806 4954 scope.go:117] "RemoveContainer" containerID="b91da12f8fcb1407df50bd3be19fd43d848b6fc636f3c1096f65accf412d6bd6" Nov 27 17:20:22 crc kubenswrapper[4954]: E1127 17:20:22.662741 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:20:37 crc kubenswrapper[4954]: I1127 17:20:37.662430 4954 scope.go:117] "RemoveContainer" containerID="b91da12f8fcb1407df50bd3be19fd43d848b6fc636f3c1096f65accf412d6bd6" Nov 27 17:20:37 crc kubenswrapper[4954]: E1127 17:20:37.663305 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:20:48 crc kubenswrapper[4954]: I1127 17:20:48.671934 4954 scope.go:117] "RemoveContainer" containerID="b91da12f8fcb1407df50bd3be19fd43d848b6fc636f3c1096f65accf412d6bd6" Nov 27 17:20:48 crc kubenswrapper[4954]: E1127 17:20:48.673509 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:21:03 crc kubenswrapper[4954]: I1127 17:21:03.662339 4954 scope.go:117] "RemoveContainer" containerID="b91da12f8fcb1407df50bd3be19fd43d848b6fc636f3c1096f65accf412d6bd6" Nov 27 17:21:03 crc kubenswrapper[4954]: E1127 17:21:03.663246 4954 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:21:18 crc kubenswrapper[4954]: I1127 17:21:18.668761 4954 scope.go:117] "RemoveContainer" containerID="b91da12f8fcb1407df50bd3be19fd43d848b6fc636f3c1096f65accf412d6bd6" Nov 27 17:21:18 crc kubenswrapper[4954]: E1127 17:21:18.669818 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:21:32 crc kubenswrapper[4954]: I1127 17:21:32.662708 4954 scope.go:117] "RemoveContainer" containerID="b91da12f8fcb1407df50bd3be19fd43d848b6fc636f3c1096f65accf412d6bd6" Nov 27 17:21:32 crc kubenswrapper[4954]: E1127 17:21:32.663557 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:21:45 crc kubenswrapper[4954]: I1127 17:21:45.662097 4954 scope.go:117] "RemoveContainer" containerID="b91da12f8fcb1407df50bd3be19fd43d848b6fc636f3c1096f65accf412d6bd6" Nov 27 17:21:45 crc kubenswrapper[4954]: E1127 17:21:45.662891 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:22:00 crc kubenswrapper[4954]: I1127 17:22:00.662460 4954 scope.go:117] "RemoveContainer" containerID="b91da12f8fcb1407df50bd3be19fd43d848b6fc636f3c1096f65accf412d6bd6" Nov 27 17:22:00 crc kubenswrapper[4954]: E1127 17:22:00.663155 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:22:12 crc kubenswrapper[4954]: I1127 17:22:12.662315 4954 scope.go:117] "RemoveContainer" containerID="b91da12f8fcb1407df50bd3be19fd43d848b6fc636f3c1096f65accf412d6bd6" Nov 27 17:22:12 crc kubenswrapper[4954]: E1127 17:22:12.663040 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
Nov 27 17:22:27 crc kubenswrapper[4954]: I1127 17:22:27.663028 4954 scope.go:117] "RemoveContainer" containerID="b91da12f8fcb1407df50bd3be19fd43d848b6fc636f3c1096f65accf412d6bd6"
Nov 27 17:22:27 crc kubenswrapper[4954]: E1127 17:22:27.664022 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd"
Nov 27 17:22:43 crc kubenswrapper[4954]: I1127 17:22:43.662513 4954 scope.go:117] "RemoveContainer" containerID="b91da12f8fcb1407df50bd3be19fd43d848b6fc636f3c1096f65accf412d6bd6"
Nov 27 17:22:43 crc kubenswrapper[4954]: E1127 17:22:43.663697 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd"
Nov 27 17:22:46 crc kubenswrapper[4954]: I1127 17:22:46.466505 4954 generic.go:334] "Generic (PLEG): container finished" podID="6d34dbe8-0864-4b92-bd50-5bdd57209a74" containerID="afd3b879edc5f302762f00b75e8358e913edb462d405e199b9032ed0eab17a0d" exitCode=0
Nov 27 17:22:46 crc kubenswrapper[4954]: I1127 17:22:46.466607 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-flwcw" event={"ID":"6d34dbe8-0864-4b92-bd50-5bdd57209a74","Type":"ContainerDied","Data":"afd3b879edc5f302762f00b75e8358e913edb462d405e199b9032ed0eab17a0d"}
Nov 27 17:22:47 crc kubenswrapper[4954]: I1127 17:22:47.857366 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-flwcw"
Nov 27 17:22:47 crc kubenswrapper[4954]: I1127 17:22:47.972960 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6d34dbe8-0864-4b92-bd50-5bdd57209a74-ssh-key\") pod \"6d34dbe8-0864-4b92-bd50-5bdd57209a74\" (UID: \"6d34dbe8-0864-4b92-bd50-5bdd57209a74\") "
Nov 27 17:22:47 crc kubenswrapper[4954]: I1127 17:22:47.973035 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d34dbe8-0864-4b92-bd50-5bdd57209a74-inventory\") pod \"6d34dbe8-0864-4b92-bd50-5bdd57209a74\" (UID: \"6d34dbe8-0864-4b92-bd50-5bdd57209a74\") "
Nov 27 17:22:47 crc kubenswrapper[4954]: I1127 17:22:47.973129 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d34dbe8-0864-4b92-bd50-5bdd57209a74-libvirt-combined-ca-bundle\") pod \"6d34dbe8-0864-4b92-bd50-5bdd57209a74\" (UID: \"6d34dbe8-0864-4b92-bd50-5bdd57209a74\") "
Nov 27 17:22:47 crc kubenswrapper[4954]: I1127 17:22:47.973178 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6d34dbe8-0864-4b92-bd50-5bdd57209a74-libvirt-secret-0\") pod \"6d34dbe8-0864-4b92-bd50-5bdd57209a74\" (UID: \"6d34dbe8-0864-4b92-bd50-5bdd57209a74\") "
Nov 27 17:22:47 crc kubenswrapper[4954]: I1127 17:22:47.973243 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trmmv\" (UniqueName: \"kubernetes.io/projected/6d34dbe8-0864-4b92-bd50-5bdd57209a74-kube-api-access-trmmv\") pod \"6d34dbe8-0864-4b92-bd50-5bdd57209a74\" (UID: \"6d34dbe8-0864-4b92-bd50-5bdd57209a74\") "
Nov 27 17:22:47 crc kubenswrapper[4954]: I1127 17:22:47.979318 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d34dbe8-0864-4b92-bd50-5bdd57209a74-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "6d34dbe8-0864-4b92-bd50-5bdd57209a74" (UID: "6d34dbe8-0864-4b92-bd50-5bdd57209a74"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 17:22:47 crc kubenswrapper[4954]: I1127 17:22:47.981512 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d34dbe8-0864-4b92-bd50-5bdd57209a74-kube-api-access-trmmv" (OuterVolumeSpecName: "kube-api-access-trmmv") pod "6d34dbe8-0864-4b92-bd50-5bdd57209a74" (UID: "6d34dbe8-0864-4b92-bd50-5bdd57209a74"). InnerVolumeSpecName "kube-api-access-trmmv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.002596 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d34dbe8-0864-4b92-bd50-5bdd57209a74-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6d34dbe8-0864-4b92-bd50-5bdd57209a74" (UID: "6d34dbe8-0864-4b92-bd50-5bdd57209a74"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.004079 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d34dbe8-0864-4b92-bd50-5bdd57209a74-inventory" (OuterVolumeSpecName: "inventory") pod "6d34dbe8-0864-4b92-bd50-5bdd57209a74" (UID: "6d34dbe8-0864-4b92-bd50-5bdd57209a74"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.004646 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d34dbe8-0864-4b92-bd50-5bdd57209a74-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "6d34dbe8-0864-4b92-bd50-5bdd57209a74" (UID: "6d34dbe8-0864-4b92-bd50-5bdd57209a74"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.075822 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trmmv\" (UniqueName: \"kubernetes.io/projected/6d34dbe8-0864-4b92-bd50-5bdd57209a74-kube-api-access-trmmv\") on node \"crc\" DevicePath \"\""
Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.075855 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6d34dbe8-0864-4b92-bd50-5bdd57209a74-ssh-key\") on node \"crc\" DevicePath \"\""
Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.075868 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d34dbe8-0864-4b92-bd50-5bdd57209a74-inventory\") on node \"crc\" DevicePath \"\""
Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.075878 4954 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d34dbe8-0864-4b92-bd50-5bdd57209a74-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.075887 4954 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6d34dbe8-0864-4b92-bd50-5bdd57209a74-libvirt-secret-0\") on node \"crc\" DevicePath \"\""
Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.482879 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-flwcw" event={"ID":"6d34dbe8-0864-4b92-bd50-5bdd57209a74","Type":"ContainerDied","Data":"6875f5ab6e5f084c55ef626fbe6b4f376c57f459459399c4503ed61a6b2fbf2f"}
Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.483172 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6875f5ab6e5f084c55ef626fbe6b4f376c57f459459399c4503ed61a6b2fbf2f"
Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.482942 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-flwcw"
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-flwcw" Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.584012 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-hq64s"] Nov 27 17:22:48 crc kubenswrapper[4954]: E1127 17:22:48.584484 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d34dbe8-0864-4b92-bd50-5bdd57209a74" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.584514 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d34dbe8-0864-4b92-bd50-5bdd57209a74" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.584792 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d34dbe8-0864-4b92-bd50-5bdd57209a74" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.585605 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hq64s" Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.591277 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.591279 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.591703 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lnfbp" Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.591963 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.603127 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.603402 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.603653 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.621973 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-hq64s"] Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.685613 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ab77d00-245a-41d2-a223-1caff56f23da-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hq64s\" (UID: \"7ab77d00-245a-41d2-a223-1caff56f23da\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hq64s" Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.685672 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7ab77d00-245a-41d2-a223-1caff56f23da-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hq64s\" (UID: \"7ab77d00-245a-41d2-a223-1caff56f23da\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hq64s" Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.685709 4954 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7ab77d00-245a-41d2-a223-1caff56f23da-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hq64s\" (UID: \"7ab77d00-245a-41d2-a223-1caff56f23da\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hq64s" Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.685887 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7ab77d00-245a-41d2-a223-1caff56f23da-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hq64s\" (UID: \"7ab77d00-245a-41d2-a223-1caff56f23da\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hq64s" Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.685925 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxlcn\" (UniqueName: \"kubernetes.io/projected/7ab77d00-245a-41d2-a223-1caff56f23da-kube-api-access-qxlcn\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hq64s\" (UID: \"7ab77d00-245a-41d2-a223-1caff56f23da\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hq64s" Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.685988 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7ab77d00-245a-41d2-a223-1caff56f23da-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hq64s\" (UID: \"7ab77d00-245a-41d2-a223-1caff56f23da\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hq64s" Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.686038 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7ab77d00-245a-41d2-a223-1caff56f23da-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hq64s\" (UID: \"7ab77d00-245a-41d2-a223-1caff56f23da\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hq64s" Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.686116 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab77d00-245a-41d2-a223-1caff56f23da-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hq64s\" (UID: \"7ab77d00-245a-41d2-a223-1caff56f23da\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hq64s" Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.686181 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7ab77d00-245a-41d2-a223-1caff56f23da-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hq64s\" (UID: \"7ab77d00-245a-41d2-a223-1caff56f23da\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hq64s" Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.788078 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7ab77d00-245a-41d2-a223-1caff56f23da-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hq64s\" (UID: \"7ab77d00-245a-41d2-a223-1caff56f23da\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hq64s" Nov 27 
Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.788768 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxlcn\" (UniqueName: \"kubernetes.io/projected/7ab77d00-245a-41d2-a223-1caff56f23da-kube-api-access-qxlcn\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hq64s\" (UID: \"7ab77d00-245a-41d2-a223-1caff56f23da\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hq64s"
Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.788809 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7ab77d00-245a-41d2-a223-1caff56f23da-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hq64s\" (UID: \"7ab77d00-245a-41d2-a223-1caff56f23da\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hq64s"
Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.788846 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7ab77d00-245a-41d2-a223-1caff56f23da-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hq64s\" (UID: \"7ab77d00-245a-41d2-a223-1caff56f23da\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hq64s"
Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.788903 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab77d00-245a-41d2-a223-1caff56f23da-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hq64s\" (UID: \"7ab77d00-245a-41d2-a223-1caff56f23da\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hq64s"
Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.788977 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7ab77d00-245a-41d2-a223-1caff56f23da-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hq64s\" (UID: \"7ab77d00-245a-41d2-a223-1caff56f23da\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hq64s"
Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.789027 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ab77d00-245a-41d2-a223-1caff56f23da-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hq64s\" (UID: \"7ab77d00-245a-41d2-a223-1caff56f23da\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hq64s"
Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.789063 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7ab77d00-245a-41d2-a223-1caff56f23da-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hq64s\" (UID: \"7ab77d00-245a-41d2-a223-1caff56f23da\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hq64s"
Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.789595 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7ab77d00-245a-41d2-a223-1caff56f23da-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hq64s\" (UID: \"7ab77d00-245a-41d2-a223-1caff56f23da\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hq64s"
Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.789991 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7ab77d00-245a-41d2-a223-1caff56f23da-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hq64s\" (UID: \"7ab77d00-245a-41d2-a223-1caff56f23da\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hq64s"
Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.793805 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ab77d00-245a-41d2-a223-1caff56f23da-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hq64s\" (UID: \"7ab77d00-245a-41d2-a223-1caff56f23da\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hq64s"
Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.793876 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab77d00-245a-41d2-a223-1caff56f23da-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hq64s\" (UID: \"7ab77d00-245a-41d2-a223-1caff56f23da\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hq64s"
Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.793888 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7ab77d00-245a-41d2-a223-1caff56f23da-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hq64s\" (UID: \"7ab77d00-245a-41d2-a223-1caff56f23da\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hq64s"
Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.794066 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7ab77d00-245a-41d2-a223-1caff56f23da-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hq64s\" (UID: \"7ab77d00-245a-41d2-a223-1caff56f23da\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hq64s"
Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.794372 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7ab77d00-245a-41d2-a223-1caff56f23da-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hq64s\" (UID: \"7ab77d00-245a-41d2-a223-1caff56f23da\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hq64s"
Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.795131 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7ab77d00-245a-41d2-a223-1caff56f23da-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hq64s\" (UID: \"7ab77d00-245a-41d2-a223-1caff56f23da\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hq64s"
Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.795592 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7ab77d00-245a-41d2-a223-1caff56f23da-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hq64s\" (UID: \"7ab77d00-245a-41d2-a223-1caff56f23da\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hq64s"
Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.806886 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxlcn\" (UniqueName: \"kubernetes.io/projected/7ab77d00-245a-41d2-a223-1caff56f23da-kube-api-access-qxlcn\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hq64s\" (UID: \"7ab77d00-245a-41d2-a223-1caff56f23da\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hq64s"
\"kubernetes.io/projected/7ab77d00-245a-41d2-a223-1caff56f23da-kube-api-access-qxlcn\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hq64s\" (UID: \"7ab77d00-245a-41d2-a223-1caff56f23da\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hq64s" Nov 27 17:22:48 crc kubenswrapper[4954]: I1127 17:22:48.905346 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hq64s" Nov 27 17:22:49 crc kubenswrapper[4954]: I1127 17:22:49.402292 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-hq64s"] Nov 27 17:22:49 crc kubenswrapper[4954]: I1127 17:22:49.412065 4954 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 27 17:22:49 crc kubenswrapper[4954]: I1127 17:22:49.491611 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hq64s" event={"ID":"7ab77d00-245a-41d2-a223-1caff56f23da","Type":"ContainerStarted","Data":"a7e8deb5a2ba0aed893cc4df75c49407961641fac0d6514169c8beeb95c9b968"} Nov 27 17:22:50 crc kubenswrapper[4954]: I1127 17:22:50.501913 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hq64s" event={"ID":"7ab77d00-245a-41d2-a223-1caff56f23da","Type":"ContainerStarted","Data":"77b57be63e63964263e95a6e98ccf40aabdcf3299f85e5ffb88b58796657436f"} Nov 27 17:22:50 crc kubenswrapper[4954]: I1127 17:22:50.519102 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hq64s" podStartSLOduration=2.287636195 podStartE2EDuration="2.519079207s" podCreationTimestamp="2025-11-27 17:22:48 +0000 UTC" firstStartedPulling="2025-11-27 17:22:49.411824021 +0000 UTC m=+2681.429264321" lastFinishedPulling="2025-11-27 17:22:49.643267033 +0000 UTC m=+2681.660707333" observedRunningTime="2025-11-27 17:22:50.516079336 +0000 UTC m=+2682.533519636" watchObservedRunningTime="2025-11-27 17:22:50.519079207 +0000 UTC m=+2682.536519507" Nov 27 17:22:57 crc kubenswrapper[4954]: I1127 17:22:57.662347 4954 scope.go:117] "RemoveContainer" containerID="b91da12f8fcb1407df50bd3be19fd43d848b6fc636f3c1096f65accf412d6bd6" Nov 27 17:22:58 crc kubenswrapper[4954]: I1127 17:22:58.578529 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" event={"ID":"33a80574-7c60-4f19-985b-3ee313cb7bcd","Type":"ContainerStarted","Data":"19635e76ffe7804bff520f008326431eec2d15f97844ea4cb7e79bccab8f66ca"} Nov 27 17:23:45 crc kubenswrapper[4954]: I1127 17:23:45.108553 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qspdp"] Nov 27 17:23:45 crc kubenswrapper[4954]: I1127 17:23:45.111598 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qspdp" Nov 27 17:23:45 crc kubenswrapper[4954]: I1127 17:23:45.137413 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qspdp"] Nov 27 17:23:45 crc kubenswrapper[4954]: I1127 17:23:45.205724 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a2722f-1042-4fd8-87c2-8d5a3197503c-catalog-content\") pod \"redhat-operators-qspdp\" (UID: \"57a2722f-1042-4fd8-87c2-8d5a3197503c\") " pod="openshift-marketplace/redhat-operators-qspdp" Nov 27 17:23:45 crc kubenswrapper[4954]: I1127 17:23:45.205798 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llh6g\" (UniqueName: \"kubernetes.io/projected/57a2722f-1042-4fd8-87c2-8d5a3197503c-kube-api-access-llh6g\") pod \"redhat-operators-qspdp\" (UID: \"57a2722f-1042-4fd8-87c2-8d5a3197503c\") " pod="openshift-marketplace/redhat-operators-qspdp" Nov 27 17:23:45 crc kubenswrapper[4954]: I1127 17:23:45.205872 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a2722f-1042-4fd8-87c2-8d5a3197503c-utilities\") pod \"redhat-operators-qspdp\" (UID: \"57a2722f-1042-4fd8-87c2-8d5a3197503c\") " pod="openshift-marketplace/redhat-operators-qspdp" Nov 27 17:23:45 crc kubenswrapper[4954]: I1127 17:23:45.308049 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a2722f-1042-4fd8-87c2-8d5a3197503c-catalog-content\") pod \"redhat-operators-qspdp\" (UID: \"57a2722f-1042-4fd8-87c2-8d5a3197503c\") " pod="openshift-marketplace/redhat-operators-qspdp" Nov 27 17:23:45 crc kubenswrapper[4954]: I1127 17:23:45.308119 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llh6g\" (UniqueName: \"kubernetes.io/projected/57a2722f-1042-4fd8-87c2-8d5a3197503c-kube-api-access-llh6g\") pod \"redhat-operators-qspdp\" (UID: \"57a2722f-1042-4fd8-87c2-8d5a3197503c\") " pod="openshift-marketplace/redhat-operators-qspdp" Nov 27 17:23:45 crc kubenswrapper[4954]: I1127 17:23:45.308176 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a2722f-1042-4fd8-87c2-8d5a3197503c-utilities\") pod \"redhat-operators-qspdp\" (UID: \"57a2722f-1042-4fd8-87c2-8d5a3197503c\") " pod="openshift-marketplace/redhat-operators-qspdp" Nov 27 17:23:45 crc kubenswrapper[4954]: I1127 17:23:45.308800 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a2722f-1042-4fd8-87c2-8d5a3197503c-utilities\") pod \"redhat-operators-qspdp\" (UID: \"57a2722f-1042-4fd8-87c2-8d5a3197503c\") " pod="openshift-marketplace/redhat-operators-qspdp" Nov 27 17:23:45 crc kubenswrapper[4954]: I1127 17:23:45.308872 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a2722f-1042-4fd8-87c2-8d5a3197503c-catalog-content\") pod \"redhat-operators-qspdp\" (UID: \"57a2722f-1042-4fd8-87c2-8d5a3197503c\") " pod="openshift-marketplace/redhat-operators-qspdp" Nov 27 17:23:45 crc kubenswrapper[4954]: I1127 17:23:45.331173 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-llh6g\" (UniqueName: \"kubernetes.io/projected/57a2722f-1042-4fd8-87c2-8d5a3197503c-kube-api-access-llh6g\") pod \"redhat-operators-qspdp\" (UID: \"57a2722f-1042-4fd8-87c2-8d5a3197503c\") " pod="openshift-marketplace/redhat-operators-qspdp" Nov 27 17:23:45 crc kubenswrapper[4954]: I1127 17:23:45.454073 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qspdp" Nov 27 17:23:45 crc kubenswrapper[4954]: I1127 17:23:45.977977 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qspdp"] Nov 27 17:23:46 crc kubenswrapper[4954]: I1127 17:23:46.532909 4954 generic.go:334] "Generic (PLEG): container finished" podID="57a2722f-1042-4fd8-87c2-8d5a3197503c" containerID="59549c8213591c37be8e3f09489a05ec929db57dce959deb32de7b5b20ddcfb6" exitCode=0 Nov 27 17:23:46 crc kubenswrapper[4954]: I1127 17:23:46.532979 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qspdp" event={"ID":"57a2722f-1042-4fd8-87c2-8d5a3197503c","Type":"ContainerDied","Data":"59549c8213591c37be8e3f09489a05ec929db57dce959deb32de7b5b20ddcfb6"} Nov 27 17:23:46 crc kubenswrapper[4954]: I1127 17:23:46.533023 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qspdp" event={"ID":"57a2722f-1042-4fd8-87c2-8d5a3197503c","Type":"ContainerStarted","Data":"575519effee7bf389dbaca87c1cd144323382fd59c147e1b0915863b2b675005"} Nov 27 17:23:49 crc kubenswrapper[4954]: I1127 17:23:49.564400 4954 generic.go:334] "Generic (PLEG): container finished" podID="57a2722f-1042-4fd8-87c2-8d5a3197503c" containerID="6dd7aa50ff4f7d57847da89717a89059381f39ca5b1db968022b50f7ebbb02f9" exitCode=0 Nov 27 17:23:49 crc kubenswrapper[4954]: I1127 17:23:49.564487 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qspdp" event={"ID":"57a2722f-1042-4fd8-87c2-8d5a3197503c","Type":"ContainerDied","Data":"6dd7aa50ff4f7d57847da89717a89059381f39ca5b1db968022b50f7ebbb02f9"} Nov 27 17:23:51 crc kubenswrapper[4954]: I1127 17:23:51.595227 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qspdp" event={"ID":"57a2722f-1042-4fd8-87c2-8d5a3197503c","Type":"ContainerStarted","Data":"2c53c19b341521803a0e24703178573d2e173a176b6435bd8c625ef8e37df35c"} Nov 27 17:23:51 crc kubenswrapper[4954]: I1127 17:23:51.630422 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qspdp" podStartSLOduration=2.789747762 podStartE2EDuration="6.630370594s" podCreationTimestamp="2025-11-27 17:23:45 +0000 UTC" firstStartedPulling="2025-11-27 17:23:46.534936722 +0000 UTC m=+2738.552377012" lastFinishedPulling="2025-11-27 17:23:50.375559544 +0000 UTC m=+2742.392999844" observedRunningTime="2025-11-27 17:23:51.615351896 +0000 UTC m=+2743.632792206" watchObservedRunningTime="2025-11-27 17:23:51.630370594 +0000 UTC m=+2743.647810894" Nov 27 17:23:55 crc kubenswrapper[4954]: I1127 17:23:55.455018 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qspdp" Nov 27 17:23:55 crc kubenswrapper[4954]: I1127 17:23:55.455616 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qspdp" Nov 27 17:23:56 crc kubenswrapper[4954]: I1127 17:23:56.511333 4954 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-qspdp" podUID="57a2722f-1042-4fd8-87c2-8d5a3197503c" containerName="registry-server" probeResult="failure" output=< Nov 27 17:23:56 crc kubenswrapper[4954]: timeout: failed to connect service ":50051" within 1s Nov 27 17:23:56 crc kubenswrapper[4954]: > Nov 27 17:24:05 crc kubenswrapper[4954]: I1127 17:24:05.519800 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qspdp" Nov 27 17:24:05 crc kubenswrapper[4954]: I1127 17:24:05.570833 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qspdp" Nov 27 17:24:05 crc kubenswrapper[4954]: I1127 17:24:05.758288 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qspdp"] Nov 27 17:24:06 crc kubenswrapper[4954]: I1127 17:24:06.723663 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qspdp" podUID="57a2722f-1042-4fd8-87c2-8d5a3197503c" containerName="registry-server" containerID="cri-o://2c53c19b341521803a0e24703178573d2e173a176b6435bd8c625ef8e37df35c" gracePeriod=2 Nov 27 17:24:07 crc kubenswrapper[4954]: I1127 17:24:07.735482 4954 generic.go:334] "Generic (PLEG): container finished" podID="57a2722f-1042-4fd8-87c2-8d5a3197503c" containerID="2c53c19b341521803a0e24703178573d2e173a176b6435bd8c625ef8e37df35c" exitCode=0 Nov 27 17:24:07 crc kubenswrapper[4954]: I1127 17:24:07.735698 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qspdp" event={"ID":"57a2722f-1042-4fd8-87c2-8d5a3197503c","Type":"ContainerDied","Data":"2c53c19b341521803a0e24703178573d2e173a176b6435bd8c625ef8e37df35c"} Nov 27 17:24:07 crc kubenswrapper[4954]: I1127 17:24:07.863662 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qspdp" Nov 27 17:24:08 crc kubenswrapper[4954]: I1127 17:24:08.012321 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a2722f-1042-4fd8-87c2-8d5a3197503c-catalog-content\") pod \"57a2722f-1042-4fd8-87c2-8d5a3197503c\" (UID: \"57a2722f-1042-4fd8-87c2-8d5a3197503c\") " Nov 27 17:24:08 crc kubenswrapper[4954]: I1127 17:24:08.012509 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a2722f-1042-4fd8-87c2-8d5a3197503c-utilities\") pod \"57a2722f-1042-4fd8-87c2-8d5a3197503c\" (UID: \"57a2722f-1042-4fd8-87c2-8d5a3197503c\") " Nov 27 17:24:08 crc kubenswrapper[4954]: I1127 17:24:08.012860 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llh6g\" (UniqueName: \"kubernetes.io/projected/57a2722f-1042-4fd8-87c2-8d5a3197503c-kube-api-access-llh6g\") pod \"57a2722f-1042-4fd8-87c2-8d5a3197503c\" (UID: \"57a2722f-1042-4fd8-87c2-8d5a3197503c\") " Nov 27 17:24:08 crc kubenswrapper[4954]: I1127 17:24:08.013321 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a2722f-1042-4fd8-87c2-8d5a3197503c-utilities" (OuterVolumeSpecName: "utilities") pod "57a2722f-1042-4fd8-87c2-8d5a3197503c" (UID: "57a2722f-1042-4fd8-87c2-8d5a3197503c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:24:08 crc kubenswrapper[4954]: I1127 17:24:08.019281 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a2722f-1042-4fd8-87c2-8d5a3197503c-kube-api-access-llh6g" (OuterVolumeSpecName: "kube-api-access-llh6g") pod "57a2722f-1042-4fd8-87c2-8d5a3197503c" (UID: "57a2722f-1042-4fd8-87c2-8d5a3197503c"). InnerVolumeSpecName "kube-api-access-llh6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:24:08 crc kubenswrapper[4954]: I1127 17:24:08.115427 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llh6g\" (UniqueName: \"kubernetes.io/projected/57a2722f-1042-4fd8-87c2-8d5a3197503c-kube-api-access-llh6g\") on node \"crc\" DevicePath \"\"" Nov 27 17:24:08 crc kubenswrapper[4954]: I1127 17:24:08.115460 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a2722f-1042-4fd8-87c2-8d5a3197503c-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 17:24:08 crc kubenswrapper[4954]: I1127 17:24:08.116797 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a2722f-1042-4fd8-87c2-8d5a3197503c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a2722f-1042-4fd8-87c2-8d5a3197503c" (UID: "57a2722f-1042-4fd8-87c2-8d5a3197503c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:24:08 crc kubenswrapper[4954]: I1127 17:24:08.216787 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a2722f-1042-4fd8-87c2-8d5a3197503c-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 17:24:08 crc kubenswrapper[4954]: I1127 17:24:08.746657 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qspdp" event={"ID":"57a2722f-1042-4fd8-87c2-8d5a3197503c","Type":"ContainerDied","Data":"575519effee7bf389dbaca87c1cd144323382fd59c147e1b0915863b2b675005"} Nov 27 17:24:08 crc kubenswrapper[4954]: I1127 17:24:08.746732 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qspdp" Nov 27 17:24:08 crc kubenswrapper[4954]: I1127 17:24:08.746943 4954 scope.go:117] "RemoveContainer" containerID="2c53c19b341521803a0e24703178573d2e173a176b6435bd8c625ef8e37df35c" Nov 27 17:24:08 crc kubenswrapper[4954]: I1127 17:24:08.769035 4954 scope.go:117] "RemoveContainer" containerID="6dd7aa50ff4f7d57847da89717a89059381f39ca5b1db968022b50f7ebbb02f9" Nov 27 17:24:08 crc kubenswrapper[4954]: I1127 17:24:08.770933 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qspdp"] Nov 27 17:24:08 crc kubenswrapper[4954]: I1127 17:24:08.779177 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qspdp"] Nov 27 17:24:08 crc kubenswrapper[4954]: I1127 17:24:08.791904 4954 scope.go:117] "RemoveContainer" containerID="59549c8213591c37be8e3f09489a05ec929db57dce959deb32de7b5b20ddcfb6" Nov 27 17:24:10 crc kubenswrapper[4954]: I1127 17:24:10.673505 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a2722f-1042-4fd8-87c2-8d5a3197503c" path="/var/lib/kubelet/pods/57a2722f-1042-4fd8-87c2-8d5a3197503c/volumes" Nov 27 17:24:18 crc kubenswrapper[4954]: E1127 17:24:18.648192 4954 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57a2722f_1042_4fd8_87c2_8d5a3197503c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57a2722f_1042_4fd8_87c2_8d5a3197503c.slice/crio-575519effee7bf389dbaca87c1cd144323382fd59c147e1b0915863b2b675005\": RecentStats: unable to find data in memory cache]" Nov 27 17:24:28 crc kubenswrapper[4954]: E1127 17:24:28.899424 4954 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57a2722f_1042_4fd8_87c2_8d5a3197503c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57a2722f_1042_4fd8_87c2_8d5a3197503c.slice/crio-575519effee7bf389dbaca87c1cd144323382fd59c147e1b0915863b2b675005\": RecentStats: unable to find data in memory cache]" Nov 27 17:24:39 crc kubenswrapper[4954]: E1127 17:24:39.152617 4954 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57a2722f_1042_4fd8_87c2_8d5a3197503c.slice/crio-575519effee7bf389dbaca87c1cd144323382fd59c147e1b0915863b2b675005\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57a2722f_1042_4fd8_87c2_8d5a3197503c.slice\": RecentStats: unable to find data in memory cache]" Nov 27 17:24:49 crc kubenswrapper[4954]: E1127 17:24:49.399103 4954 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57a2722f_1042_4fd8_87c2_8d5a3197503c.slice/crio-575519effee7bf389dbaca87c1cd144323382fd59c147e1b0915863b2b675005\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57a2722f_1042_4fd8_87c2_8d5a3197503c.slice\": RecentStats: unable to find data in memory cache]" Nov 27 17:24:59 crc kubenswrapper[4954]: E1127 17:24:59.662311 4954 
cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57a2722f_1042_4fd8_87c2_8d5a3197503c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57a2722f_1042_4fd8_87c2_8d5a3197503c.slice/crio-575519effee7bf389dbaca87c1cd144323382fd59c147e1b0915863b2b675005\": RecentStats: unable to find data in memory cache]" Nov 27 17:25:08 crc kubenswrapper[4954]: E1127 17:25:08.704938 4954 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/dd52146f0897f02c373ddf62ef8f7ddee4f35012bc4b9a061c2b512252726618/diff" to get inode usage: stat /var/lib/containers/storage/overlay/dd52146f0897f02c373ddf62ef8f7ddee4f35012bc4b9a061c2b512252726618/diff: no such file or directory, extraDiskErr: Nov 27 17:25:23 crc kubenswrapper[4954]: I1127 17:25:23.687438 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 17:25:23 crc kubenswrapper[4954]: I1127 17:25:23.688081 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 17:25:41 crc kubenswrapper[4954]: I1127 17:25:41.360062 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p2mkc"] Nov 27 17:25:41 crc kubenswrapper[4954]: E1127 17:25:41.361025 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57a2722f-1042-4fd8-87c2-8d5a3197503c" containerName="registry-server" Nov 27 17:25:41 crc kubenswrapper[4954]: I1127 17:25:41.361042 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="57a2722f-1042-4fd8-87c2-8d5a3197503c" containerName="registry-server" Nov 27 17:25:41 crc kubenswrapper[4954]: E1127 17:25:41.361064 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57a2722f-1042-4fd8-87c2-8d5a3197503c" containerName="extract-utilities" Nov 27 17:25:41 crc kubenswrapper[4954]: I1127 17:25:41.361070 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="57a2722f-1042-4fd8-87c2-8d5a3197503c" containerName="extract-utilities" Nov 27 17:25:41 crc kubenswrapper[4954]: E1127 17:25:41.361101 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57a2722f-1042-4fd8-87c2-8d5a3197503c" containerName="extract-content" Nov 27 17:25:41 crc kubenswrapper[4954]: I1127 17:25:41.361107 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="57a2722f-1042-4fd8-87c2-8d5a3197503c" containerName="extract-content" Nov 27 17:25:41 crc kubenswrapper[4954]: I1127 17:25:41.361317 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="57a2722f-1042-4fd8-87c2-8d5a3197503c" containerName="registry-server" Nov 27 17:25:41 crc kubenswrapper[4954]: I1127 17:25:41.362830 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p2mkc" Nov 27 17:25:41 crc kubenswrapper[4954]: I1127 17:25:41.389277 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p2mkc"] Nov 27 17:25:41 crc kubenswrapper[4954]: I1127 17:25:41.522332 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28cab63c-b97e-4ac9-878a-43f50b88e5cb-catalog-content\") pod \"community-operators-p2mkc\" (UID: \"28cab63c-b97e-4ac9-878a-43f50b88e5cb\") " pod="openshift-marketplace/community-operators-p2mkc" Nov 27 17:25:41 crc kubenswrapper[4954]: I1127 17:25:41.522419 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwzl8\" (UniqueName: \"kubernetes.io/projected/28cab63c-b97e-4ac9-878a-43f50b88e5cb-kube-api-access-lwzl8\") pod \"community-operators-p2mkc\" (UID: \"28cab63c-b97e-4ac9-878a-43f50b88e5cb\") " pod="openshift-marketplace/community-operators-p2mkc" Nov 27 17:25:41 crc kubenswrapper[4954]: I1127 17:25:41.522474 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28cab63c-b97e-4ac9-878a-43f50b88e5cb-utilities\") pod \"community-operators-p2mkc\" (UID: \"28cab63c-b97e-4ac9-878a-43f50b88e5cb\") " pod="openshift-marketplace/community-operators-p2mkc" Nov 27 17:25:41 crc kubenswrapper[4954]: I1127 17:25:41.565675 4954 generic.go:334] "Generic (PLEG): container finished" podID="7ab77d00-245a-41d2-a223-1caff56f23da" containerID="77b57be63e63964263e95a6e98ccf40aabdcf3299f85e5ffb88b58796657436f" exitCode=0 Nov 27 17:25:41 crc kubenswrapper[4954]: I1127 17:25:41.565743 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hq64s" event={"ID":"7ab77d00-245a-41d2-a223-1caff56f23da","Type":"ContainerDied","Data":"77b57be63e63964263e95a6e98ccf40aabdcf3299f85e5ffb88b58796657436f"} Nov 27 17:25:41 crc kubenswrapper[4954]: I1127 17:25:41.624832 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28cab63c-b97e-4ac9-878a-43f50b88e5cb-catalog-content\") pod \"community-operators-p2mkc\" (UID: \"28cab63c-b97e-4ac9-878a-43f50b88e5cb\") " pod="openshift-marketplace/community-operators-p2mkc" Nov 27 17:25:41 crc kubenswrapper[4954]: I1127 17:25:41.624917 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwzl8\" (UniqueName: \"kubernetes.io/projected/28cab63c-b97e-4ac9-878a-43f50b88e5cb-kube-api-access-lwzl8\") pod \"community-operators-p2mkc\" (UID: \"28cab63c-b97e-4ac9-878a-43f50b88e5cb\") " pod="openshift-marketplace/community-operators-p2mkc" Nov 27 17:25:41 crc kubenswrapper[4954]: I1127 17:25:41.624978 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28cab63c-b97e-4ac9-878a-43f50b88e5cb-utilities\") pod \"community-operators-p2mkc\" (UID: \"28cab63c-b97e-4ac9-878a-43f50b88e5cb\") " pod="openshift-marketplace/community-operators-p2mkc" Nov 27 17:25:41 crc kubenswrapper[4954]: I1127 17:25:41.625331 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28cab63c-b97e-4ac9-878a-43f50b88e5cb-catalog-content\") pod 
\"community-operators-p2mkc\" (UID: \"28cab63c-b97e-4ac9-878a-43f50b88e5cb\") " pod="openshift-marketplace/community-operators-p2mkc" Nov 27 17:25:41 crc kubenswrapper[4954]: I1127 17:25:41.625427 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28cab63c-b97e-4ac9-878a-43f50b88e5cb-utilities\") pod \"community-operators-p2mkc\" (UID: \"28cab63c-b97e-4ac9-878a-43f50b88e5cb\") " pod="openshift-marketplace/community-operators-p2mkc" Nov 27 17:25:41 crc kubenswrapper[4954]: I1127 17:25:41.659266 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwzl8\" (UniqueName: \"kubernetes.io/projected/28cab63c-b97e-4ac9-878a-43f50b88e5cb-kube-api-access-lwzl8\") pod \"community-operators-p2mkc\" (UID: \"28cab63c-b97e-4ac9-878a-43f50b88e5cb\") " pod="openshift-marketplace/community-operators-p2mkc" Nov 27 17:25:41 crc kubenswrapper[4954]: I1127 17:25:41.680505 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p2mkc" Nov 27 17:25:42 crc kubenswrapper[4954]: I1127 17:25:42.210858 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p2mkc"] Nov 27 17:25:42 crc kubenswrapper[4954]: W1127 17:25:42.215567 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28cab63c_b97e_4ac9_878a_43f50b88e5cb.slice/crio-47daba527d28a7e429b62f70f7764125bffac27437b32377635ba2b92caef49b WatchSource:0}: Error finding container 47daba527d28a7e429b62f70f7764125bffac27437b32377635ba2b92caef49b: Status 404 returned error can't find the container with id 47daba527d28a7e429b62f70f7764125bffac27437b32377635ba2b92caef49b Nov 27 17:25:42 crc kubenswrapper[4954]: I1127 17:25:42.574978 4954 generic.go:334] "Generic (PLEG): container finished" podID="28cab63c-b97e-4ac9-878a-43f50b88e5cb" containerID="e8f5df02afe0a2dec91cdfa48dd507a1eb25dfffd750f94563d153e345cc3255" exitCode=0 Nov 27 17:25:42 crc kubenswrapper[4954]: I1127 17:25:42.575037 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p2mkc" event={"ID":"28cab63c-b97e-4ac9-878a-43f50b88e5cb","Type":"ContainerDied","Data":"e8f5df02afe0a2dec91cdfa48dd507a1eb25dfffd750f94563d153e345cc3255"} Nov 27 17:25:42 crc kubenswrapper[4954]: I1127 17:25:42.575374 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p2mkc" event={"ID":"28cab63c-b97e-4ac9-878a-43f50b88e5cb","Type":"ContainerStarted","Data":"47daba527d28a7e429b62f70f7764125bffac27437b32377635ba2b92caef49b"} Nov 27 17:25:42 crc kubenswrapper[4954]: I1127 17:25:42.985562 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hq64s" Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.169538 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7ab77d00-245a-41d2-a223-1caff56f23da-nova-cell1-compute-config-1\") pod \"7ab77d00-245a-41d2-a223-1caff56f23da\" (UID: \"7ab77d00-245a-41d2-a223-1caff56f23da\") " Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.169970 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7ab77d00-245a-41d2-a223-1caff56f23da-nova-extra-config-0\") pod \"7ab77d00-245a-41d2-a223-1caff56f23da\" (UID: \"7ab77d00-245a-41d2-a223-1caff56f23da\") " Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.169998 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ab77d00-245a-41d2-a223-1caff56f23da-inventory\") pod \"7ab77d00-245a-41d2-a223-1caff56f23da\" (UID: \"7ab77d00-245a-41d2-a223-1caff56f23da\") " Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.170026 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7ab77d00-245a-41d2-a223-1caff56f23da-nova-cell1-compute-config-0\") pod \"7ab77d00-245a-41d2-a223-1caff56f23da\" (UID: \"7ab77d00-245a-41d2-a223-1caff56f23da\") " Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.170095 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7ab77d00-245a-41d2-a223-1caff56f23da-nova-migration-ssh-key-1\") pod \"7ab77d00-245a-41d2-a223-1caff56f23da\" (UID: \"7ab77d00-245a-41d2-a223-1caff56f23da\") " Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.170111 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab77d00-245a-41d2-a223-1caff56f23da-nova-combined-ca-bundle\") pod \"7ab77d00-245a-41d2-a223-1caff56f23da\" (UID: \"7ab77d00-245a-41d2-a223-1caff56f23da\") " Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.170131 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7ab77d00-245a-41d2-a223-1caff56f23da-ssh-key\") pod \"7ab77d00-245a-41d2-a223-1caff56f23da\" (UID: \"7ab77d00-245a-41d2-a223-1caff56f23da\") " Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.170192 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7ab77d00-245a-41d2-a223-1caff56f23da-nova-migration-ssh-key-0\") pod \"7ab77d00-245a-41d2-a223-1caff56f23da\" (UID: \"7ab77d00-245a-41d2-a223-1caff56f23da\") " Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.170255 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxlcn\" (UniqueName: \"kubernetes.io/projected/7ab77d00-245a-41d2-a223-1caff56f23da-kube-api-access-qxlcn\") pod \"7ab77d00-245a-41d2-a223-1caff56f23da\" (UID: \"7ab77d00-245a-41d2-a223-1caff56f23da\") " Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.183920 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/7ab77d00-245a-41d2-a223-1caff56f23da-kube-api-access-qxlcn" (OuterVolumeSpecName: "kube-api-access-qxlcn") pod "7ab77d00-245a-41d2-a223-1caff56f23da" (UID: "7ab77d00-245a-41d2-a223-1caff56f23da"). InnerVolumeSpecName "kube-api-access-qxlcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.187987 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ab77d00-245a-41d2-a223-1caff56f23da-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "7ab77d00-245a-41d2-a223-1caff56f23da" (UID: "7ab77d00-245a-41d2-a223-1caff56f23da"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.201084 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ab77d00-245a-41d2-a223-1caff56f23da-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "7ab77d00-245a-41d2-a223-1caff56f23da" (UID: "7ab77d00-245a-41d2-a223-1caff56f23da"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.201570 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ab77d00-245a-41d2-a223-1caff56f23da-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "7ab77d00-245a-41d2-a223-1caff56f23da" (UID: "7ab77d00-245a-41d2-a223-1caff56f23da"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.204998 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ab77d00-245a-41d2-a223-1caff56f23da-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "7ab77d00-245a-41d2-a223-1caff56f23da" (UID: "7ab77d00-245a-41d2-a223-1caff56f23da"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.205952 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ab77d00-245a-41d2-a223-1caff56f23da-inventory" (OuterVolumeSpecName: "inventory") pod "7ab77d00-245a-41d2-a223-1caff56f23da" (UID: "7ab77d00-245a-41d2-a223-1caff56f23da"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.207773 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ab77d00-245a-41d2-a223-1caff56f23da-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7ab77d00-245a-41d2-a223-1caff56f23da" (UID: "7ab77d00-245a-41d2-a223-1caff56f23da"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.216449 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ab77d00-245a-41d2-a223-1caff56f23da-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "7ab77d00-245a-41d2-a223-1caff56f23da" (UID: "7ab77d00-245a-41d2-a223-1caff56f23da"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.218784 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ab77d00-245a-41d2-a223-1caff56f23da-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "7ab77d00-245a-41d2-a223-1caff56f23da" (UID: "7ab77d00-245a-41d2-a223-1caff56f23da"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.272325 4954 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7ab77d00-245a-41d2-a223-1caff56f23da-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.272615 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxlcn\" (UniqueName: \"kubernetes.io/projected/7ab77d00-245a-41d2-a223-1caff56f23da-kube-api-access-qxlcn\") on node \"crc\" DevicePath \"\"" Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.272692 4954 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7ab77d00-245a-41d2-a223-1caff56f23da-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.272760 4954 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7ab77d00-245a-41d2-a223-1caff56f23da-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.272867 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ab77d00-245a-41d2-a223-1caff56f23da-inventory\") on node \"crc\" DevicePath \"\"" Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.272933 4954 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7ab77d00-245a-41d2-a223-1caff56f23da-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.273010 4954 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7ab77d00-245a-41d2-a223-1caff56f23da-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.273073 4954 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab77d00-245a-41d2-a223-1caff56f23da-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.273136 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7ab77d00-245a-41d2-a223-1caff56f23da-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.585306 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hq64s" event={"ID":"7ab77d00-245a-41d2-a223-1caff56f23da","Type":"ContainerDied","Data":"a7e8deb5a2ba0aed893cc4df75c49407961641fac0d6514169c8beeb95c9b968"} Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.585350 4954 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="a7e8deb5a2ba0aed893cc4df75c49407961641fac0d6514169c8beeb95c9b968" Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.585404 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hq64s" Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.692362 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r476v"] Nov 27 17:25:43 crc kubenswrapper[4954]: E1127 17:25:43.693019 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ab77d00-245a-41d2-a223-1caff56f23da" containerName="nova-edpm-deployment-openstack-edpm-ipam" Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.693037 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ab77d00-245a-41d2-a223-1caff56f23da" containerName="nova-edpm-deployment-openstack-edpm-ipam" Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.693242 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ab77d00-245a-41d2-a223-1caff56f23da" containerName="nova-edpm-deployment-openstack-edpm-ipam" Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.695764 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r476v" Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.708929 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r476v"] Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.709208 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.709336 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lnfbp" Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.709523 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.709799 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.709949 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.885178 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/200fb5dd-f5ad-4f82-8a9c-e8e378075448-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-r476v\" (UID: \"200fb5dd-f5ad-4f82-8a9c-e8e378075448\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r476v" Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.885260 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/200fb5dd-f5ad-4f82-8a9c-e8e378075448-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-r476v\" (UID: \"200fb5dd-f5ad-4f82-8a9c-e8e378075448\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r476v" Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.885285 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/200fb5dd-f5ad-4f82-8a9c-e8e378075448-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-r476v\" (UID: \"200fb5dd-f5ad-4f82-8a9c-e8e378075448\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r476v" Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.885319 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/200fb5dd-f5ad-4f82-8a9c-e8e378075448-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-r476v\" (UID: \"200fb5dd-f5ad-4f82-8a9c-e8e378075448\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r476v" Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.885533 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-476jt\" (UniqueName: \"kubernetes.io/projected/200fb5dd-f5ad-4f82-8a9c-e8e378075448-kube-api-access-476jt\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-r476v\" (UID: \"200fb5dd-f5ad-4f82-8a9c-e8e378075448\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r476v" Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.885813 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/200fb5dd-f5ad-4f82-8a9c-e8e378075448-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-r476v\" (UID: \"200fb5dd-f5ad-4f82-8a9c-e8e378075448\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r476v" Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.885953 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/200fb5dd-f5ad-4f82-8a9c-e8e378075448-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-r476v\" (UID: \"200fb5dd-f5ad-4f82-8a9c-e8e378075448\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r476v" Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.988200 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/200fb5dd-f5ad-4f82-8a9c-e8e378075448-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-r476v\" (UID: \"200fb5dd-f5ad-4f82-8a9c-e8e378075448\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r476v" Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.988251 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/200fb5dd-f5ad-4f82-8a9c-e8e378075448-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-r476v\" (UID: \"200fb5dd-f5ad-4f82-8a9c-e8e378075448\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r476v" Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.988304 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/200fb5dd-f5ad-4f82-8a9c-e8e378075448-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-r476v\" (UID: \"200fb5dd-f5ad-4f82-8a9c-e8e378075448\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r476v" Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.988360 4954 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-476jt\" (UniqueName: \"kubernetes.io/projected/200fb5dd-f5ad-4f82-8a9c-e8e378075448-kube-api-access-476jt\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-r476v\" (UID: \"200fb5dd-f5ad-4f82-8a9c-e8e378075448\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r476v" Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.988426 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/200fb5dd-f5ad-4f82-8a9c-e8e378075448-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-r476v\" (UID: \"200fb5dd-f5ad-4f82-8a9c-e8e378075448\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r476v" Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.988467 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/200fb5dd-f5ad-4f82-8a9c-e8e378075448-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-r476v\" (UID: \"200fb5dd-f5ad-4f82-8a9c-e8e378075448\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r476v" Nov 27 17:25:43 crc kubenswrapper[4954]: I1127 17:25:43.988510 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/200fb5dd-f5ad-4f82-8a9c-e8e378075448-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-r476v\" (UID: \"200fb5dd-f5ad-4f82-8a9c-e8e378075448\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r476v" Nov 27 17:25:44 crc kubenswrapper[4954]: I1127 17:25:43.994417 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/200fb5dd-f5ad-4f82-8a9c-e8e378075448-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-r476v\" (UID: \"200fb5dd-f5ad-4f82-8a9c-e8e378075448\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r476v" Nov 27 17:25:44 crc kubenswrapper[4954]: I1127 17:25:43.994515 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/200fb5dd-f5ad-4f82-8a9c-e8e378075448-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-r476v\" (UID: \"200fb5dd-f5ad-4f82-8a9c-e8e378075448\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r476v" Nov 27 17:25:44 crc kubenswrapper[4954]: I1127 17:25:43.994423 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/200fb5dd-f5ad-4f82-8a9c-e8e378075448-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-r476v\" (UID: \"200fb5dd-f5ad-4f82-8a9c-e8e378075448\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r476v" Nov 27 17:25:44 crc kubenswrapper[4954]: I1127 17:25:43.995605 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/200fb5dd-f5ad-4f82-8a9c-e8e378075448-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-r476v\" (UID: \"200fb5dd-f5ad-4f82-8a9c-e8e378075448\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r476v" Nov 27 17:25:44 crc kubenswrapper[4954]: I1127 17:25:43.996792 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/200fb5dd-f5ad-4f82-8a9c-e8e378075448-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-r476v\" (UID: \"200fb5dd-f5ad-4f82-8a9c-e8e378075448\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r476v" Nov 27 17:25:44 crc kubenswrapper[4954]: I1127 17:25:43.997236 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/200fb5dd-f5ad-4f82-8a9c-e8e378075448-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-r476v\" (UID: \"200fb5dd-f5ad-4f82-8a9c-e8e378075448\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r476v" Nov 27 17:25:44 crc kubenswrapper[4954]: I1127 17:25:44.006429 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-476jt\" (UniqueName: \"kubernetes.io/projected/200fb5dd-f5ad-4f82-8a9c-e8e378075448-kube-api-access-476jt\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-r476v\" (UID: \"200fb5dd-f5ad-4f82-8a9c-e8e378075448\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r476v" Nov 27 17:25:44 crc kubenswrapper[4954]: I1127 17:25:44.020637 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r476v" Nov 27 17:25:44 crc kubenswrapper[4954]: I1127 17:25:44.599080 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r476v"] Nov 27 17:25:44 crc kubenswrapper[4954]: I1127 17:25:44.600819 4954 generic.go:334] "Generic (PLEG): container finished" podID="28cab63c-b97e-4ac9-878a-43f50b88e5cb" containerID="0bc75203984073ee4ab8bfc2533c5535fc63da1d12350fa2531ee86b0efb737c" exitCode=0 Nov 27 17:25:44 crc kubenswrapper[4954]: I1127 17:25:44.600871 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p2mkc" event={"ID":"28cab63c-b97e-4ac9-878a-43f50b88e5cb","Type":"ContainerDied","Data":"0bc75203984073ee4ab8bfc2533c5535fc63da1d12350fa2531ee86b0efb737c"} Nov 27 17:25:45 crc kubenswrapper[4954]: I1127 17:25:45.611818 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r476v" event={"ID":"200fb5dd-f5ad-4f82-8a9c-e8e378075448","Type":"ContainerStarted","Data":"31f1885ec827cdfa4f1b54cebbae4d4ddeb4cf8117f97db54386e47e3be1478f"} Nov 27 17:25:45 crc kubenswrapper[4954]: I1127 17:25:45.612481 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r476v" event={"ID":"200fb5dd-f5ad-4f82-8a9c-e8e378075448","Type":"ContainerStarted","Data":"b1afdb2682f656d8d0d968ba42e29cdbc3f2a9ebfc39d524769c0732375369a2"} Nov 27 17:25:45 crc kubenswrapper[4954]: I1127 17:25:45.614433 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p2mkc" event={"ID":"28cab63c-b97e-4ac9-878a-43f50b88e5cb","Type":"ContainerStarted","Data":"d9e45be16030bdf30b6e0998800540ddd2796f551a9942876bbe3d8c064ea4ca"} Nov 27 17:25:45 crc kubenswrapper[4954]: I1127 17:25:45.647625 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r476v" 
podStartSLOduration=2.269908925 podStartE2EDuration="2.647570615s" podCreationTimestamp="2025-11-27 17:25:43 +0000 UTC" firstStartedPulling="2025-11-27 17:25:44.599094375 +0000 UTC m=+2856.616534685" lastFinishedPulling="2025-11-27 17:25:44.976756075 +0000 UTC m=+2856.994196375" observedRunningTime="2025-11-27 17:25:45.628891369 +0000 UTC m=+2857.646331669" watchObservedRunningTime="2025-11-27 17:25:45.647570615 +0000 UTC m=+2857.665010915" Nov 27 17:25:45 crc kubenswrapper[4954]: I1127 17:25:45.664840 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-p2mkc" podStartSLOduration=2.022621219 podStartE2EDuration="4.664816868s" podCreationTimestamp="2025-11-27 17:25:41 +0000 UTC" firstStartedPulling="2025-11-27 17:25:42.57853719 +0000 UTC m=+2854.595977490" lastFinishedPulling="2025-11-27 17:25:45.220732849 +0000 UTC m=+2857.238173139" observedRunningTime="2025-11-27 17:25:45.660051133 +0000 UTC m=+2857.677491433" watchObservedRunningTime="2025-11-27 17:25:45.664816868 +0000 UTC m=+2857.682257168" Nov 27 17:25:46 crc kubenswrapper[4954]: I1127 17:25:46.720906 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8qwz8"] Nov 27 17:25:46 crc kubenswrapper[4954]: I1127 17:25:46.723443 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8qwz8" Nov 27 17:25:46 crc kubenswrapper[4954]: I1127 17:25:46.733410 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8qwz8"] Nov 27 17:25:46 crc kubenswrapper[4954]: I1127 17:25:46.766601 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7254dc05-f77c-4632-bea0-bbf673644239-utilities\") pod \"certified-operators-8qwz8\" (UID: \"7254dc05-f77c-4632-bea0-bbf673644239\") " pod="openshift-marketplace/certified-operators-8qwz8" Nov 27 17:25:46 crc kubenswrapper[4954]: I1127 17:25:46.766946 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7254dc05-f77c-4632-bea0-bbf673644239-catalog-content\") pod \"certified-operators-8qwz8\" (UID: \"7254dc05-f77c-4632-bea0-bbf673644239\") " pod="openshift-marketplace/certified-operators-8qwz8" Nov 27 17:25:46 crc kubenswrapper[4954]: I1127 17:25:46.767029 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtbsj\" (UniqueName: \"kubernetes.io/projected/7254dc05-f77c-4632-bea0-bbf673644239-kube-api-access-xtbsj\") pod \"certified-operators-8qwz8\" (UID: \"7254dc05-f77c-4632-bea0-bbf673644239\") " pod="openshift-marketplace/certified-operators-8qwz8" Nov 27 17:25:46 crc kubenswrapper[4954]: I1127 17:25:46.870023 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtbsj\" (UniqueName: \"kubernetes.io/projected/7254dc05-f77c-4632-bea0-bbf673644239-kube-api-access-xtbsj\") pod \"certified-operators-8qwz8\" (UID: \"7254dc05-f77c-4632-bea0-bbf673644239\") " pod="openshift-marketplace/certified-operators-8qwz8" Nov 27 17:25:46 crc kubenswrapper[4954]: I1127 17:25:46.870211 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7254dc05-f77c-4632-bea0-bbf673644239-utilities\") pod \"certified-operators-8qwz8\" 
(UID: \"7254dc05-f77c-4632-bea0-bbf673644239\") " pod="openshift-marketplace/certified-operators-8qwz8" Nov 27 17:25:46 crc kubenswrapper[4954]: I1127 17:25:46.870785 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7254dc05-f77c-4632-bea0-bbf673644239-utilities\") pod \"certified-operators-8qwz8\" (UID: \"7254dc05-f77c-4632-bea0-bbf673644239\") " pod="openshift-marketplace/certified-operators-8qwz8" Nov 27 17:25:46 crc kubenswrapper[4954]: I1127 17:25:46.870246 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7254dc05-f77c-4632-bea0-bbf673644239-catalog-content\") pod \"certified-operators-8qwz8\" (UID: \"7254dc05-f77c-4632-bea0-bbf673644239\") " pod="openshift-marketplace/certified-operators-8qwz8" Nov 27 17:25:46 crc kubenswrapper[4954]: I1127 17:25:46.870871 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7254dc05-f77c-4632-bea0-bbf673644239-catalog-content\") pod \"certified-operators-8qwz8\" (UID: \"7254dc05-f77c-4632-bea0-bbf673644239\") " pod="openshift-marketplace/certified-operators-8qwz8" Nov 27 17:25:46 crc kubenswrapper[4954]: I1127 17:25:46.888736 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtbsj\" (UniqueName: \"kubernetes.io/projected/7254dc05-f77c-4632-bea0-bbf673644239-kube-api-access-xtbsj\") pod \"certified-operators-8qwz8\" (UID: \"7254dc05-f77c-4632-bea0-bbf673644239\") " pod="openshift-marketplace/certified-operators-8qwz8" Nov 27 17:25:47 crc kubenswrapper[4954]: I1127 17:25:47.064523 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8qwz8" Nov 27 17:25:47 crc kubenswrapper[4954]: I1127 17:25:47.591990 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8qwz8"] Nov 27 17:25:47 crc kubenswrapper[4954]: I1127 17:25:47.636156 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8qwz8" event={"ID":"7254dc05-f77c-4632-bea0-bbf673644239","Type":"ContainerStarted","Data":"c610eb867cd8b88e25474a7d682135b55f5fff62d3ded5bb5eb140afc1d11014"} Nov 27 17:25:48 crc kubenswrapper[4954]: I1127 17:25:48.655702 4954 generic.go:334] "Generic (PLEG): container finished" podID="7254dc05-f77c-4632-bea0-bbf673644239" containerID="f6ebf4cd26d5552ee44d48c596536ba6a6de397229c0b57f318fe26dcad03497" exitCode=0 Nov 27 17:25:48 crc kubenswrapper[4954]: I1127 17:25:48.655987 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8qwz8" event={"ID":"7254dc05-f77c-4632-bea0-bbf673644239","Type":"ContainerDied","Data":"f6ebf4cd26d5552ee44d48c596536ba6a6de397229c0b57f318fe26dcad03497"} Nov 27 17:25:49 crc kubenswrapper[4954]: I1127 17:25:49.667387 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8qwz8" event={"ID":"7254dc05-f77c-4632-bea0-bbf673644239","Type":"ContainerStarted","Data":"38c9fa49421e4db0e9cac3f3f6f3c693910bca9000c0923e879b2a1ef2e16383"} Nov 27 17:25:50 crc kubenswrapper[4954]: I1127 17:25:50.678334 4954 generic.go:334] "Generic (PLEG): container finished" podID="7254dc05-f77c-4632-bea0-bbf673644239" containerID="38c9fa49421e4db0e9cac3f3f6f3c693910bca9000c0923e879b2a1ef2e16383" exitCode=0 Nov 27 17:25:50 crc kubenswrapper[4954]: I1127 17:25:50.678386 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8qwz8" event={"ID":"7254dc05-f77c-4632-bea0-bbf673644239","Type":"ContainerDied","Data":"38c9fa49421e4db0e9cac3f3f6f3c693910bca9000c0923e879b2a1ef2e16383"} Nov 27 17:25:51 crc kubenswrapper[4954]: I1127 17:25:51.680715 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-p2mkc" Nov 27 17:25:51 crc kubenswrapper[4954]: I1127 17:25:51.681265 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-p2mkc" Nov 27 17:25:51 crc kubenswrapper[4954]: I1127 17:25:51.747147 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-p2mkc" Nov 27 17:25:52 crc kubenswrapper[4954]: I1127 17:25:52.717076 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8qwz8" event={"ID":"7254dc05-f77c-4632-bea0-bbf673644239","Type":"ContainerStarted","Data":"544c12f1213e2a4e5006334e3275096fd605ca5a77c764ad2e37375bdca163ea"} Nov 27 17:25:52 crc kubenswrapper[4954]: I1127 17:25:52.741163 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8qwz8" podStartSLOduration=3.517803068 podStartE2EDuration="6.741135572s" podCreationTimestamp="2025-11-27 17:25:46 +0000 UTC" firstStartedPulling="2025-11-27 17:25:48.657528417 +0000 UTC m=+2860.674968717" lastFinishedPulling="2025-11-27 17:25:51.880860921 +0000 UTC m=+2863.898301221" observedRunningTime="2025-11-27 17:25:52.73605496 +0000 UTC m=+2864.753495270" watchObservedRunningTime="2025-11-27 17:25:52.741135572 
+0000 UTC m=+2864.758575882" Nov 27 17:25:52 crc kubenswrapper[4954]: I1127 17:25:52.770230 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-p2mkc" Nov 27 17:25:53 crc kubenswrapper[4954]: I1127 17:25:53.687855 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 17:25:53 crc kubenswrapper[4954]: I1127 17:25:53.687951 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 17:25:54 crc kubenswrapper[4954]: I1127 17:25:54.924189 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p2mkc"] Nov 27 17:25:55 crc kubenswrapper[4954]: I1127 17:25:55.742386 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-p2mkc" podUID="28cab63c-b97e-4ac9-878a-43f50b88e5cb" containerName="registry-server" containerID="cri-o://d9e45be16030bdf30b6e0998800540ddd2796f551a9942876bbe3d8c064ea4ca" gracePeriod=2 Nov 27 17:25:56 crc kubenswrapper[4954]: I1127 17:25:56.227861 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p2mkc" Nov 27 17:25:56 crc kubenswrapper[4954]: I1127 17:25:56.370687 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28cab63c-b97e-4ac9-878a-43f50b88e5cb-utilities\") pod \"28cab63c-b97e-4ac9-878a-43f50b88e5cb\" (UID: \"28cab63c-b97e-4ac9-878a-43f50b88e5cb\") " Nov 27 17:25:56 crc kubenswrapper[4954]: I1127 17:25:56.370838 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28cab63c-b97e-4ac9-878a-43f50b88e5cb-catalog-content\") pod \"28cab63c-b97e-4ac9-878a-43f50b88e5cb\" (UID: \"28cab63c-b97e-4ac9-878a-43f50b88e5cb\") " Nov 27 17:25:56 crc kubenswrapper[4954]: I1127 17:25:56.371037 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwzl8\" (UniqueName: \"kubernetes.io/projected/28cab63c-b97e-4ac9-878a-43f50b88e5cb-kube-api-access-lwzl8\") pod \"28cab63c-b97e-4ac9-878a-43f50b88e5cb\" (UID: \"28cab63c-b97e-4ac9-878a-43f50b88e5cb\") " Nov 27 17:25:56 crc kubenswrapper[4954]: I1127 17:25:56.372623 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28cab63c-b97e-4ac9-878a-43f50b88e5cb-utilities" (OuterVolumeSpecName: "utilities") pod "28cab63c-b97e-4ac9-878a-43f50b88e5cb" (UID: "28cab63c-b97e-4ac9-878a-43f50b88e5cb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:25:56 crc kubenswrapper[4954]: I1127 17:25:56.378795 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28cab63c-b97e-4ac9-878a-43f50b88e5cb-kube-api-access-lwzl8" (OuterVolumeSpecName: "kube-api-access-lwzl8") pod "28cab63c-b97e-4ac9-878a-43f50b88e5cb" (UID: "28cab63c-b97e-4ac9-878a-43f50b88e5cb"). InnerVolumeSpecName "kube-api-access-lwzl8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:25:56 crc kubenswrapper[4954]: I1127 17:25:56.420971 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28cab63c-b97e-4ac9-878a-43f50b88e5cb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "28cab63c-b97e-4ac9-878a-43f50b88e5cb" (UID: "28cab63c-b97e-4ac9-878a-43f50b88e5cb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:25:56 crc kubenswrapper[4954]: I1127 17:25:56.474085 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwzl8\" (UniqueName: \"kubernetes.io/projected/28cab63c-b97e-4ac9-878a-43f50b88e5cb-kube-api-access-lwzl8\") on node \"crc\" DevicePath \"\"" Nov 27 17:25:56 crc kubenswrapper[4954]: I1127 17:25:56.474156 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28cab63c-b97e-4ac9-878a-43f50b88e5cb-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 17:25:56 crc kubenswrapper[4954]: I1127 17:25:56.474172 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28cab63c-b97e-4ac9-878a-43f50b88e5cb-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 17:25:56 crc kubenswrapper[4954]: I1127 17:25:56.752513 4954 generic.go:334] "Generic (PLEG): container finished" podID="28cab63c-b97e-4ac9-878a-43f50b88e5cb" containerID="d9e45be16030bdf30b6e0998800540ddd2796f551a9942876bbe3d8c064ea4ca" exitCode=0 Nov 27 17:25:56 crc kubenswrapper[4954]: I1127 17:25:56.752747 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p2mkc" event={"ID":"28cab63c-b97e-4ac9-878a-43f50b88e5cb","Type":"ContainerDied","Data":"d9e45be16030bdf30b6e0998800540ddd2796f551a9942876bbe3d8c064ea4ca"} Nov 27 17:25:56 crc kubenswrapper[4954]: I1127 17:25:56.753100 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p2mkc" event={"ID":"28cab63c-b97e-4ac9-878a-43f50b88e5cb","Type":"ContainerDied","Data":"47daba527d28a7e429b62f70f7764125bffac27437b32377635ba2b92caef49b"} Nov 27 17:25:56 crc kubenswrapper[4954]: I1127 17:25:56.753134 4954 scope.go:117] "RemoveContainer" containerID="d9e45be16030bdf30b6e0998800540ddd2796f551a9942876bbe3d8c064ea4ca" Nov 27 17:25:56 crc kubenswrapper[4954]: I1127 17:25:56.752887 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p2mkc" Nov 27 17:25:56 crc kubenswrapper[4954]: I1127 17:25:56.780213 4954 scope.go:117] "RemoveContainer" containerID="0bc75203984073ee4ab8bfc2533c5535fc63da1d12350fa2531ee86b0efb737c" Nov 27 17:25:56 crc kubenswrapper[4954]: I1127 17:25:56.786314 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p2mkc"] Nov 27 17:25:56 crc kubenswrapper[4954]: I1127 17:25:56.795789 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-p2mkc"] Nov 27 17:25:56 crc kubenswrapper[4954]: I1127 17:25:56.812151 4954 scope.go:117] "RemoveContainer" containerID="e8f5df02afe0a2dec91cdfa48dd507a1eb25dfffd750f94563d153e345cc3255" Nov 27 17:25:56 crc kubenswrapper[4954]: I1127 17:25:56.874178 4954 scope.go:117] "RemoveContainer" containerID="d9e45be16030bdf30b6e0998800540ddd2796f551a9942876bbe3d8c064ea4ca" Nov 27 17:25:56 crc kubenswrapper[4954]: E1127 17:25:56.875001 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9e45be16030bdf30b6e0998800540ddd2796f551a9942876bbe3d8c064ea4ca\": container with ID starting with d9e45be16030bdf30b6e0998800540ddd2796f551a9942876bbe3d8c064ea4ca not found: ID does not exist" containerID="d9e45be16030bdf30b6e0998800540ddd2796f551a9942876bbe3d8c064ea4ca" Nov 27 17:25:56 crc kubenswrapper[4954]: I1127 17:25:56.875057 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9e45be16030bdf30b6e0998800540ddd2796f551a9942876bbe3d8c064ea4ca"} err="failed to get container status \"d9e45be16030bdf30b6e0998800540ddd2796f551a9942876bbe3d8c064ea4ca\": rpc error: code = NotFound desc = could not find container \"d9e45be16030bdf30b6e0998800540ddd2796f551a9942876bbe3d8c064ea4ca\": container with ID starting with d9e45be16030bdf30b6e0998800540ddd2796f551a9942876bbe3d8c064ea4ca not found: ID does not exist" Nov 27 17:25:56 crc kubenswrapper[4954]: I1127 17:25:56.875102 4954 scope.go:117] "RemoveContainer" containerID="0bc75203984073ee4ab8bfc2533c5535fc63da1d12350fa2531ee86b0efb737c" Nov 27 17:25:56 crc kubenswrapper[4954]: E1127 17:25:56.875421 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bc75203984073ee4ab8bfc2533c5535fc63da1d12350fa2531ee86b0efb737c\": container with ID starting with 0bc75203984073ee4ab8bfc2533c5535fc63da1d12350fa2531ee86b0efb737c not found: ID does not exist" containerID="0bc75203984073ee4ab8bfc2533c5535fc63da1d12350fa2531ee86b0efb737c" Nov 27 17:25:56 crc kubenswrapper[4954]: I1127 17:25:56.875455 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bc75203984073ee4ab8bfc2533c5535fc63da1d12350fa2531ee86b0efb737c"} err="failed to get container status \"0bc75203984073ee4ab8bfc2533c5535fc63da1d12350fa2531ee86b0efb737c\": rpc error: code = NotFound desc = could not find container \"0bc75203984073ee4ab8bfc2533c5535fc63da1d12350fa2531ee86b0efb737c\": container with ID starting with 0bc75203984073ee4ab8bfc2533c5535fc63da1d12350fa2531ee86b0efb737c not found: ID does not exist" Nov 27 17:25:56 crc kubenswrapper[4954]: I1127 17:25:56.875476 4954 scope.go:117] "RemoveContainer" containerID="e8f5df02afe0a2dec91cdfa48dd507a1eb25dfffd750f94563d153e345cc3255" Nov 27 17:25:56 crc kubenswrapper[4954]: E1127 17:25:56.875989 4954 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"e8f5df02afe0a2dec91cdfa48dd507a1eb25dfffd750f94563d153e345cc3255\": container with ID starting with e8f5df02afe0a2dec91cdfa48dd507a1eb25dfffd750f94563d153e345cc3255 not found: ID does not exist" containerID="e8f5df02afe0a2dec91cdfa48dd507a1eb25dfffd750f94563d153e345cc3255" Nov 27 17:25:56 crc kubenswrapper[4954]: I1127 17:25:56.876018 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8f5df02afe0a2dec91cdfa48dd507a1eb25dfffd750f94563d153e345cc3255"} err="failed to get container status \"e8f5df02afe0a2dec91cdfa48dd507a1eb25dfffd750f94563d153e345cc3255\": rpc error: code = NotFound desc = could not find container \"e8f5df02afe0a2dec91cdfa48dd507a1eb25dfffd750f94563d153e345cc3255\": container with ID starting with e8f5df02afe0a2dec91cdfa48dd507a1eb25dfffd750f94563d153e345cc3255 not found: ID does not exist" Nov 27 17:25:57 crc kubenswrapper[4954]: I1127 17:25:57.064787 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8qwz8" Nov 27 17:25:57 crc kubenswrapper[4954]: I1127 17:25:57.064844 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8qwz8" Nov 27 17:25:57 crc kubenswrapper[4954]: I1127 17:25:57.110506 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8qwz8" Nov 27 17:25:57 crc kubenswrapper[4954]: I1127 17:25:57.814883 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8qwz8" Nov 27 17:25:58 crc kubenswrapper[4954]: I1127 17:25:58.674021 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28cab63c-b97e-4ac9-878a-43f50b88e5cb" path="/var/lib/kubelet/pods/28cab63c-b97e-4ac9-878a-43f50b88e5cb/volumes" Nov 27 17:25:59 crc kubenswrapper[4954]: I1127 17:25:59.307279 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8qwz8"] Nov 27 17:25:59 crc kubenswrapper[4954]: I1127 17:25:59.781689 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8qwz8" podUID="7254dc05-f77c-4632-bea0-bbf673644239" containerName="registry-server" containerID="cri-o://544c12f1213e2a4e5006334e3275096fd605ca5a77c764ad2e37375bdca163ea" gracePeriod=2 Nov 27 17:26:00 crc kubenswrapper[4954]: I1127 17:26:00.794860 4954 generic.go:334] "Generic (PLEG): container finished" podID="7254dc05-f77c-4632-bea0-bbf673644239" containerID="544c12f1213e2a4e5006334e3275096fd605ca5a77c764ad2e37375bdca163ea" exitCode=0 Nov 27 17:26:00 crc kubenswrapper[4954]: I1127 17:26:00.795092 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8qwz8" event={"ID":"7254dc05-f77c-4632-bea0-bbf673644239","Type":"ContainerDied","Data":"544c12f1213e2a4e5006334e3275096fd605ca5a77c764ad2e37375bdca163ea"} Nov 27 17:26:00 crc kubenswrapper[4954]: I1127 17:26:00.795205 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8qwz8" event={"ID":"7254dc05-f77c-4632-bea0-bbf673644239","Type":"ContainerDied","Data":"c610eb867cd8b88e25474a7d682135b55f5fff62d3ded5bb5eb140afc1d11014"} Nov 27 17:26:00 crc kubenswrapper[4954]: I1127 17:26:00.795223 4954 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="c610eb867cd8b88e25474a7d682135b55f5fff62d3ded5bb5eb140afc1d11014" Nov 27 17:26:00 crc kubenswrapper[4954]: I1127 17:26:00.856074 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8qwz8" Nov 27 17:26:00 crc kubenswrapper[4954]: I1127 17:26:00.968446 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtbsj\" (UniqueName: \"kubernetes.io/projected/7254dc05-f77c-4632-bea0-bbf673644239-kube-api-access-xtbsj\") pod \"7254dc05-f77c-4632-bea0-bbf673644239\" (UID: \"7254dc05-f77c-4632-bea0-bbf673644239\") " Nov 27 17:26:00 crc kubenswrapper[4954]: I1127 17:26:00.968647 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7254dc05-f77c-4632-bea0-bbf673644239-utilities\") pod \"7254dc05-f77c-4632-bea0-bbf673644239\" (UID: \"7254dc05-f77c-4632-bea0-bbf673644239\") " Nov 27 17:26:00 crc kubenswrapper[4954]: I1127 17:26:00.968677 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7254dc05-f77c-4632-bea0-bbf673644239-catalog-content\") pod \"7254dc05-f77c-4632-bea0-bbf673644239\" (UID: \"7254dc05-f77c-4632-bea0-bbf673644239\") " Nov 27 17:26:00 crc kubenswrapper[4954]: I1127 17:26:00.969548 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7254dc05-f77c-4632-bea0-bbf673644239-utilities" (OuterVolumeSpecName: "utilities") pod "7254dc05-f77c-4632-bea0-bbf673644239" (UID: "7254dc05-f77c-4632-bea0-bbf673644239"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:26:00 crc kubenswrapper[4954]: I1127 17:26:00.979001 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7254dc05-f77c-4632-bea0-bbf673644239-kube-api-access-xtbsj" (OuterVolumeSpecName: "kube-api-access-xtbsj") pod "7254dc05-f77c-4632-bea0-bbf673644239" (UID: "7254dc05-f77c-4632-bea0-bbf673644239"). InnerVolumeSpecName "kube-api-access-xtbsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:26:01 crc kubenswrapper[4954]: I1127 17:26:01.016604 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7254dc05-f77c-4632-bea0-bbf673644239-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7254dc05-f77c-4632-bea0-bbf673644239" (UID: "7254dc05-f77c-4632-bea0-bbf673644239"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:26:01 crc kubenswrapper[4954]: I1127 17:26:01.070823 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtbsj\" (UniqueName: \"kubernetes.io/projected/7254dc05-f77c-4632-bea0-bbf673644239-kube-api-access-xtbsj\") on node \"crc\" DevicePath \"\"" Nov 27 17:26:01 crc kubenswrapper[4954]: I1127 17:26:01.070885 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7254dc05-f77c-4632-bea0-bbf673644239-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 17:26:01 crc kubenswrapper[4954]: I1127 17:26:01.070900 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7254dc05-f77c-4632-bea0-bbf673644239-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 17:26:01 crc kubenswrapper[4954]: I1127 17:26:01.804014 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8qwz8" Nov 27 17:26:01 crc kubenswrapper[4954]: I1127 17:26:01.837052 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8qwz8"] Nov 27 17:26:01 crc kubenswrapper[4954]: I1127 17:26:01.845536 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8qwz8"] Nov 27 17:26:02 crc kubenswrapper[4954]: I1127 17:26:02.674135 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7254dc05-f77c-4632-bea0-bbf673644239" path="/var/lib/kubelet/pods/7254dc05-f77c-4632-bea0-bbf673644239/volumes" Nov 27 17:26:23 crc kubenswrapper[4954]: I1127 17:26:23.687266 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 17:26:23 crc kubenswrapper[4954]: I1127 17:26:23.687827 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 17:26:23 crc kubenswrapper[4954]: I1127 17:26:23.687866 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-699qq" Nov 27 17:26:23 crc kubenswrapper[4954]: I1127 17:26:23.688674 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"19635e76ffe7804bff520f008326431eec2d15f97844ea4cb7e79bccab8f66ca"} pod="openshift-machine-config-operator/machine-config-daemon-699qq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 27 17:26:23 crc kubenswrapper[4954]: I1127 17:26:23.688725 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" containerID="cri-o://19635e76ffe7804bff520f008326431eec2d15f97844ea4cb7e79bccab8f66ca" gracePeriod=600 Nov 27 17:26:24 crc kubenswrapper[4954]: I1127 17:26:24.033133 4954 generic.go:334] "Generic (PLEG): container 
finished" podID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerID="19635e76ffe7804bff520f008326431eec2d15f97844ea4cb7e79bccab8f66ca" exitCode=0 Nov 27 17:26:24 crc kubenswrapper[4954]: I1127 17:26:24.033216 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" event={"ID":"33a80574-7c60-4f19-985b-3ee313cb7bcd","Type":"ContainerDied","Data":"19635e76ffe7804bff520f008326431eec2d15f97844ea4cb7e79bccab8f66ca"} Nov 27 17:26:24 crc kubenswrapper[4954]: I1127 17:26:24.033681 4954 scope.go:117] "RemoveContainer" containerID="b91da12f8fcb1407df50bd3be19fd43d848b6fc636f3c1096f65accf412d6bd6" Nov 27 17:26:25 crc kubenswrapper[4954]: I1127 17:26:25.047332 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" event={"ID":"33a80574-7c60-4f19-985b-3ee313cb7bcd","Type":"ContainerStarted","Data":"ef478800b18de3e4c1454aea1f2bd63c211721709f73b298ab96243c53a363e8"} Nov 27 17:28:13 crc kubenswrapper[4954]: I1127 17:28:13.084017 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sbrs2"] Nov 27 17:28:13 crc kubenswrapper[4954]: E1127 17:28:13.086173 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7254dc05-f77c-4632-bea0-bbf673644239" containerName="extract-content" Nov 27 17:28:13 crc kubenswrapper[4954]: I1127 17:28:13.086192 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="7254dc05-f77c-4632-bea0-bbf673644239" containerName="extract-content" Nov 27 17:28:13 crc kubenswrapper[4954]: E1127 17:28:13.086215 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28cab63c-b97e-4ac9-878a-43f50b88e5cb" containerName="extract-content" Nov 27 17:28:13 crc kubenswrapper[4954]: I1127 17:28:13.086222 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="28cab63c-b97e-4ac9-878a-43f50b88e5cb" containerName="extract-content" Nov 27 17:28:13 crc kubenswrapper[4954]: E1127 17:28:13.086231 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7254dc05-f77c-4632-bea0-bbf673644239" containerName="registry-server" Nov 27 17:28:13 crc kubenswrapper[4954]: I1127 17:28:13.086238 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="7254dc05-f77c-4632-bea0-bbf673644239" containerName="registry-server" Nov 27 17:28:13 crc kubenswrapper[4954]: E1127 17:28:13.086269 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7254dc05-f77c-4632-bea0-bbf673644239" containerName="extract-utilities" Nov 27 17:28:13 crc kubenswrapper[4954]: I1127 17:28:13.086276 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="7254dc05-f77c-4632-bea0-bbf673644239" containerName="extract-utilities" Nov 27 17:28:13 crc kubenswrapper[4954]: E1127 17:28:13.086307 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28cab63c-b97e-4ac9-878a-43f50b88e5cb" containerName="extract-utilities" Nov 27 17:28:13 crc kubenswrapper[4954]: I1127 17:28:13.086317 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="28cab63c-b97e-4ac9-878a-43f50b88e5cb" containerName="extract-utilities" Nov 27 17:28:13 crc kubenswrapper[4954]: E1127 17:28:13.086349 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28cab63c-b97e-4ac9-878a-43f50b88e5cb" containerName="registry-server" Nov 27 17:28:13 crc kubenswrapper[4954]: I1127 17:28:13.086355 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="28cab63c-b97e-4ac9-878a-43f50b88e5cb" 
containerName="registry-server" Nov 27 17:28:13 crc kubenswrapper[4954]: I1127 17:28:13.086786 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="28cab63c-b97e-4ac9-878a-43f50b88e5cb" containerName="registry-server" Nov 27 17:28:13 crc kubenswrapper[4954]: I1127 17:28:13.086808 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="7254dc05-f77c-4632-bea0-bbf673644239" containerName="registry-server" Nov 27 17:28:13 crc kubenswrapper[4954]: I1127 17:28:13.090104 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbrs2" Nov 27 17:28:13 crc kubenswrapper[4954]: I1127 17:28:13.132753 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbrs2"] Nov 27 17:28:13 crc kubenswrapper[4954]: I1127 17:28:13.232699 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbk9g\" (UniqueName: \"kubernetes.io/projected/1c183f76-bf1a-49f4-85b2-3788ac8069c2-kube-api-access-cbk9g\") pod \"redhat-marketplace-sbrs2\" (UID: \"1c183f76-bf1a-49f4-85b2-3788ac8069c2\") " pod="openshift-marketplace/redhat-marketplace-sbrs2" Nov 27 17:28:13 crc kubenswrapper[4954]: I1127 17:28:13.232934 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c183f76-bf1a-49f4-85b2-3788ac8069c2-catalog-content\") pod \"redhat-marketplace-sbrs2\" (UID: \"1c183f76-bf1a-49f4-85b2-3788ac8069c2\") " pod="openshift-marketplace/redhat-marketplace-sbrs2" Nov 27 17:28:13 crc kubenswrapper[4954]: I1127 17:28:13.232975 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c183f76-bf1a-49f4-85b2-3788ac8069c2-utilities\") pod \"redhat-marketplace-sbrs2\" (UID: \"1c183f76-bf1a-49f4-85b2-3788ac8069c2\") " pod="openshift-marketplace/redhat-marketplace-sbrs2" Nov 27 17:28:13 crc kubenswrapper[4954]: I1127 17:28:13.334716 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbk9g\" (UniqueName: \"kubernetes.io/projected/1c183f76-bf1a-49f4-85b2-3788ac8069c2-kube-api-access-cbk9g\") pod \"redhat-marketplace-sbrs2\" (UID: \"1c183f76-bf1a-49f4-85b2-3788ac8069c2\") " pod="openshift-marketplace/redhat-marketplace-sbrs2" Nov 27 17:28:13 crc kubenswrapper[4954]: I1127 17:28:13.335226 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c183f76-bf1a-49f4-85b2-3788ac8069c2-catalog-content\") pod \"redhat-marketplace-sbrs2\" (UID: \"1c183f76-bf1a-49f4-85b2-3788ac8069c2\") " pod="openshift-marketplace/redhat-marketplace-sbrs2" Nov 27 17:28:13 crc kubenswrapper[4954]: I1127 17:28:13.335422 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c183f76-bf1a-49f4-85b2-3788ac8069c2-utilities\") pod \"redhat-marketplace-sbrs2\" (UID: \"1c183f76-bf1a-49f4-85b2-3788ac8069c2\") " pod="openshift-marketplace/redhat-marketplace-sbrs2" Nov 27 17:28:13 crc kubenswrapper[4954]: I1127 17:28:13.335779 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c183f76-bf1a-49f4-85b2-3788ac8069c2-catalog-content\") pod \"redhat-marketplace-sbrs2\" (UID: 
\"1c183f76-bf1a-49f4-85b2-3788ac8069c2\") " pod="openshift-marketplace/redhat-marketplace-sbrs2" Nov 27 17:28:13 crc kubenswrapper[4954]: I1127 17:28:13.336092 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c183f76-bf1a-49f4-85b2-3788ac8069c2-utilities\") pod \"redhat-marketplace-sbrs2\" (UID: \"1c183f76-bf1a-49f4-85b2-3788ac8069c2\") " pod="openshift-marketplace/redhat-marketplace-sbrs2" Nov 27 17:28:13 crc kubenswrapper[4954]: I1127 17:28:13.357877 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbk9g\" (UniqueName: \"kubernetes.io/projected/1c183f76-bf1a-49f4-85b2-3788ac8069c2-kube-api-access-cbk9g\") pod \"redhat-marketplace-sbrs2\" (UID: \"1c183f76-bf1a-49f4-85b2-3788ac8069c2\") " pod="openshift-marketplace/redhat-marketplace-sbrs2" Nov 27 17:28:13 crc kubenswrapper[4954]: I1127 17:28:13.428426 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbrs2" Nov 27 17:28:13 crc kubenswrapper[4954]: I1127 17:28:13.951802 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbrs2"] Nov 27 17:28:14 crc kubenswrapper[4954]: I1127 17:28:14.120590 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbrs2" event={"ID":"1c183f76-bf1a-49f4-85b2-3788ac8069c2","Type":"ContainerStarted","Data":"d54c3d8d6b75083135a5ef63822f1ee26d860fdec8472c984735b7b5105dafcc"} Nov 27 17:28:15 crc kubenswrapper[4954]: I1127 17:28:15.131223 4954 generic.go:334] "Generic (PLEG): container finished" podID="1c183f76-bf1a-49f4-85b2-3788ac8069c2" containerID="89fc00b705e5359d0b86171dd037278fa748daef0f82180c0b629d3699a719a0" exitCode=0 Nov 27 17:28:15 crc kubenswrapper[4954]: I1127 17:28:15.131307 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbrs2" event={"ID":"1c183f76-bf1a-49f4-85b2-3788ac8069c2","Type":"ContainerDied","Data":"89fc00b705e5359d0b86171dd037278fa748daef0f82180c0b629d3699a719a0"} Nov 27 17:28:15 crc kubenswrapper[4954]: I1127 17:28:15.135261 4954 generic.go:334] "Generic (PLEG): container finished" podID="200fb5dd-f5ad-4f82-8a9c-e8e378075448" containerID="31f1885ec827cdfa4f1b54cebbae4d4ddeb4cf8117f97db54386e47e3be1478f" exitCode=0 Nov 27 17:28:15 crc kubenswrapper[4954]: I1127 17:28:15.135290 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r476v" event={"ID":"200fb5dd-f5ad-4f82-8a9c-e8e378075448","Type":"ContainerDied","Data":"31f1885ec827cdfa4f1b54cebbae4d4ddeb4cf8117f97db54386e47e3be1478f"} Nov 27 17:28:15 crc kubenswrapper[4954]: I1127 17:28:15.135796 4954 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 27 17:28:16 crc kubenswrapper[4954]: I1127 17:28:16.589544 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r476v" Nov 27 17:28:16 crc kubenswrapper[4954]: I1127 17:28:16.661412 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/200fb5dd-f5ad-4f82-8a9c-e8e378075448-inventory\") pod \"200fb5dd-f5ad-4f82-8a9c-e8e378075448\" (UID: \"200fb5dd-f5ad-4f82-8a9c-e8e378075448\") " Nov 27 17:28:16 crc kubenswrapper[4954]: I1127 17:28:16.661493 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/200fb5dd-f5ad-4f82-8a9c-e8e378075448-ssh-key\") pod \"200fb5dd-f5ad-4f82-8a9c-e8e378075448\" (UID: \"200fb5dd-f5ad-4f82-8a9c-e8e378075448\") " Nov 27 17:28:16 crc kubenswrapper[4954]: I1127 17:28:16.661589 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-476jt\" (UniqueName: \"kubernetes.io/projected/200fb5dd-f5ad-4f82-8a9c-e8e378075448-kube-api-access-476jt\") pod \"200fb5dd-f5ad-4f82-8a9c-e8e378075448\" (UID: \"200fb5dd-f5ad-4f82-8a9c-e8e378075448\") " Nov 27 17:28:16 crc kubenswrapper[4954]: I1127 17:28:16.661770 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/200fb5dd-f5ad-4f82-8a9c-e8e378075448-telemetry-combined-ca-bundle\") pod \"200fb5dd-f5ad-4f82-8a9c-e8e378075448\" (UID: \"200fb5dd-f5ad-4f82-8a9c-e8e378075448\") " Nov 27 17:28:16 crc kubenswrapper[4954]: I1127 17:28:16.661814 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/200fb5dd-f5ad-4f82-8a9c-e8e378075448-ceilometer-compute-config-data-2\") pod \"200fb5dd-f5ad-4f82-8a9c-e8e378075448\" (UID: \"200fb5dd-f5ad-4f82-8a9c-e8e378075448\") " Nov 27 17:28:16 crc kubenswrapper[4954]: I1127 17:28:16.661880 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/200fb5dd-f5ad-4f82-8a9c-e8e378075448-ceilometer-compute-config-data-1\") pod \"200fb5dd-f5ad-4f82-8a9c-e8e378075448\" (UID: \"200fb5dd-f5ad-4f82-8a9c-e8e378075448\") " Nov 27 17:28:16 crc kubenswrapper[4954]: I1127 17:28:16.661983 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/200fb5dd-f5ad-4f82-8a9c-e8e378075448-ceilometer-compute-config-data-0\") pod \"200fb5dd-f5ad-4f82-8a9c-e8e378075448\" (UID: \"200fb5dd-f5ad-4f82-8a9c-e8e378075448\") " Nov 27 17:28:16 crc kubenswrapper[4954]: I1127 17:28:16.670195 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/200fb5dd-f5ad-4f82-8a9c-e8e378075448-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "200fb5dd-f5ad-4f82-8a9c-e8e378075448" (UID: "200fb5dd-f5ad-4f82-8a9c-e8e378075448"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:28:16 crc kubenswrapper[4954]: I1127 17:28:16.670995 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/200fb5dd-f5ad-4f82-8a9c-e8e378075448-kube-api-access-476jt" (OuterVolumeSpecName: "kube-api-access-476jt") pod "200fb5dd-f5ad-4f82-8a9c-e8e378075448" (UID: "200fb5dd-f5ad-4f82-8a9c-e8e378075448"). 
InnerVolumeSpecName "kube-api-access-476jt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:28:16 crc kubenswrapper[4954]: I1127 17:28:16.697480 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/200fb5dd-f5ad-4f82-8a9c-e8e378075448-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "200fb5dd-f5ad-4f82-8a9c-e8e378075448" (UID: "200fb5dd-f5ad-4f82-8a9c-e8e378075448"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:28:16 crc kubenswrapper[4954]: I1127 17:28:16.698553 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/200fb5dd-f5ad-4f82-8a9c-e8e378075448-inventory" (OuterVolumeSpecName: "inventory") pod "200fb5dd-f5ad-4f82-8a9c-e8e378075448" (UID: "200fb5dd-f5ad-4f82-8a9c-e8e378075448"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:28:16 crc kubenswrapper[4954]: I1127 17:28:16.701957 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/200fb5dd-f5ad-4f82-8a9c-e8e378075448-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "200fb5dd-f5ad-4f82-8a9c-e8e378075448" (UID: "200fb5dd-f5ad-4f82-8a9c-e8e378075448"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:28:16 crc kubenswrapper[4954]: I1127 17:28:16.704168 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/200fb5dd-f5ad-4f82-8a9c-e8e378075448-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "200fb5dd-f5ad-4f82-8a9c-e8e378075448" (UID: "200fb5dd-f5ad-4f82-8a9c-e8e378075448"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:28:16 crc kubenswrapper[4954]: I1127 17:28:16.708070 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/200fb5dd-f5ad-4f82-8a9c-e8e378075448-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "200fb5dd-f5ad-4f82-8a9c-e8e378075448" (UID: "200fb5dd-f5ad-4f82-8a9c-e8e378075448"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:28:16 crc kubenswrapper[4954]: I1127 17:28:16.765929 4954 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/200fb5dd-f5ad-4f82-8a9c-e8e378075448-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 17:28:16 crc kubenswrapper[4954]: I1127 17:28:16.765984 4954 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/200fb5dd-f5ad-4f82-8a9c-e8e378075448-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Nov 27 17:28:16 crc kubenswrapper[4954]: I1127 17:28:16.765997 4954 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/200fb5dd-f5ad-4f82-8a9c-e8e378075448-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Nov 27 17:28:16 crc kubenswrapper[4954]: I1127 17:28:16.766013 4954 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/200fb5dd-f5ad-4f82-8a9c-e8e378075448-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Nov 27 17:28:16 crc kubenswrapper[4954]: I1127 17:28:16.766029 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/200fb5dd-f5ad-4f82-8a9c-e8e378075448-inventory\") on node \"crc\" DevicePath \"\"" Nov 27 17:28:16 crc kubenswrapper[4954]: I1127 17:28:16.766042 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/200fb5dd-f5ad-4f82-8a9c-e8e378075448-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 17:28:16 crc kubenswrapper[4954]: I1127 17:28:16.766055 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-476jt\" (UniqueName: \"kubernetes.io/projected/200fb5dd-f5ad-4f82-8a9c-e8e378075448-kube-api-access-476jt\") on node \"crc\" DevicePath \"\"" Nov 27 17:28:17 crc kubenswrapper[4954]: I1127 17:28:17.167028 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r476v" Nov 27 17:28:17 crc kubenswrapper[4954]: I1127 17:28:17.167015 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r476v" event={"ID":"200fb5dd-f5ad-4f82-8a9c-e8e378075448","Type":"ContainerDied","Data":"b1afdb2682f656d8d0d968ba42e29cdbc3f2a9ebfc39d524769c0732375369a2"} Nov 27 17:28:17 crc kubenswrapper[4954]: I1127 17:28:17.167646 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1afdb2682f656d8d0d968ba42e29cdbc3f2a9ebfc39d524769c0732375369a2" Nov 27 17:28:17 crc kubenswrapper[4954]: I1127 17:28:17.171357 4954 generic.go:334] "Generic (PLEG): container finished" podID="1c183f76-bf1a-49f4-85b2-3788ac8069c2" containerID="380a7d1197c80a7c7d9eb63cc4bdd5f99ad265b1c682b3a462de385aaa1d0ed1" exitCode=0 Nov 27 17:28:17 crc kubenswrapper[4954]: I1127 17:28:17.171405 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbrs2" event={"ID":"1c183f76-bf1a-49f4-85b2-3788ac8069c2","Type":"ContainerDied","Data":"380a7d1197c80a7c7d9eb63cc4bdd5f99ad265b1c682b3a462de385aaa1d0ed1"} Nov 27 17:28:18 crc kubenswrapper[4954]: I1127 17:28:18.184083 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbrs2" event={"ID":"1c183f76-bf1a-49f4-85b2-3788ac8069c2","Type":"ContainerStarted","Data":"2596927ea4f1bdbae0a7dd6dabf490c176e9825f9df50ec93f8d3d59aab58b37"} Nov 27 17:28:18 crc kubenswrapper[4954]: I1127 17:28:18.215878 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sbrs2" podStartSLOduration=2.438801101 podStartE2EDuration="5.215850053s" podCreationTimestamp="2025-11-27 17:28:13 +0000 UTC" firstStartedPulling="2025-11-27 17:28:15.135482178 +0000 UTC m=+3007.152922478" lastFinishedPulling="2025-11-27 17:28:17.91253113 +0000 UTC m=+3009.929971430" observedRunningTime="2025-11-27 17:28:18.204124562 +0000 UTC m=+3010.221564862" watchObservedRunningTime="2025-11-27 17:28:18.215850053 +0000 UTC m=+3010.233290353" Nov 27 17:28:23 crc kubenswrapper[4954]: I1127 17:28:23.428702 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sbrs2" Nov 27 17:28:23 crc kubenswrapper[4954]: I1127 17:28:23.429664 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sbrs2" Nov 27 17:28:23 crc kubenswrapper[4954]: I1127 17:28:23.485891 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sbrs2" Nov 27 17:28:24 crc kubenswrapper[4954]: I1127 17:28:24.291043 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sbrs2" Nov 27 17:28:24 crc kubenswrapper[4954]: I1127 17:28:24.346247 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbrs2"] Nov 27 17:28:26 crc kubenswrapper[4954]: I1127 17:28:26.263271 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sbrs2" podUID="1c183f76-bf1a-49f4-85b2-3788ac8069c2" containerName="registry-server" containerID="cri-o://2596927ea4f1bdbae0a7dd6dabf490c176e9825f9df50ec93f8d3d59aab58b37" gracePeriod=2 Nov 27 17:28:26 crc kubenswrapper[4954]: I1127 17:28:26.714408 4954 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbrs2" Nov 27 17:28:26 crc kubenswrapper[4954]: I1127 17:28:26.876163 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c183f76-bf1a-49f4-85b2-3788ac8069c2-utilities\") pod \"1c183f76-bf1a-49f4-85b2-3788ac8069c2\" (UID: \"1c183f76-bf1a-49f4-85b2-3788ac8069c2\") " Nov 27 17:28:26 crc kubenswrapper[4954]: I1127 17:28:26.876308 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbk9g\" (UniqueName: \"kubernetes.io/projected/1c183f76-bf1a-49f4-85b2-3788ac8069c2-kube-api-access-cbk9g\") pod \"1c183f76-bf1a-49f4-85b2-3788ac8069c2\" (UID: \"1c183f76-bf1a-49f4-85b2-3788ac8069c2\") " Nov 27 17:28:26 crc kubenswrapper[4954]: I1127 17:28:26.876431 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c183f76-bf1a-49f4-85b2-3788ac8069c2-catalog-content\") pod \"1c183f76-bf1a-49f4-85b2-3788ac8069c2\" (UID: \"1c183f76-bf1a-49f4-85b2-3788ac8069c2\") " Nov 27 17:28:26 crc kubenswrapper[4954]: I1127 17:28:26.877683 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c183f76-bf1a-49f4-85b2-3788ac8069c2-utilities" (OuterVolumeSpecName: "utilities") pod "1c183f76-bf1a-49f4-85b2-3788ac8069c2" (UID: "1c183f76-bf1a-49f4-85b2-3788ac8069c2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:28:26 crc kubenswrapper[4954]: I1127 17:28:26.883700 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c183f76-bf1a-49f4-85b2-3788ac8069c2-kube-api-access-cbk9g" (OuterVolumeSpecName: "kube-api-access-cbk9g") pod "1c183f76-bf1a-49f4-85b2-3788ac8069c2" (UID: "1c183f76-bf1a-49f4-85b2-3788ac8069c2"). InnerVolumeSpecName "kube-api-access-cbk9g". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:28:26 crc kubenswrapper[4954]: I1127 17:28:26.896121 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c183f76-bf1a-49f4-85b2-3788ac8069c2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1c183f76-bf1a-49f4-85b2-3788ac8069c2" (UID: "1c183f76-bf1a-49f4-85b2-3788ac8069c2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:28:26 crc kubenswrapper[4954]: I1127 17:28:26.978991 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c183f76-bf1a-49f4-85b2-3788ac8069c2-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 17:28:26 crc kubenswrapper[4954]: I1127 17:28:26.979038 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbk9g\" (UniqueName: \"kubernetes.io/projected/1c183f76-bf1a-49f4-85b2-3788ac8069c2-kube-api-access-cbk9g\") on node \"crc\" DevicePath \"\"" Nov 27 17:28:26 crc kubenswrapper[4954]: I1127 17:28:26.979060 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c183f76-bf1a-49f4-85b2-3788ac8069c2-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 17:28:27 crc kubenswrapper[4954]: I1127 17:28:27.275892 4954 generic.go:334] "Generic (PLEG): container finished" podID="1c183f76-bf1a-49f4-85b2-3788ac8069c2" containerID="2596927ea4f1bdbae0a7dd6dabf490c176e9825f9df50ec93f8d3d59aab58b37" exitCode=0 Nov 27 17:28:27 crc kubenswrapper[4954]: I1127 17:28:27.275952 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbrs2" event={"ID":"1c183f76-bf1a-49f4-85b2-3788ac8069c2","Type":"ContainerDied","Data":"2596927ea4f1bdbae0a7dd6dabf490c176e9825f9df50ec93f8d3d59aab58b37"} Nov 27 17:28:27 crc kubenswrapper[4954]: I1127 17:28:27.275973 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbrs2" Nov 27 17:28:27 crc kubenswrapper[4954]: I1127 17:28:27.275991 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbrs2" event={"ID":"1c183f76-bf1a-49f4-85b2-3788ac8069c2","Type":"ContainerDied","Data":"d54c3d8d6b75083135a5ef63822f1ee26d860fdec8472c984735b7b5105dafcc"} Nov 27 17:28:27 crc kubenswrapper[4954]: I1127 17:28:27.276013 4954 scope.go:117] "RemoveContainer" containerID="2596927ea4f1bdbae0a7dd6dabf490c176e9825f9df50ec93f8d3d59aab58b37" Nov 27 17:28:27 crc kubenswrapper[4954]: I1127 17:28:27.302373 4954 scope.go:117] "RemoveContainer" containerID="380a7d1197c80a7c7d9eb63cc4bdd5f99ad265b1c682b3a462de385aaa1d0ed1" Nov 27 17:28:27 crc kubenswrapper[4954]: I1127 17:28:27.321305 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbrs2"] Nov 27 17:28:27 crc kubenswrapper[4954]: I1127 17:28:27.332106 4954 scope.go:117] "RemoveContainer" containerID="89fc00b705e5359d0b86171dd037278fa748daef0f82180c0b629d3699a719a0" Nov 27 17:28:27 crc kubenswrapper[4954]: I1127 17:28:27.333778 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbrs2"] Nov 27 17:28:27 crc kubenswrapper[4954]: I1127 17:28:27.368933 4954 scope.go:117] "RemoveContainer" containerID="2596927ea4f1bdbae0a7dd6dabf490c176e9825f9df50ec93f8d3d59aab58b37" Nov 27 17:28:27 crc kubenswrapper[4954]: E1127 17:28:27.369660 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2596927ea4f1bdbae0a7dd6dabf490c176e9825f9df50ec93f8d3d59aab58b37\": container with ID starting with 2596927ea4f1bdbae0a7dd6dabf490c176e9825f9df50ec93f8d3d59aab58b37 not found: ID does not exist" containerID="2596927ea4f1bdbae0a7dd6dabf490c176e9825f9df50ec93f8d3d59aab58b37" Nov 27 17:28:27 crc kubenswrapper[4954]: I1127 17:28:27.369709 4954 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2596927ea4f1bdbae0a7dd6dabf490c176e9825f9df50ec93f8d3d59aab58b37"} err="failed to get container status \"2596927ea4f1bdbae0a7dd6dabf490c176e9825f9df50ec93f8d3d59aab58b37\": rpc error: code = NotFound desc = could not find container \"2596927ea4f1bdbae0a7dd6dabf490c176e9825f9df50ec93f8d3d59aab58b37\": container with ID starting with 2596927ea4f1bdbae0a7dd6dabf490c176e9825f9df50ec93f8d3d59aab58b37 not found: ID does not exist" Nov 27 17:28:27 crc kubenswrapper[4954]: I1127 17:28:27.369741 4954 scope.go:117] "RemoveContainer" containerID="380a7d1197c80a7c7d9eb63cc4bdd5f99ad265b1c682b3a462de385aaa1d0ed1" Nov 27 17:28:27 crc kubenswrapper[4954]: E1127 17:28:27.370051 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"380a7d1197c80a7c7d9eb63cc4bdd5f99ad265b1c682b3a462de385aaa1d0ed1\": container with ID starting with 380a7d1197c80a7c7d9eb63cc4bdd5f99ad265b1c682b3a462de385aaa1d0ed1 not found: ID does not exist" containerID="380a7d1197c80a7c7d9eb63cc4bdd5f99ad265b1c682b3a462de385aaa1d0ed1" Nov 27 17:28:27 crc kubenswrapper[4954]: I1127 17:28:27.370163 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"380a7d1197c80a7c7d9eb63cc4bdd5f99ad265b1c682b3a462de385aaa1d0ed1"} err="failed to get container status \"380a7d1197c80a7c7d9eb63cc4bdd5f99ad265b1c682b3a462de385aaa1d0ed1\": rpc error: code = NotFound desc = could not find container \"380a7d1197c80a7c7d9eb63cc4bdd5f99ad265b1c682b3a462de385aaa1d0ed1\": container with ID starting with 380a7d1197c80a7c7d9eb63cc4bdd5f99ad265b1c682b3a462de385aaa1d0ed1 not found: ID does not exist" Nov 27 17:28:27 crc kubenswrapper[4954]: I1127 17:28:27.370263 4954 scope.go:117] "RemoveContainer" containerID="89fc00b705e5359d0b86171dd037278fa748daef0f82180c0b629d3699a719a0" Nov 27 17:28:27 crc kubenswrapper[4954]: E1127 17:28:27.370625 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89fc00b705e5359d0b86171dd037278fa748daef0f82180c0b629d3699a719a0\": container with ID starting with 89fc00b705e5359d0b86171dd037278fa748daef0f82180c0b629d3699a719a0 not found: ID does not exist" containerID="89fc00b705e5359d0b86171dd037278fa748daef0f82180c0b629d3699a719a0" Nov 27 17:28:27 crc kubenswrapper[4954]: I1127 17:28:27.370655 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89fc00b705e5359d0b86171dd037278fa748daef0f82180c0b629d3699a719a0"} err="failed to get container status \"89fc00b705e5359d0b86171dd037278fa748daef0f82180c0b629d3699a719a0\": rpc error: code = NotFound desc = could not find container \"89fc00b705e5359d0b86171dd037278fa748daef0f82180c0b629d3699a719a0\": container with ID starting with 89fc00b705e5359d0b86171dd037278fa748daef0f82180c0b629d3699a719a0 not found: ID does not exist" Nov 27 17:28:28 crc kubenswrapper[4954]: I1127 17:28:28.674694 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c183f76-bf1a-49f4-85b2-3788ac8069c2" path="/var/lib/kubelet/pods/1c183f76-bf1a-49f4-85b2-3788ac8069c2/volumes" Nov 27 17:28:53 crc kubenswrapper[4954]: I1127 17:28:53.687094 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 17:28:53 crc kubenswrapper[4954]: I1127 17:28:53.688113 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 17:29:17 crc kubenswrapper[4954]: I1127 17:29:17.540699 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Nov 27 17:29:17 crc kubenswrapper[4954]: E1127 17:29:17.541868 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c183f76-bf1a-49f4-85b2-3788ac8069c2" containerName="registry-server" Nov 27 17:29:17 crc kubenswrapper[4954]: I1127 17:29:17.541890 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c183f76-bf1a-49f4-85b2-3788ac8069c2" containerName="registry-server" Nov 27 17:29:17 crc kubenswrapper[4954]: E1127 17:29:17.541908 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="200fb5dd-f5ad-4f82-8a9c-e8e378075448" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Nov 27 17:29:17 crc kubenswrapper[4954]: I1127 17:29:17.541918 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="200fb5dd-f5ad-4f82-8a9c-e8e378075448" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Nov 27 17:29:17 crc kubenswrapper[4954]: E1127 17:29:17.541928 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c183f76-bf1a-49f4-85b2-3788ac8069c2" containerName="extract-utilities" Nov 27 17:29:17 crc kubenswrapper[4954]: I1127 17:29:17.541936 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c183f76-bf1a-49f4-85b2-3788ac8069c2" containerName="extract-utilities" Nov 27 17:29:17 crc kubenswrapper[4954]: E1127 17:29:17.541956 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c183f76-bf1a-49f4-85b2-3788ac8069c2" containerName="extract-content" Nov 27 17:29:17 crc kubenswrapper[4954]: I1127 17:29:17.541965 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c183f76-bf1a-49f4-85b2-3788ac8069c2" containerName="extract-content" Nov 27 17:29:17 crc kubenswrapper[4954]: I1127 17:29:17.542193 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="200fb5dd-f5ad-4f82-8a9c-e8e378075448" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Nov 27 17:29:17 crc kubenswrapper[4954]: I1127 17:29:17.542233 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c183f76-bf1a-49f4-85b2-3788ac8069c2" containerName="registry-server" Nov 27 17:29:17 crc kubenswrapper[4954]: I1127 17:29:17.542934 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Nov 27 17:29:17 crc kubenswrapper[4954]: I1127 17:29:17.544854 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Nov 27 17:29:17 crc kubenswrapper[4954]: I1127 17:29:17.544894 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Nov 27 17:29:17 crc kubenswrapper[4954]: I1127 17:29:17.546100 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Nov 27 17:29:17 crc kubenswrapper[4954]: I1127 17:29:17.546220 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-5s7zd" Nov 27 17:29:17 crc kubenswrapper[4954]: I1127 17:29:17.551477 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Nov 27 17:29:17 crc kubenswrapper[4954]: I1127 17:29:17.744096 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ae22fda2-42ce-4b9e-9daf-00e886b8449b-config-data\") pod \"tempest-tests-tempest\" (UID: \"ae22fda2-42ce-4b9e-9daf-00e886b8449b\") " pod="openstack/tempest-tests-tempest" Nov 27 17:29:17 crc kubenswrapper[4954]: I1127 17:29:17.744421 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"ae22fda2-42ce-4b9e-9daf-00e886b8449b\") " pod="openstack/tempest-tests-tempest" Nov 27 17:29:17 crc kubenswrapper[4954]: I1127 17:29:17.744475 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ae22fda2-42ce-4b9e-9daf-00e886b8449b-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"ae22fda2-42ce-4b9e-9daf-00e886b8449b\") " pod="openstack/tempest-tests-tempest" Nov 27 17:29:17 crc kubenswrapper[4954]: I1127 17:29:17.744520 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ae22fda2-42ce-4b9e-9daf-00e886b8449b-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"ae22fda2-42ce-4b9e-9daf-00e886b8449b\") " pod="openstack/tempest-tests-tempest" Nov 27 17:29:17 crc kubenswrapper[4954]: I1127 17:29:17.744628 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ae22fda2-42ce-4b9e-9daf-00e886b8449b-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"ae22fda2-42ce-4b9e-9daf-00e886b8449b\") " pod="openstack/tempest-tests-tempest" Nov 27 17:29:17 crc kubenswrapper[4954]: I1127 17:29:17.744701 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpp5r\" (UniqueName: \"kubernetes.io/projected/ae22fda2-42ce-4b9e-9daf-00e886b8449b-kube-api-access-bpp5r\") pod \"tempest-tests-tempest\" (UID: \"ae22fda2-42ce-4b9e-9daf-00e886b8449b\") " pod="openstack/tempest-tests-tempest" Nov 27 17:29:17 crc kubenswrapper[4954]: I1127 17:29:17.744749 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/secret/ae22fda2-42ce-4b9e-9daf-00e886b8449b-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"ae22fda2-42ce-4b9e-9daf-00e886b8449b\") " pod="openstack/tempest-tests-tempest" Nov 27 17:29:17 crc kubenswrapper[4954]: I1127 17:29:17.745016 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ae22fda2-42ce-4b9e-9daf-00e886b8449b-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"ae22fda2-42ce-4b9e-9daf-00e886b8449b\") " pod="openstack/tempest-tests-tempest" Nov 27 17:29:17 crc kubenswrapper[4954]: I1127 17:29:17.745113 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ae22fda2-42ce-4b9e-9daf-00e886b8449b-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"ae22fda2-42ce-4b9e-9daf-00e886b8449b\") " pod="openstack/tempest-tests-tempest" Nov 27 17:29:17 crc kubenswrapper[4954]: I1127 17:29:17.847108 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"ae22fda2-42ce-4b9e-9daf-00e886b8449b\") " pod="openstack/tempest-tests-tempest" Nov 27 17:29:17 crc kubenswrapper[4954]: I1127 17:29:17.847169 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ae22fda2-42ce-4b9e-9daf-00e886b8449b-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"ae22fda2-42ce-4b9e-9daf-00e886b8449b\") " pod="openstack/tempest-tests-tempest" Nov 27 17:29:17 crc kubenswrapper[4954]: I1127 17:29:17.847216 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ae22fda2-42ce-4b9e-9daf-00e886b8449b-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"ae22fda2-42ce-4b9e-9daf-00e886b8449b\") " pod="openstack/tempest-tests-tempest" Nov 27 17:29:17 crc kubenswrapper[4954]: I1127 17:29:17.847274 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ae22fda2-42ce-4b9e-9daf-00e886b8449b-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"ae22fda2-42ce-4b9e-9daf-00e886b8449b\") " pod="openstack/tempest-tests-tempest" Nov 27 17:29:17 crc kubenswrapper[4954]: I1127 17:29:17.847311 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpp5r\" (UniqueName: \"kubernetes.io/projected/ae22fda2-42ce-4b9e-9daf-00e886b8449b-kube-api-access-bpp5r\") pod \"tempest-tests-tempest\" (UID: \"ae22fda2-42ce-4b9e-9daf-00e886b8449b\") " pod="openstack/tempest-tests-tempest" Nov 27 17:29:17 crc kubenswrapper[4954]: I1127 17:29:17.847344 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ae22fda2-42ce-4b9e-9daf-00e886b8449b-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"ae22fda2-42ce-4b9e-9daf-00e886b8449b\") " pod="openstack/tempest-tests-tempest" Nov 27 17:29:17 crc kubenswrapper[4954]: I1127 17:29:17.847387 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/ae22fda2-42ce-4b9e-9daf-00e886b8449b-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"ae22fda2-42ce-4b9e-9daf-00e886b8449b\") " pod="openstack/tempest-tests-tempest" Nov 27 17:29:17 crc kubenswrapper[4954]: I1127 17:29:17.847432 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ae22fda2-42ce-4b9e-9daf-00e886b8449b-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"ae22fda2-42ce-4b9e-9daf-00e886b8449b\") " pod="openstack/tempest-tests-tempest" Nov 27 17:29:17 crc kubenswrapper[4954]: I1127 17:29:17.847468 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ae22fda2-42ce-4b9e-9daf-00e886b8449b-config-data\") pod \"tempest-tests-tempest\" (UID: \"ae22fda2-42ce-4b9e-9daf-00e886b8449b\") " pod="openstack/tempest-tests-tempest" Nov 27 17:29:17 crc kubenswrapper[4954]: I1127 17:29:17.848044 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ae22fda2-42ce-4b9e-9daf-00e886b8449b-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"ae22fda2-42ce-4b9e-9daf-00e886b8449b\") " pod="openstack/tempest-tests-tempest" Nov 27 17:29:17 crc kubenswrapper[4954]: I1127 17:29:17.848215 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"ae22fda2-42ce-4b9e-9daf-00e886b8449b\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/tempest-tests-tempest" Nov 27 17:29:17 crc kubenswrapper[4954]: I1127 17:29:17.848808 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ae22fda2-42ce-4b9e-9daf-00e886b8449b-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"ae22fda2-42ce-4b9e-9daf-00e886b8449b\") " pod="openstack/tempest-tests-tempest" Nov 27 17:29:17 crc kubenswrapper[4954]: I1127 17:29:17.849131 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ae22fda2-42ce-4b9e-9daf-00e886b8449b-config-data\") pod \"tempest-tests-tempest\" (UID: \"ae22fda2-42ce-4b9e-9daf-00e886b8449b\") " pod="openstack/tempest-tests-tempest" Nov 27 17:29:17 crc kubenswrapper[4954]: I1127 17:29:17.849488 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ae22fda2-42ce-4b9e-9daf-00e886b8449b-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"ae22fda2-42ce-4b9e-9daf-00e886b8449b\") " pod="openstack/tempest-tests-tempest" Nov 27 17:29:17 crc kubenswrapper[4954]: I1127 17:29:17.854295 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ae22fda2-42ce-4b9e-9daf-00e886b8449b-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"ae22fda2-42ce-4b9e-9daf-00e886b8449b\") " pod="openstack/tempest-tests-tempest" Nov 27 17:29:17 crc kubenswrapper[4954]: I1127 17:29:17.855564 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ae22fda2-42ce-4b9e-9daf-00e886b8449b-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: 
\"ae22fda2-42ce-4b9e-9daf-00e886b8449b\") " pod="openstack/tempest-tests-tempest" Nov 27 17:29:17 crc kubenswrapper[4954]: I1127 17:29:17.859741 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ae22fda2-42ce-4b9e-9daf-00e886b8449b-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"ae22fda2-42ce-4b9e-9daf-00e886b8449b\") " pod="openstack/tempest-tests-tempest" Nov 27 17:29:17 crc kubenswrapper[4954]: I1127 17:29:17.868098 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpp5r\" (UniqueName: \"kubernetes.io/projected/ae22fda2-42ce-4b9e-9daf-00e886b8449b-kube-api-access-bpp5r\") pod \"tempest-tests-tempest\" (UID: \"ae22fda2-42ce-4b9e-9daf-00e886b8449b\") " pod="openstack/tempest-tests-tempest" Nov 27 17:29:17 crc kubenswrapper[4954]: I1127 17:29:17.877675 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"ae22fda2-42ce-4b9e-9daf-00e886b8449b\") " pod="openstack/tempest-tests-tempest" Nov 27 17:29:18 crc kubenswrapper[4954]: I1127 17:29:18.168218 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Nov 27 17:29:18 crc kubenswrapper[4954]: I1127 17:29:18.600681 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Nov 27 17:29:18 crc kubenswrapper[4954]: I1127 17:29:18.779351 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ae22fda2-42ce-4b9e-9daf-00e886b8449b","Type":"ContainerStarted","Data":"240b1e9786ec018f883d01756f2cb8ac13a79e48a766c3daaa672c15351c95d2"} Nov 27 17:29:23 crc kubenswrapper[4954]: I1127 17:29:23.688031 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 17:29:23 crc kubenswrapper[4954]: I1127 17:29:23.688570 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 17:29:53 crc kubenswrapper[4954]: I1127 17:29:53.687871 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 17:29:53 crc kubenswrapper[4954]: I1127 17:29:53.688575 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 17:29:53 crc kubenswrapper[4954]: I1127 17:29:53.688638 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-699qq" Nov 27 17:29:53 crc kubenswrapper[4954]: I1127 
17:29:53.689720 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ef478800b18de3e4c1454aea1f2bd63c211721709f73b298ab96243c53a363e8"} pod="openshift-machine-config-operator/machine-config-daemon-699qq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 27 17:29:53 crc kubenswrapper[4954]: I1127 17:29:53.689774 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" containerID="cri-o://ef478800b18de3e4c1454aea1f2bd63c211721709f73b298ab96243c53a363e8" gracePeriod=600 Nov 27 17:29:54 crc kubenswrapper[4954]: I1127 17:29:54.215343 4954 generic.go:334] "Generic (PLEG): container finished" podID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerID="ef478800b18de3e4c1454aea1f2bd63c211721709f73b298ab96243c53a363e8" exitCode=0 Nov 27 17:29:54 crc kubenswrapper[4954]: I1127 17:29:54.215398 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" event={"ID":"33a80574-7c60-4f19-985b-3ee313cb7bcd","Type":"ContainerDied","Data":"ef478800b18de3e4c1454aea1f2bd63c211721709f73b298ab96243c53a363e8"} Nov 27 17:29:54 crc kubenswrapper[4954]: I1127 17:29:54.215438 4954 scope.go:117] "RemoveContainer" containerID="19635e76ffe7804bff520f008326431eec2d15f97844ea4cb7e79bccab8f66ca" Nov 27 17:29:54 crc kubenswrapper[4954]: E1127 17:29:54.297635 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:29:54 crc kubenswrapper[4954]: E1127 17:29:54.375904 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Nov 27 17:29:54 crc kubenswrapper[4954]: E1127 17:29:54.376125 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bpp5r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(ae22fda2-42ce-4b9e-9daf-00e886b8449b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 27 17:29:54 crc kubenswrapper[4954]: E1127 17:29:54.378123 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" 
podUID="ae22fda2-42ce-4b9e-9daf-00e886b8449b" Nov 27 17:29:55 crc kubenswrapper[4954]: I1127 17:29:55.226106 4954 scope.go:117] "RemoveContainer" containerID="ef478800b18de3e4c1454aea1f2bd63c211721709f73b298ab96243c53a363e8" Nov 27 17:29:55 crc kubenswrapper[4954]: E1127 17:29:55.226440 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:29:55 crc kubenswrapper[4954]: E1127 17:29:55.227019 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="ae22fda2-42ce-4b9e-9daf-00e886b8449b" Nov 27 17:30:00 crc kubenswrapper[4954]: I1127 17:30:00.148396 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404410-n22m2"] Nov 27 17:30:00 crc kubenswrapper[4954]: I1127 17:30:00.150653 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404410-n22m2" Nov 27 17:30:00 crc kubenswrapper[4954]: I1127 17:30:00.153649 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 27 17:30:00 crc kubenswrapper[4954]: I1127 17:30:00.154188 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 27 17:30:00 crc kubenswrapper[4954]: I1127 17:30:00.161640 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404410-n22m2"] Nov 27 17:30:00 crc kubenswrapper[4954]: I1127 17:30:00.302721 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2-config-volume\") pod \"collect-profiles-29404410-n22m2\" (UID: \"bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404410-n22m2" Nov 27 17:30:00 crc kubenswrapper[4954]: I1127 17:30:00.302946 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2-secret-volume\") pod \"collect-profiles-29404410-n22m2\" (UID: \"bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404410-n22m2" Nov 27 17:30:00 crc kubenswrapper[4954]: I1127 17:30:00.303014 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7x2g\" (UniqueName: \"kubernetes.io/projected/bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2-kube-api-access-k7x2g\") pod \"collect-profiles-29404410-n22m2\" (UID: \"bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404410-n22m2" Nov 27 17:30:00 crc kubenswrapper[4954]: I1127 17:30:00.404811 4954 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2-secret-volume\") pod \"collect-profiles-29404410-n22m2\" (UID: \"bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404410-n22m2" Nov 27 17:30:00 crc kubenswrapper[4954]: I1127 17:30:00.404881 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7x2g\" (UniqueName: \"kubernetes.io/projected/bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2-kube-api-access-k7x2g\") pod \"collect-profiles-29404410-n22m2\" (UID: \"bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404410-n22m2" Nov 27 17:30:00 crc kubenswrapper[4954]: I1127 17:30:00.405041 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2-config-volume\") pod \"collect-profiles-29404410-n22m2\" (UID: \"bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404410-n22m2" Nov 27 17:30:00 crc kubenswrapper[4954]: I1127 17:30:00.406052 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2-config-volume\") pod \"collect-profiles-29404410-n22m2\" (UID: \"bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404410-n22m2" Nov 27 17:30:00 crc kubenswrapper[4954]: I1127 17:30:00.415334 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2-secret-volume\") pod \"collect-profiles-29404410-n22m2\" (UID: \"bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404410-n22m2" Nov 27 17:30:00 crc kubenswrapper[4954]: I1127 17:30:00.428810 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7x2g\" (UniqueName: \"kubernetes.io/projected/bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2-kube-api-access-k7x2g\") pod \"collect-profiles-29404410-n22m2\" (UID: \"bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404410-n22m2" Nov 27 17:30:00 crc kubenswrapper[4954]: I1127 17:30:00.472780 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404410-n22m2" Nov 27 17:30:00 crc kubenswrapper[4954]: I1127 17:30:00.972330 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404410-n22m2"] Nov 27 17:30:00 crc kubenswrapper[4954]: W1127 17:30:00.976292 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcc0c6eb_807f_4398_9aaa_dbc452e3bfb2.slice/crio-c27fa6ddb32b15295197f2fa6ab316a4b6684acb77a336d574cb5204da8a5140 WatchSource:0}: Error finding container c27fa6ddb32b15295197f2fa6ab316a4b6684acb77a336d574cb5204da8a5140: Status 404 returned error can't find the container with id c27fa6ddb32b15295197f2fa6ab316a4b6684acb77a336d574cb5204da8a5140 Nov 27 17:30:01 crc kubenswrapper[4954]: I1127 17:30:01.274433 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404410-n22m2" event={"ID":"bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2","Type":"ContainerStarted","Data":"6ae70634b10612c27ee228ecc24993da9e7db7eed21e6603e68dd0c0e72aedd5"} Nov 27 17:30:01 crc kubenswrapper[4954]: I1127 17:30:01.274816 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404410-n22m2" event={"ID":"bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2","Type":"ContainerStarted","Data":"c27fa6ddb32b15295197f2fa6ab316a4b6684acb77a336d574cb5204da8a5140"} Nov 27 17:30:01 crc kubenswrapper[4954]: I1127 17:30:01.290866 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29404410-n22m2" podStartSLOduration=1.290847439 podStartE2EDuration="1.290847439s" podCreationTimestamp="2025-11-27 17:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 17:30:01.288947473 +0000 UTC m=+3113.306387773" watchObservedRunningTime="2025-11-27 17:30:01.290847439 +0000 UTC m=+3113.308287739" Nov 27 17:30:02 crc kubenswrapper[4954]: I1127 17:30:02.288884 4954 generic.go:334] "Generic (PLEG): container finished" podID="bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2" containerID="6ae70634b10612c27ee228ecc24993da9e7db7eed21e6603e68dd0c0e72aedd5" exitCode=0 Nov 27 17:30:02 crc kubenswrapper[4954]: I1127 17:30:02.288939 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404410-n22m2" event={"ID":"bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2","Type":"ContainerDied","Data":"6ae70634b10612c27ee228ecc24993da9e7db7eed21e6603e68dd0c0e72aedd5"} Nov 27 17:30:03 crc kubenswrapper[4954]: I1127 17:30:03.631143 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404410-n22m2" Nov 27 17:30:03 crc kubenswrapper[4954]: I1127 17:30:03.685471 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2-config-volume\") pod \"bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2\" (UID: \"bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2\") " Nov 27 17:30:03 crc kubenswrapper[4954]: I1127 17:30:03.685906 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2-secret-volume\") pod \"bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2\" (UID: \"bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2\") " Nov 27 17:30:03 crc kubenswrapper[4954]: I1127 17:30:03.686164 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7x2g\" (UniqueName: \"kubernetes.io/projected/bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2-kube-api-access-k7x2g\") pod \"bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2\" (UID: \"bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2\") " Nov 27 17:30:03 crc kubenswrapper[4954]: I1127 17:30:03.686401 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2-config-volume" (OuterVolumeSpecName: "config-volume") pod "bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2" (UID: "bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:30:03 crc kubenswrapper[4954]: I1127 17:30:03.686822 4954 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2-config-volume\") on node \"crc\" DevicePath \"\"" Nov 27 17:30:03 crc kubenswrapper[4954]: I1127 17:30:03.691661 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2" (UID: "bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:30:03 crc kubenswrapper[4954]: I1127 17:30:03.692811 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2-kube-api-access-k7x2g" (OuterVolumeSpecName: "kube-api-access-k7x2g") pod "bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2" (UID: "bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2"). InnerVolumeSpecName "kube-api-access-k7x2g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:30:03 crc kubenswrapper[4954]: I1127 17:30:03.789549 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7x2g\" (UniqueName: \"kubernetes.io/projected/bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2-kube-api-access-k7x2g\") on node \"crc\" DevicePath \"\"" Nov 27 17:30:03 crc kubenswrapper[4954]: I1127 17:30:03.789625 4954 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 27 17:30:04 crc kubenswrapper[4954]: I1127 17:30:04.308382 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404410-n22m2" event={"ID":"bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2","Type":"ContainerDied","Data":"c27fa6ddb32b15295197f2fa6ab316a4b6684acb77a336d574cb5204da8a5140"} Nov 27 17:30:04 crc kubenswrapper[4954]: I1127 17:30:04.308444 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404410-n22m2" Nov 27 17:30:04 crc kubenswrapper[4954]: I1127 17:30:04.308449 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c27fa6ddb32b15295197f2fa6ab316a4b6684acb77a336d574cb5204da8a5140" Nov 27 17:30:04 crc kubenswrapper[4954]: I1127 17:30:04.387161 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404365-nqxnw"] Nov 27 17:30:04 crc kubenswrapper[4954]: I1127 17:30:04.397414 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404365-nqxnw"] Nov 27 17:30:04 crc kubenswrapper[4954]: I1127 17:30:04.672336 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc84ac35-9748-4149-a879-fd4aa19ab5fd" path="/var/lib/kubelet/pods/bc84ac35-9748-4149-a879-fd4aa19ab5fd/volumes" Nov 27 17:30:08 crc kubenswrapper[4954]: I1127 17:30:08.671022 4954 scope.go:117] "RemoveContainer" containerID="ef478800b18de3e4c1454aea1f2bd63c211721709f73b298ab96243c53a363e8" Nov 27 17:30:08 crc kubenswrapper[4954]: E1127 17:30:08.671938 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:30:09 crc kubenswrapper[4954]: I1127 17:30:09.264367 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Nov 27 17:30:10 crc kubenswrapper[4954]: I1127 17:30:10.375827 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ae22fda2-42ce-4b9e-9daf-00e886b8449b","Type":"ContainerStarted","Data":"75ddd3901656912d678bcd71d06e969c89f6d1327310bdf3f394221b898e3dd2"} Nov 27 17:30:10 crc kubenswrapper[4954]: I1127 17:30:10.404075 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.752240556 podStartE2EDuration="54.404052346s" podCreationTimestamp="2025-11-27 17:29:16 +0000 UTC" firstStartedPulling="2025-11-27 17:29:18.609345289 +0000 UTC m=+3070.626785589" 
lastFinishedPulling="2025-11-27 17:30:09.261157079 +0000 UTC m=+3121.278597379" observedRunningTime="2025-11-27 17:30:10.395165522 +0000 UTC m=+3122.412605832" watchObservedRunningTime="2025-11-27 17:30:10.404052346 +0000 UTC m=+3122.421492646" Nov 27 17:30:19 crc kubenswrapper[4954]: I1127 17:30:19.662041 4954 scope.go:117] "RemoveContainer" containerID="ef478800b18de3e4c1454aea1f2bd63c211721709f73b298ab96243c53a363e8" Nov 27 17:30:19 crc kubenswrapper[4954]: E1127 17:30:19.664061 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:30:31 crc kubenswrapper[4954]: I1127 17:30:31.662526 4954 scope.go:117] "RemoveContainer" containerID="ef478800b18de3e4c1454aea1f2bd63c211721709f73b298ab96243c53a363e8" Nov 27 17:30:31 crc kubenswrapper[4954]: E1127 17:30:31.663463 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:30:42 crc kubenswrapper[4954]: I1127 17:30:42.662154 4954 scope.go:117] "RemoveContainer" containerID="ef478800b18de3e4c1454aea1f2bd63c211721709f73b298ab96243c53a363e8" Nov 27 17:30:42 crc kubenswrapper[4954]: E1127 17:30:42.662879 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:30:47 crc kubenswrapper[4954]: I1127 17:30:47.052797 4954 scope.go:117] "RemoveContainer" containerID="60d34756ad0eebc3153c05ba00e39e84d43c1f80d5cbfccd301ba32a6a47890a" Nov 27 17:30:54 crc kubenswrapper[4954]: I1127 17:30:54.662953 4954 scope.go:117] "RemoveContainer" containerID="ef478800b18de3e4c1454aea1f2bd63c211721709f73b298ab96243c53a363e8" Nov 27 17:30:54 crc kubenswrapper[4954]: E1127 17:30:54.664273 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:31:07 crc kubenswrapper[4954]: I1127 17:31:07.662263 4954 scope.go:117] "RemoveContainer" containerID="ef478800b18de3e4c1454aea1f2bd63c211721709f73b298ab96243c53a363e8" Nov 27 17:31:07 crc kubenswrapper[4954]: E1127 17:31:07.663524 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:31:19 crc kubenswrapper[4954]: I1127 17:31:19.662615 4954 scope.go:117] "RemoveContainer" containerID="ef478800b18de3e4c1454aea1f2bd63c211721709f73b298ab96243c53a363e8" Nov 27 17:31:19 crc kubenswrapper[4954]: E1127 17:31:19.663321 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:31:30 crc kubenswrapper[4954]: I1127 17:31:30.663858 4954 scope.go:117] "RemoveContainer" containerID="ef478800b18de3e4c1454aea1f2bd63c211721709f73b298ab96243c53a363e8" Nov 27 17:31:30 crc kubenswrapper[4954]: E1127 17:31:30.665084 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:31:43 crc kubenswrapper[4954]: I1127 17:31:43.661865 4954 scope.go:117] "RemoveContainer" containerID="ef478800b18de3e4c1454aea1f2bd63c211721709f73b298ab96243c53a363e8" Nov 27 17:31:43 crc kubenswrapper[4954]: E1127 17:31:43.662748 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:31:58 crc kubenswrapper[4954]: I1127 17:31:58.670136 4954 scope.go:117] "RemoveContainer" containerID="ef478800b18de3e4c1454aea1f2bd63c211721709f73b298ab96243c53a363e8" Nov 27 17:31:58 crc kubenswrapper[4954]: E1127 17:31:58.671434 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:32:13 crc kubenswrapper[4954]: I1127 17:32:13.662872 4954 scope.go:117] "RemoveContainer" containerID="ef478800b18de3e4c1454aea1f2bd63c211721709f73b298ab96243c53a363e8" Nov 27 17:32:13 crc kubenswrapper[4954]: E1127 17:32:13.663726 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:32:28 crc kubenswrapper[4954]: I1127 17:32:28.676219 4954 scope.go:117] "RemoveContainer" containerID="ef478800b18de3e4c1454aea1f2bd63c211721709f73b298ab96243c53a363e8" Nov 27 17:32:28 crc kubenswrapper[4954]: E1127 17:32:28.677081 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:32:43 crc kubenswrapper[4954]: I1127 17:32:43.662206 4954 scope.go:117] "RemoveContainer" containerID="ef478800b18de3e4c1454aea1f2bd63c211721709f73b298ab96243c53a363e8" Nov 27 17:32:43 crc kubenswrapper[4954]: E1127 17:32:43.663133 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:32:47 crc kubenswrapper[4954]: I1127 17:32:47.135486 4954 scope.go:117] "RemoveContainer" containerID="38c9fa49421e4db0e9cac3f3f6f3c693910bca9000c0923e879b2a1ef2e16383" Nov 27 17:32:47 crc kubenswrapper[4954]: I1127 17:32:47.161501 4954 scope.go:117] "RemoveContainer" containerID="f6ebf4cd26d5552ee44d48c596536ba6a6de397229c0b57f318fe26dcad03497" Nov 27 17:32:47 crc kubenswrapper[4954]: I1127 17:32:47.210413 4954 scope.go:117] "RemoveContainer" containerID="544c12f1213e2a4e5006334e3275096fd605ca5a77c764ad2e37375bdca163ea" Nov 27 17:32:58 crc kubenswrapper[4954]: I1127 17:32:58.669513 4954 scope.go:117] "RemoveContainer" containerID="ef478800b18de3e4c1454aea1f2bd63c211721709f73b298ab96243c53a363e8" Nov 27 17:32:58 crc kubenswrapper[4954]: E1127 17:32:58.670206 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:33:12 crc kubenswrapper[4954]: I1127 17:33:12.663105 4954 scope.go:117] "RemoveContainer" containerID="ef478800b18de3e4c1454aea1f2bd63c211721709f73b298ab96243c53a363e8" Nov 27 17:33:12 crc kubenswrapper[4954]: E1127 17:33:12.664094 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:33:27 crc kubenswrapper[4954]: I1127 17:33:27.662394 4954 scope.go:117] "RemoveContainer" containerID="ef478800b18de3e4c1454aea1f2bd63c211721709f73b298ab96243c53a363e8" Nov 27 17:33:27 crc 
kubenswrapper[4954]: E1127 17:33:27.663274 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:33:38 crc kubenswrapper[4954]: I1127 17:33:38.668705 4954 scope.go:117] "RemoveContainer" containerID="ef478800b18de3e4c1454aea1f2bd63c211721709f73b298ab96243c53a363e8" Nov 27 17:33:38 crc kubenswrapper[4954]: E1127 17:33:38.669719 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:33:50 crc kubenswrapper[4954]: I1127 17:33:50.662130 4954 scope.go:117] "RemoveContainer" containerID="ef478800b18de3e4c1454aea1f2bd63c211721709f73b298ab96243c53a363e8" Nov 27 17:33:50 crc kubenswrapper[4954]: E1127 17:33:50.663258 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:33:57 crc kubenswrapper[4954]: I1127 17:33:57.666117 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7w46s"] Nov 27 17:33:57 crc kubenswrapper[4954]: E1127 17:33:57.667160 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2" containerName="collect-profiles" Nov 27 17:33:57 crc kubenswrapper[4954]: I1127 17:33:57.667174 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2" containerName="collect-profiles" Nov 27 17:33:57 crc kubenswrapper[4954]: I1127 17:33:57.667393 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcc0c6eb-807f-4398-9aaa-dbc452e3bfb2" containerName="collect-profiles" Nov 27 17:33:57 crc kubenswrapper[4954]: I1127 17:33:57.675120 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7w46s" Nov 27 17:33:57 crc kubenswrapper[4954]: I1127 17:33:57.694901 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7w46s"] Nov 27 17:33:57 crc kubenswrapper[4954]: I1127 17:33:57.775077 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t8vc\" (UniqueName: \"kubernetes.io/projected/e8d28e45-f0d4-4a77-bc9e-28d532f539c8-kube-api-access-2t8vc\") pod \"redhat-operators-7w46s\" (UID: \"e8d28e45-f0d4-4a77-bc9e-28d532f539c8\") " pod="openshift-marketplace/redhat-operators-7w46s" Nov 27 17:33:57 crc kubenswrapper[4954]: I1127 17:33:57.775251 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8d28e45-f0d4-4a77-bc9e-28d532f539c8-utilities\") pod \"redhat-operators-7w46s\" (UID: \"e8d28e45-f0d4-4a77-bc9e-28d532f539c8\") " pod="openshift-marketplace/redhat-operators-7w46s" Nov 27 17:33:57 crc kubenswrapper[4954]: I1127 17:33:57.776320 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8d28e45-f0d4-4a77-bc9e-28d532f539c8-catalog-content\") pod \"redhat-operators-7w46s\" (UID: \"e8d28e45-f0d4-4a77-bc9e-28d532f539c8\") " pod="openshift-marketplace/redhat-operators-7w46s" Nov 27 17:33:57 crc kubenswrapper[4954]: I1127 17:33:57.879528 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8d28e45-f0d4-4a77-bc9e-28d532f539c8-utilities\") pod \"redhat-operators-7w46s\" (UID: \"e8d28e45-f0d4-4a77-bc9e-28d532f539c8\") " pod="openshift-marketplace/redhat-operators-7w46s" Nov 27 17:33:57 crc kubenswrapper[4954]: I1127 17:33:57.879611 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8d28e45-f0d4-4a77-bc9e-28d532f539c8-catalog-content\") pod \"redhat-operators-7w46s\" (UID: \"e8d28e45-f0d4-4a77-bc9e-28d532f539c8\") " pod="openshift-marketplace/redhat-operators-7w46s" Nov 27 17:33:57 crc kubenswrapper[4954]: I1127 17:33:57.879788 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2t8vc\" (UniqueName: \"kubernetes.io/projected/e8d28e45-f0d4-4a77-bc9e-28d532f539c8-kube-api-access-2t8vc\") pod \"redhat-operators-7w46s\" (UID: \"e8d28e45-f0d4-4a77-bc9e-28d532f539c8\") " pod="openshift-marketplace/redhat-operators-7w46s" Nov 27 17:33:57 crc kubenswrapper[4954]: I1127 17:33:57.880202 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8d28e45-f0d4-4a77-bc9e-28d532f539c8-utilities\") pod \"redhat-operators-7w46s\" (UID: \"e8d28e45-f0d4-4a77-bc9e-28d532f539c8\") " pod="openshift-marketplace/redhat-operators-7w46s" Nov 27 17:33:57 crc kubenswrapper[4954]: I1127 17:33:57.880299 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8d28e45-f0d4-4a77-bc9e-28d532f539c8-catalog-content\") pod \"redhat-operators-7w46s\" (UID: \"e8d28e45-f0d4-4a77-bc9e-28d532f539c8\") " pod="openshift-marketplace/redhat-operators-7w46s" Nov 27 17:33:57 crc kubenswrapper[4954]: I1127 17:33:57.905498 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-2t8vc\" (UniqueName: \"kubernetes.io/projected/e8d28e45-f0d4-4a77-bc9e-28d532f539c8-kube-api-access-2t8vc\") pod \"redhat-operators-7w46s\" (UID: \"e8d28e45-f0d4-4a77-bc9e-28d532f539c8\") " pod="openshift-marketplace/redhat-operators-7w46s" Nov 27 17:33:58 crc kubenswrapper[4954]: I1127 17:33:58.003669 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7w46s" Nov 27 17:33:58 crc kubenswrapper[4954]: I1127 17:33:58.461531 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7w46s"] Nov 27 17:33:58 crc kubenswrapper[4954]: I1127 17:33:58.653939 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7w46s" event={"ID":"e8d28e45-f0d4-4a77-bc9e-28d532f539c8","Type":"ContainerStarted","Data":"9fbbce1e504926ad07468efdfda120f270197d5845b449f0fac99ff36834587a"} Nov 27 17:33:59 crc kubenswrapper[4954]: I1127 17:33:59.664825 4954 generic.go:334] "Generic (PLEG): container finished" podID="e8d28e45-f0d4-4a77-bc9e-28d532f539c8" containerID="d15dad0901d4331e7e8859a3816990495357c97fb8acc33bdfb238a2c22d9f94" exitCode=0 Nov 27 17:33:59 crc kubenswrapper[4954]: I1127 17:33:59.664880 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7w46s" event={"ID":"e8d28e45-f0d4-4a77-bc9e-28d532f539c8","Type":"ContainerDied","Data":"d15dad0901d4331e7e8859a3816990495357c97fb8acc33bdfb238a2c22d9f94"} Nov 27 17:33:59 crc kubenswrapper[4954]: I1127 17:33:59.667176 4954 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 27 17:34:01 crc kubenswrapper[4954]: I1127 17:34:01.685916 4954 generic.go:334] "Generic (PLEG): container finished" podID="e8d28e45-f0d4-4a77-bc9e-28d532f539c8" containerID="7c695b75d12c3271d408c9f2d9f3f2696daa476456a35de03e270fc82c54d21d" exitCode=0 Nov 27 17:34:01 crc kubenswrapper[4954]: I1127 17:34:01.685958 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7w46s" event={"ID":"e8d28e45-f0d4-4a77-bc9e-28d532f539c8","Type":"ContainerDied","Data":"7c695b75d12c3271d408c9f2d9f3f2696daa476456a35de03e270fc82c54d21d"} Nov 27 17:34:02 crc kubenswrapper[4954]: I1127 17:34:02.663241 4954 scope.go:117] "RemoveContainer" containerID="ef478800b18de3e4c1454aea1f2bd63c211721709f73b298ab96243c53a363e8" Nov 27 17:34:02 crc kubenswrapper[4954]: E1127 17:34:02.664521 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:34:02 crc kubenswrapper[4954]: I1127 17:34:02.702559 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7w46s" event={"ID":"e8d28e45-f0d4-4a77-bc9e-28d532f539c8","Type":"ContainerStarted","Data":"cecd0c8977adb3a048cb3a82ec5506c844eb4fae89aa251d0a82f81018760848"} Nov 27 17:34:02 crc kubenswrapper[4954]: I1127 17:34:02.723810 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7w46s" podStartSLOduration=3.287195994 podStartE2EDuration="5.723791863s" podCreationTimestamp="2025-11-27 17:33:57 +0000 UTC" 
firstStartedPulling="2025-11-27 17:33:59.666894714 +0000 UTC m=+3351.684335014" lastFinishedPulling="2025-11-27 17:34:02.103490583 +0000 UTC m=+3354.120930883" observedRunningTime="2025-11-27 17:34:02.720440732 +0000 UTC m=+3354.737881032" watchObservedRunningTime="2025-11-27 17:34:02.723791863 +0000 UTC m=+3354.741232163" Nov 27 17:34:08 crc kubenswrapper[4954]: I1127 17:34:08.004208 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7w46s" Nov 27 17:34:08 crc kubenswrapper[4954]: I1127 17:34:08.004825 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7w46s" Nov 27 17:34:08 crc kubenswrapper[4954]: I1127 17:34:08.050425 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7w46s" Nov 27 17:34:08 crc kubenswrapper[4954]: I1127 17:34:08.796439 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7w46s" Nov 27 17:34:08 crc kubenswrapper[4954]: I1127 17:34:08.839747 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7w46s"] Nov 27 17:34:10 crc kubenswrapper[4954]: I1127 17:34:10.767612 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7w46s" podUID="e8d28e45-f0d4-4a77-bc9e-28d532f539c8" containerName="registry-server" containerID="cri-o://cecd0c8977adb3a048cb3a82ec5506c844eb4fae89aa251d0a82f81018760848" gracePeriod=2 Nov 27 17:34:11 crc kubenswrapper[4954]: I1127 17:34:11.302703 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7w46s" Nov 27 17:34:11 crc kubenswrapper[4954]: I1127 17:34:11.372095 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8d28e45-f0d4-4a77-bc9e-28d532f539c8-catalog-content\") pod \"e8d28e45-f0d4-4a77-bc9e-28d532f539c8\" (UID: \"e8d28e45-f0d4-4a77-bc9e-28d532f539c8\") " Nov 27 17:34:11 crc kubenswrapper[4954]: I1127 17:34:11.372565 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2t8vc\" (UniqueName: \"kubernetes.io/projected/e8d28e45-f0d4-4a77-bc9e-28d532f539c8-kube-api-access-2t8vc\") pod \"e8d28e45-f0d4-4a77-bc9e-28d532f539c8\" (UID: \"e8d28e45-f0d4-4a77-bc9e-28d532f539c8\") " Nov 27 17:34:11 crc kubenswrapper[4954]: I1127 17:34:11.372773 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8d28e45-f0d4-4a77-bc9e-28d532f539c8-utilities\") pod \"e8d28e45-f0d4-4a77-bc9e-28d532f539c8\" (UID: \"e8d28e45-f0d4-4a77-bc9e-28d532f539c8\") " Nov 27 17:34:11 crc kubenswrapper[4954]: I1127 17:34:11.373875 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8d28e45-f0d4-4a77-bc9e-28d532f539c8-utilities" (OuterVolumeSpecName: "utilities") pod "e8d28e45-f0d4-4a77-bc9e-28d532f539c8" (UID: "e8d28e45-f0d4-4a77-bc9e-28d532f539c8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:34:11 crc kubenswrapper[4954]: I1127 17:34:11.378921 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8d28e45-f0d4-4a77-bc9e-28d532f539c8-kube-api-access-2t8vc" (OuterVolumeSpecName: "kube-api-access-2t8vc") pod "e8d28e45-f0d4-4a77-bc9e-28d532f539c8" (UID: "e8d28e45-f0d4-4a77-bc9e-28d532f539c8"). InnerVolumeSpecName "kube-api-access-2t8vc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:34:11 crc kubenswrapper[4954]: I1127 17:34:11.475205 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2t8vc\" (UniqueName: \"kubernetes.io/projected/e8d28e45-f0d4-4a77-bc9e-28d532f539c8-kube-api-access-2t8vc\") on node \"crc\" DevicePath \"\"" Nov 27 17:34:11 crc kubenswrapper[4954]: I1127 17:34:11.475246 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8d28e45-f0d4-4a77-bc9e-28d532f539c8-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 17:34:11 crc kubenswrapper[4954]: I1127 17:34:11.780076 4954 generic.go:334] "Generic (PLEG): container finished" podID="e8d28e45-f0d4-4a77-bc9e-28d532f539c8" containerID="cecd0c8977adb3a048cb3a82ec5506c844eb4fae89aa251d0a82f81018760848" exitCode=0 Nov 27 17:34:11 crc kubenswrapper[4954]: I1127 17:34:11.780202 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7w46s" event={"ID":"e8d28e45-f0d4-4a77-bc9e-28d532f539c8","Type":"ContainerDied","Data":"cecd0c8977adb3a048cb3a82ec5506c844eb4fae89aa251d0a82f81018760848"} Nov 27 17:34:11 crc kubenswrapper[4954]: I1127 17:34:11.780241 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7w46s" event={"ID":"e8d28e45-f0d4-4a77-bc9e-28d532f539c8","Type":"ContainerDied","Data":"9fbbce1e504926ad07468efdfda120f270197d5845b449f0fac99ff36834587a"} Nov 27 17:34:11 crc kubenswrapper[4954]: I1127 17:34:11.780295 4954 scope.go:117] "RemoveContainer" containerID="cecd0c8977adb3a048cb3a82ec5506c844eb4fae89aa251d0a82f81018760848" Nov 27 17:34:11 crc kubenswrapper[4954]: I1127 17:34:11.780441 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7w46s" Nov 27 17:34:11 crc kubenswrapper[4954]: I1127 17:34:11.804076 4954 scope.go:117] "RemoveContainer" containerID="7c695b75d12c3271d408c9f2d9f3f2696daa476456a35de03e270fc82c54d21d" Nov 27 17:34:11 crc kubenswrapper[4954]: I1127 17:34:11.829226 4954 scope.go:117] "RemoveContainer" containerID="d15dad0901d4331e7e8859a3816990495357c97fb8acc33bdfb238a2c22d9f94" Nov 27 17:34:11 crc kubenswrapper[4954]: I1127 17:34:11.868901 4954 scope.go:117] "RemoveContainer" containerID="cecd0c8977adb3a048cb3a82ec5506c844eb4fae89aa251d0a82f81018760848" Nov 27 17:34:11 crc kubenswrapper[4954]: E1127 17:34:11.869499 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cecd0c8977adb3a048cb3a82ec5506c844eb4fae89aa251d0a82f81018760848\": container with ID starting with cecd0c8977adb3a048cb3a82ec5506c844eb4fae89aa251d0a82f81018760848 not found: ID does not exist" containerID="cecd0c8977adb3a048cb3a82ec5506c844eb4fae89aa251d0a82f81018760848" Nov 27 17:34:11 crc kubenswrapper[4954]: I1127 17:34:11.869543 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cecd0c8977adb3a048cb3a82ec5506c844eb4fae89aa251d0a82f81018760848"} err="failed to get container status \"cecd0c8977adb3a048cb3a82ec5506c844eb4fae89aa251d0a82f81018760848\": rpc error: code = NotFound desc = could not find container \"cecd0c8977adb3a048cb3a82ec5506c844eb4fae89aa251d0a82f81018760848\": container with ID starting with cecd0c8977adb3a048cb3a82ec5506c844eb4fae89aa251d0a82f81018760848 not found: ID does not exist" Nov 27 17:34:11 crc kubenswrapper[4954]: I1127 17:34:11.869572 4954 scope.go:117] "RemoveContainer" containerID="7c695b75d12c3271d408c9f2d9f3f2696daa476456a35de03e270fc82c54d21d" Nov 27 17:34:11 crc kubenswrapper[4954]: E1127 17:34:11.870008 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c695b75d12c3271d408c9f2d9f3f2696daa476456a35de03e270fc82c54d21d\": container with ID starting with 7c695b75d12c3271d408c9f2d9f3f2696daa476456a35de03e270fc82c54d21d not found: ID does not exist" containerID="7c695b75d12c3271d408c9f2d9f3f2696daa476456a35de03e270fc82c54d21d" Nov 27 17:34:11 crc kubenswrapper[4954]: I1127 17:34:11.870029 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c695b75d12c3271d408c9f2d9f3f2696daa476456a35de03e270fc82c54d21d"} err="failed to get container status \"7c695b75d12c3271d408c9f2d9f3f2696daa476456a35de03e270fc82c54d21d\": rpc error: code = NotFound desc = could not find container \"7c695b75d12c3271d408c9f2d9f3f2696daa476456a35de03e270fc82c54d21d\": container with ID starting with 7c695b75d12c3271d408c9f2d9f3f2696daa476456a35de03e270fc82c54d21d not found: ID does not exist" Nov 27 17:34:11 crc kubenswrapper[4954]: I1127 17:34:11.870045 4954 scope.go:117] "RemoveContainer" containerID="d15dad0901d4331e7e8859a3816990495357c97fb8acc33bdfb238a2c22d9f94" Nov 27 17:34:11 crc kubenswrapper[4954]: E1127 17:34:11.870326 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d15dad0901d4331e7e8859a3816990495357c97fb8acc33bdfb238a2c22d9f94\": container with ID starting with d15dad0901d4331e7e8859a3816990495357c97fb8acc33bdfb238a2c22d9f94 not found: ID does not exist" containerID="d15dad0901d4331e7e8859a3816990495357c97fb8acc33bdfb238a2c22d9f94" 
Nov 27 17:34:11 crc kubenswrapper[4954]: I1127 17:34:11.870350 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d15dad0901d4331e7e8859a3816990495357c97fb8acc33bdfb238a2c22d9f94"} err="failed to get container status \"d15dad0901d4331e7e8859a3816990495357c97fb8acc33bdfb238a2c22d9f94\": rpc error: code = NotFound desc = could not find container \"d15dad0901d4331e7e8859a3816990495357c97fb8acc33bdfb238a2c22d9f94\": container with ID starting with d15dad0901d4331e7e8859a3816990495357c97fb8acc33bdfb238a2c22d9f94 not found: ID does not exist"
Nov 27 17:34:13 crc kubenswrapper[4954]: I1127 17:34:13.644310 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8d28e45-f0d4-4a77-bc9e-28d532f539c8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e8d28e45-f0d4-4a77-bc9e-28d532f539c8" (UID: "e8d28e45-f0d4-4a77-bc9e-28d532f539c8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 27 17:34:13 crc kubenswrapper[4954]: I1127 17:34:13.718404 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8d28e45-f0d4-4a77-bc9e-28d532f539c8-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 27 17:34:13 crc kubenswrapper[4954]: I1127 17:34:13.924769 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7w46s"]
Nov 27 17:34:13 crc kubenswrapper[4954]: I1127 17:34:13.941360 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7w46s"]
Nov 27 17:34:14 crc kubenswrapper[4954]: I1127 17:34:14.662454 4954 scope.go:117] "RemoveContainer" containerID="ef478800b18de3e4c1454aea1f2bd63c211721709f73b298ab96243c53a363e8"
Nov 27 17:34:14 crc kubenswrapper[4954]: E1127 17:34:14.663110 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd"
Nov 27 17:34:14 crc kubenswrapper[4954]: I1127 17:34:14.672031 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8d28e45-f0d4-4a77-bc9e-28d532f539c8" path="/var/lib/kubelet/pods/e8d28e45-f0d4-4a77-bc9e-28d532f539c8/volumes"
Nov 27 17:34:28 crc kubenswrapper[4954]: I1127 17:34:28.667895 4954 scope.go:117] "RemoveContainer" containerID="ef478800b18de3e4c1454aea1f2bd63c211721709f73b298ab96243c53a363e8"
Nov 27 17:34:28 crc kubenswrapper[4954]: E1127 17:34:28.668738 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd"
Nov 27 17:34:43 crc kubenswrapper[4954]: I1127 17:34:43.662326 4954 scope.go:117] "RemoveContainer" containerID="ef478800b18de3e4c1454aea1f2bd63c211721709f73b298ab96243c53a363e8"
Nov 27 17:34:43 crc kubenswrapper[4954]: E1127 17:34:43.663254 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd"
Nov 27 17:34:58 crc kubenswrapper[4954]: I1127 17:34:58.667843 4954 scope.go:117] "RemoveContainer" containerID="ef478800b18de3e4c1454aea1f2bd63c211721709f73b298ab96243c53a363e8"
Nov 27 17:34:59 crc kubenswrapper[4954]: I1127 17:34:59.676213 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" event={"ID":"33a80574-7c60-4f19-985b-3ee313cb7bcd","Type":"ContainerStarted","Data":"1f967779ed2a2b60156cf9c239c5283e4d72f05b387203f3a130ac9d15772cd9"}
Nov 27 17:37:23 crc kubenswrapper[4954]: I1127 17:37:23.687192 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 27 17:37:23 crc kubenswrapper[4954]: I1127 17:37:23.688294 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 27 17:37:53 crc kubenswrapper[4954]: I1127 17:37:53.687831 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 27 17:37:53 crc kubenswrapper[4954]: I1127 17:37:53.688395 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 27 17:38:23 crc kubenswrapper[4954]: I1127 17:38:23.687900 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 27 17:38:23 crc kubenswrapper[4954]: I1127 17:38:23.688477 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 27 17:38:23 crc kubenswrapper[4954]: I1127 17:38:23.688532 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-699qq"
Nov 27 17:38:23 crc kubenswrapper[4954]: I1127 17:38:23.689297 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1f967779ed2a2b60156cf9c239c5283e4d72f05b387203f3a130ac9d15772cd9"} pod="openshift-machine-config-operator/machine-config-daemon-699qq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 27 17:38:23 crc kubenswrapper[4954]: I1127 17:38:23.689351 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" containerID="cri-o://1f967779ed2a2b60156cf9c239c5283e4d72f05b387203f3a130ac9d15772cd9" gracePeriod=600
Nov 27 17:38:24 crc kubenswrapper[4954]: I1127 17:38:24.747241 4954 generic.go:334] "Generic (PLEG): container finished" podID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerID="1f967779ed2a2b60156cf9c239c5283e4d72f05b387203f3a130ac9d15772cd9" exitCode=0
Nov 27 17:38:24 crc kubenswrapper[4954]: I1127 17:38:24.747297 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" event={"ID":"33a80574-7c60-4f19-985b-3ee313cb7bcd","Type":"ContainerDied","Data":"1f967779ed2a2b60156cf9c239c5283e4d72f05b387203f3a130ac9d15772cd9"}
Nov 27 17:38:24 crc kubenswrapper[4954]: I1127 17:38:24.747780 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" event={"ID":"33a80574-7c60-4f19-985b-3ee313cb7bcd","Type":"ContainerStarted","Data":"72d14939d267f3263e2283bed9a7423259124d321383864fb9b0e804d14acba8"}
Nov 27 17:38:24 crc kubenswrapper[4954]: I1127 17:38:24.747800 4954 scope.go:117] "RemoveContainer" containerID="ef478800b18de3e4c1454aea1f2bd63c211721709f73b298ab96243c53a363e8"
Nov 27 17:39:09 crc kubenswrapper[4954]: I1127 17:39:09.144376 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6dnqj"]
Nov 27 17:39:09 crc kubenswrapper[4954]: E1127 17:39:09.148363 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8d28e45-f0d4-4a77-bc9e-28d532f539c8" containerName="registry-server"
Nov 27 17:39:09 crc kubenswrapper[4954]: I1127 17:39:09.148395 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8d28e45-f0d4-4a77-bc9e-28d532f539c8" containerName="registry-server"
Nov 27 17:39:09 crc kubenswrapper[4954]: E1127 17:39:09.148426 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8d28e45-f0d4-4a77-bc9e-28d532f539c8" containerName="extract-utilities"
Nov 27 17:39:09 crc kubenswrapper[4954]: I1127 17:39:09.148439 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8d28e45-f0d4-4a77-bc9e-28d532f539c8" containerName="extract-utilities"
Nov 27 17:39:09 crc kubenswrapper[4954]: E1127 17:39:09.148462 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8d28e45-f0d4-4a77-bc9e-28d532f539c8" containerName="extract-content"
Nov 27 17:39:09 crc kubenswrapper[4954]: I1127 17:39:09.148471 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8d28e45-f0d4-4a77-bc9e-28d532f539c8" containerName="extract-content"
Nov 27 17:39:09 crc kubenswrapper[4954]: I1127 17:39:09.148735 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8d28e45-f0d4-4a77-bc9e-28d532f539c8" containerName="registry-server"
Nov 27 17:39:09 crc kubenswrapper[4954]: I1127 17:39:09.150562 4954 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6dnqj" Nov 27 17:39:09 crc kubenswrapper[4954]: I1127 17:39:09.161936 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6dnqj"] Nov 27 17:39:09 crc kubenswrapper[4954]: I1127 17:39:09.206195 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/235c6400-d862-42c9-a57d-81e4fff335be-catalog-content\") pod \"redhat-marketplace-6dnqj\" (UID: \"235c6400-d862-42c9-a57d-81e4fff335be\") " pod="openshift-marketplace/redhat-marketplace-6dnqj" Nov 27 17:39:09 crc kubenswrapper[4954]: I1127 17:39:09.206246 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/235c6400-d862-42c9-a57d-81e4fff335be-utilities\") pod \"redhat-marketplace-6dnqj\" (UID: \"235c6400-d862-42c9-a57d-81e4fff335be\") " pod="openshift-marketplace/redhat-marketplace-6dnqj" Nov 27 17:39:09 crc kubenswrapper[4954]: I1127 17:39:09.206474 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzkrl\" (UniqueName: \"kubernetes.io/projected/235c6400-d862-42c9-a57d-81e4fff335be-kube-api-access-vzkrl\") pod \"redhat-marketplace-6dnqj\" (UID: \"235c6400-d862-42c9-a57d-81e4fff335be\") " pod="openshift-marketplace/redhat-marketplace-6dnqj" Nov 27 17:39:09 crc kubenswrapper[4954]: I1127 17:39:09.308480 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzkrl\" (UniqueName: \"kubernetes.io/projected/235c6400-d862-42c9-a57d-81e4fff335be-kube-api-access-vzkrl\") pod \"redhat-marketplace-6dnqj\" (UID: \"235c6400-d862-42c9-a57d-81e4fff335be\") " pod="openshift-marketplace/redhat-marketplace-6dnqj" Nov 27 17:39:09 crc kubenswrapper[4954]: I1127 17:39:09.308679 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/235c6400-d862-42c9-a57d-81e4fff335be-catalog-content\") pod \"redhat-marketplace-6dnqj\" (UID: \"235c6400-d862-42c9-a57d-81e4fff335be\") " pod="openshift-marketplace/redhat-marketplace-6dnqj" Nov 27 17:39:09 crc kubenswrapper[4954]: I1127 17:39:09.308707 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/235c6400-d862-42c9-a57d-81e4fff335be-utilities\") pod \"redhat-marketplace-6dnqj\" (UID: \"235c6400-d862-42c9-a57d-81e4fff335be\") " pod="openshift-marketplace/redhat-marketplace-6dnqj" Nov 27 17:39:09 crc kubenswrapper[4954]: I1127 17:39:09.309154 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/235c6400-d862-42c9-a57d-81e4fff335be-catalog-content\") pod \"redhat-marketplace-6dnqj\" (UID: \"235c6400-d862-42c9-a57d-81e4fff335be\") " pod="openshift-marketplace/redhat-marketplace-6dnqj" Nov 27 17:39:09 crc kubenswrapper[4954]: I1127 17:39:09.309212 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/235c6400-d862-42c9-a57d-81e4fff335be-utilities\") pod \"redhat-marketplace-6dnqj\" (UID: \"235c6400-d862-42c9-a57d-81e4fff335be\") " pod="openshift-marketplace/redhat-marketplace-6dnqj" Nov 27 17:39:09 crc kubenswrapper[4954]: I1127 17:39:09.329863 4954 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-vzkrl\" (UniqueName: \"kubernetes.io/projected/235c6400-d862-42c9-a57d-81e4fff335be-kube-api-access-vzkrl\") pod \"redhat-marketplace-6dnqj\" (UID: \"235c6400-d862-42c9-a57d-81e4fff335be\") " pod="openshift-marketplace/redhat-marketplace-6dnqj" Nov 27 17:39:09 crc kubenswrapper[4954]: I1127 17:39:09.484325 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6dnqj" Nov 27 17:39:09 crc kubenswrapper[4954]: I1127 17:39:09.950643 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6dnqj"] Nov 27 17:39:10 crc kubenswrapper[4954]: I1127 17:39:10.217757 4954 generic.go:334] "Generic (PLEG): container finished" podID="235c6400-d862-42c9-a57d-81e4fff335be" containerID="9e5687443c5f193ce7101c7a3066b4795970351a2197f40246c9a7869d3d4fd0" exitCode=0 Nov 27 17:39:10 crc kubenswrapper[4954]: I1127 17:39:10.217799 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6dnqj" event={"ID":"235c6400-d862-42c9-a57d-81e4fff335be","Type":"ContainerDied","Data":"9e5687443c5f193ce7101c7a3066b4795970351a2197f40246c9a7869d3d4fd0"} Nov 27 17:39:10 crc kubenswrapper[4954]: I1127 17:39:10.217822 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6dnqj" event={"ID":"235c6400-d862-42c9-a57d-81e4fff335be","Type":"ContainerStarted","Data":"cb02f3d4cc5a3b80c9fc28600da1d361ed74e5e76ef1ec11a9e02417e287ec89"} Nov 27 17:39:10 crc kubenswrapper[4954]: I1127 17:39:10.219526 4954 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 27 17:39:11 crc kubenswrapper[4954]: I1127 17:39:11.230860 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6dnqj" event={"ID":"235c6400-d862-42c9-a57d-81e4fff335be","Type":"ContainerStarted","Data":"8e72f43bed26b81ddf2ef539be9367c789875e19966a5f45ddd1070753951c5d"} Nov 27 17:39:12 crc kubenswrapper[4954]: I1127 17:39:12.240237 4954 generic.go:334] "Generic (PLEG): container finished" podID="235c6400-d862-42c9-a57d-81e4fff335be" containerID="8e72f43bed26b81ddf2ef539be9367c789875e19966a5f45ddd1070753951c5d" exitCode=0 Nov 27 17:39:12 crc kubenswrapper[4954]: I1127 17:39:12.240304 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6dnqj" event={"ID":"235c6400-d862-42c9-a57d-81e4fff335be","Type":"ContainerDied","Data":"8e72f43bed26b81ddf2ef539be9367c789875e19966a5f45ddd1070753951c5d"} Nov 27 17:39:13 crc kubenswrapper[4954]: I1127 17:39:13.251975 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6dnqj" event={"ID":"235c6400-d862-42c9-a57d-81e4fff335be","Type":"ContainerStarted","Data":"9cff7ab0851308262e35bc93366d85a79153d4f30768814c31000b391004197e"} Nov 27 17:39:13 crc kubenswrapper[4954]: I1127 17:39:13.274739 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6dnqj" podStartSLOduration=1.8000630549999999 podStartE2EDuration="4.274719921s" podCreationTimestamp="2025-11-27 17:39:09 +0000 UTC" firstStartedPulling="2025-11-27 17:39:10.219280582 +0000 UTC m=+3662.236720882" lastFinishedPulling="2025-11-27 17:39:12.693937438 +0000 UTC m=+3664.711377748" observedRunningTime="2025-11-27 17:39:13.266536263 +0000 UTC m=+3665.283976603" watchObservedRunningTime="2025-11-27 17:39:13.274719921 +0000 UTC 
m=+3665.292160221" Nov 27 17:39:19 crc kubenswrapper[4954]: I1127 17:39:19.485334 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6dnqj" Nov 27 17:39:19 crc kubenswrapper[4954]: I1127 17:39:19.485919 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6dnqj" Nov 27 17:39:19 crc kubenswrapper[4954]: I1127 17:39:19.536413 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6dnqj" Nov 27 17:39:20 crc kubenswrapper[4954]: I1127 17:39:20.370247 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6dnqj" Nov 27 17:39:20 crc kubenswrapper[4954]: I1127 17:39:20.416433 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6dnqj"] Nov 27 17:39:22 crc kubenswrapper[4954]: I1127 17:39:22.326239 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6dnqj" podUID="235c6400-d862-42c9-a57d-81e4fff335be" containerName="registry-server" containerID="cri-o://9cff7ab0851308262e35bc93366d85a79153d4f30768814c31000b391004197e" gracePeriod=2 Nov 27 17:39:22 crc kubenswrapper[4954]: I1127 17:39:22.758621 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6dnqj" Nov 27 17:39:22 crc kubenswrapper[4954]: I1127 17:39:22.961437 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzkrl\" (UniqueName: \"kubernetes.io/projected/235c6400-d862-42c9-a57d-81e4fff335be-kube-api-access-vzkrl\") pod \"235c6400-d862-42c9-a57d-81e4fff335be\" (UID: \"235c6400-d862-42c9-a57d-81e4fff335be\") " Nov 27 17:39:22 crc kubenswrapper[4954]: I1127 17:39:22.962344 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/235c6400-d862-42c9-a57d-81e4fff335be-utilities\") pod \"235c6400-d862-42c9-a57d-81e4fff335be\" (UID: \"235c6400-d862-42c9-a57d-81e4fff335be\") " Nov 27 17:39:22 crc kubenswrapper[4954]: I1127 17:39:22.962476 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/235c6400-d862-42c9-a57d-81e4fff335be-catalog-content\") pod \"235c6400-d862-42c9-a57d-81e4fff335be\" (UID: \"235c6400-d862-42c9-a57d-81e4fff335be\") " Nov 27 17:39:22 crc kubenswrapper[4954]: I1127 17:39:22.963226 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/235c6400-d862-42c9-a57d-81e4fff335be-utilities" (OuterVolumeSpecName: "utilities") pod "235c6400-d862-42c9-a57d-81e4fff335be" (UID: "235c6400-d862-42c9-a57d-81e4fff335be"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:39:22 crc kubenswrapper[4954]: I1127 17:39:22.967669 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/235c6400-d862-42c9-a57d-81e4fff335be-kube-api-access-vzkrl" (OuterVolumeSpecName: "kube-api-access-vzkrl") pod "235c6400-d862-42c9-a57d-81e4fff335be" (UID: "235c6400-d862-42c9-a57d-81e4fff335be"). InnerVolumeSpecName "kube-api-access-vzkrl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:39:22 crc kubenswrapper[4954]: I1127 17:39:22.981752 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/235c6400-d862-42c9-a57d-81e4fff335be-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "235c6400-d862-42c9-a57d-81e4fff335be" (UID: "235c6400-d862-42c9-a57d-81e4fff335be"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:39:23 crc kubenswrapper[4954]: I1127 17:39:23.064975 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/235c6400-d862-42c9-a57d-81e4fff335be-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 17:39:23 crc kubenswrapper[4954]: I1127 17:39:23.065001 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/235c6400-d862-42c9-a57d-81e4fff335be-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 17:39:23 crc kubenswrapper[4954]: I1127 17:39:23.065011 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzkrl\" (UniqueName: \"kubernetes.io/projected/235c6400-d862-42c9-a57d-81e4fff335be-kube-api-access-vzkrl\") on node \"crc\" DevicePath \"\"" Nov 27 17:39:23 crc kubenswrapper[4954]: I1127 17:39:23.337822 4954 generic.go:334] "Generic (PLEG): container finished" podID="235c6400-d862-42c9-a57d-81e4fff335be" containerID="9cff7ab0851308262e35bc93366d85a79153d4f30768814c31000b391004197e" exitCode=0 Nov 27 17:39:23 crc kubenswrapper[4954]: I1127 17:39:23.337884 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6dnqj" event={"ID":"235c6400-d862-42c9-a57d-81e4fff335be","Type":"ContainerDied","Data":"9cff7ab0851308262e35bc93366d85a79153d4f30768814c31000b391004197e"} Nov 27 17:39:23 crc kubenswrapper[4954]: I1127 17:39:23.337917 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6dnqj" event={"ID":"235c6400-d862-42c9-a57d-81e4fff335be","Type":"ContainerDied","Data":"cb02f3d4cc5a3b80c9fc28600da1d361ed74e5e76ef1ec11a9e02417e287ec89"} Nov 27 17:39:23 crc kubenswrapper[4954]: I1127 17:39:23.337913 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6dnqj" Nov 27 17:39:23 crc kubenswrapper[4954]: I1127 17:39:23.337930 4954 scope.go:117] "RemoveContainer" containerID="9cff7ab0851308262e35bc93366d85a79153d4f30768814c31000b391004197e" Nov 27 17:39:23 crc kubenswrapper[4954]: I1127 17:39:23.365675 4954 scope.go:117] "RemoveContainer" containerID="8e72f43bed26b81ddf2ef539be9367c789875e19966a5f45ddd1070753951c5d" Nov 27 17:39:23 crc kubenswrapper[4954]: I1127 17:39:23.374087 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6dnqj"] Nov 27 17:39:23 crc kubenswrapper[4954]: I1127 17:39:23.381934 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6dnqj"] Nov 27 17:39:23 crc kubenswrapper[4954]: I1127 17:39:23.391382 4954 scope.go:117] "RemoveContainer" containerID="9e5687443c5f193ce7101c7a3066b4795970351a2197f40246c9a7869d3d4fd0" Nov 27 17:39:23 crc kubenswrapper[4954]: I1127 17:39:23.434188 4954 scope.go:117] "RemoveContainer" containerID="9cff7ab0851308262e35bc93366d85a79153d4f30768814c31000b391004197e" Nov 27 17:39:23 crc kubenswrapper[4954]: E1127 17:39:23.434754 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cff7ab0851308262e35bc93366d85a79153d4f30768814c31000b391004197e\": container with ID starting with 9cff7ab0851308262e35bc93366d85a79153d4f30768814c31000b391004197e not found: ID does not exist" containerID="9cff7ab0851308262e35bc93366d85a79153d4f30768814c31000b391004197e" Nov 27 17:39:23 crc kubenswrapper[4954]: I1127 17:39:23.434794 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cff7ab0851308262e35bc93366d85a79153d4f30768814c31000b391004197e"} err="failed to get container status \"9cff7ab0851308262e35bc93366d85a79153d4f30768814c31000b391004197e\": rpc error: code = NotFound desc = could not find container \"9cff7ab0851308262e35bc93366d85a79153d4f30768814c31000b391004197e\": container with ID starting with 9cff7ab0851308262e35bc93366d85a79153d4f30768814c31000b391004197e not found: ID does not exist" Nov 27 17:39:23 crc kubenswrapper[4954]: I1127 17:39:23.434820 4954 scope.go:117] "RemoveContainer" containerID="8e72f43bed26b81ddf2ef539be9367c789875e19966a5f45ddd1070753951c5d" Nov 27 17:39:23 crc kubenswrapper[4954]: E1127 17:39:23.435270 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e72f43bed26b81ddf2ef539be9367c789875e19966a5f45ddd1070753951c5d\": container with ID starting with 8e72f43bed26b81ddf2ef539be9367c789875e19966a5f45ddd1070753951c5d not found: ID does not exist" containerID="8e72f43bed26b81ddf2ef539be9367c789875e19966a5f45ddd1070753951c5d" Nov 27 17:39:23 crc kubenswrapper[4954]: I1127 17:39:23.435293 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e72f43bed26b81ddf2ef539be9367c789875e19966a5f45ddd1070753951c5d"} err="failed to get container status \"8e72f43bed26b81ddf2ef539be9367c789875e19966a5f45ddd1070753951c5d\": rpc error: code = NotFound desc = could not find container \"8e72f43bed26b81ddf2ef539be9367c789875e19966a5f45ddd1070753951c5d\": container with ID starting with 8e72f43bed26b81ddf2ef539be9367c789875e19966a5f45ddd1070753951c5d not found: ID does not exist" Nov 27 17:39:23 crc kubenswrapper[4954]: I1127 17:39:23.435313 4954 scope.go:117] "RemoveContainer" 
containerID="9e5687443c5f193ce7101c7a3066b4795970351a2197f40246c9a7869d3d4fd0" Nov 27 17:39:23 crc kubenswrapper[4954]: E1127 17:39:23.435758 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e5687443c5f193ce7101c7a3066b4795970351a2197f40246c9a7869d3d4fd0\": container with ID starting with 9e5687443c5f193ce7101c7a3066b4795970351a2197f40246c9a7869d3d4fd0 not found: ID does not exist" containerID="9e5687443c5f193ce7101c7a3066b4795970351a2197f40246c9a7869d3d4fd0" Nov 27 17:39:23 crc kubenswrapper[4954]: I1127 17:39:23.435812 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e5687443c5f193ce7101c7a3066b4795970351a2197f40246c9a7869d3d4fd0"} err="failed to get container status \"9e5687443c5f193ce7101c7a3066b4795970351a2197f40246c9a7869d3d4fd0\": rpc error: code = NotFound desc = could not find container \"9e5687443c5f193ce7101c7a3066b4795970351a2197f40246c9a7869d3d4fd0\": container with ID starting with 9e5687443c5f193ce7101c7a3066b4795970351a2197f40246c9a7869d3d4fd0 not found: ID does not exist" Nov 27 17:39:24 crc kubenswrapper[4954]: I1127 17:39:24.671749 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="235c6400-d862-42c9-a57d-81e4fff335be" path="/var/lib/kubelet/pods/235c6400-d862-42c9-a57d-81e4fff335be/volumes" Nov 27 17:40:53 crc kubenswrapper[4954]: I1127 17:40:53.687056 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 17:40:53 crc kubenswrapper[4954]: I1127 17:40:53.687650 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 17:41:22 crc kubenswrapper[4954]: I1127 17:41:22.337541 4954 generic.go:334] "Generic (PLEG): container finished" podID="ae22fda2-42ce-4b9e-9daf-00e886b8449b" containerID="75ddd3901656912d678bcd71d06e969c89f6d1327310bdf3f394221b898e3dd2" exitCode=0 Nov 27 17:41:22 crc kubenswrapper[4954]: I1127 17:41:22.337628 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ae22fda2-42ce-4b9e-9daf-00e886b8449b","Type":"ContainerDied","Data":"75ddd3901656912d678bcd71d06e969c89f6d1327310bdf3f394221b898e3dd2"} Nov 27 17:41:23 crc kubenswrapper[4954]: I1127 17:41:23.687286 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 17:41:23 crc kubenswrapper[4954]: I1127 17:41:23.688486 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 17:41:23 crc kubenswrapper[4954]: I1127 17:41:23.738981 4954 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/tempest-tests-tempest" Nov 27 17:41:23 crc kubenswrapper[4954]: I1127 17:41:23.923671 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpp5r\" (UniqueName: \"kubernetes.io/projected/ae22fda2-42ce-4b9e-9daf-00e886b8449b-kube-api-access-bpp5r\") pod \"ae22fda2-42ce-4b9e-9daf-00e886b8449b\" (UID: \"ae22fda2-42ce-4b9e-9daf-00e886b8449b\") " Nov 27 17:41:23 crc kubenswrapper[4954]: I1127 17:41:23.923769 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ae22fda2-42ce-4b9e-9daf-00e886b8449b\" (UID: \"ae22fda2-42ce-4b9e-9daf-00e886b8449b\") " Nov 27 17:41:23 crc kubenswrapper[4954]: I1127 17:41:23.923796 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ae22fda2-42ce-4b9e-9daf-00e886b8449b-test-operator-ephemeral-workdir\") pod \"ae22fda2-42ce-4b9e-9daf-00e886b8449b\" (UID: \"ae22fda2-42ce-4b9e-9daf-00e886b8449b\") " Nov 27 17:41:23 crc kubenswrapper[4954]: I1127 17:41:23.923835 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ae22fda2-42ce-4b9e-9daf-00e886b8449b-openstack-config\") pod \"ae22fda2-42ce-4b9e-9daf-00e886b8449b\" (UID: \"ae22fda2-42ce-4b9e-9daf-00e886b8449b\") " Nov 27 17:41:23 crc kubenswrapper[4954]: I1127 17:41:23.923874 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ae22fda2-42ce-4b9e-9daf-00e886b8449b-ssh-key\") pod \"ae22fda2-42ce-4b9e-9daf-00e886b8449b\" (UID: \"ae22fda2-42ce-4b9e-9daf-00e886b8449b\") " Nov 27 17:41:23 crc kubenswrapper[4954]: I1127 17:41:23.923923 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ae22fda2-42ce-4b9e-9daf-00e886b8449b-ca-certs\") pod \"ae22fda2-42ce-4b9e-9daf-00e886b8449b\" (UID: \"ae22fda2-42ce-4b9e-9daf-00e886b8449b\") " Nov 27 17:41:23 crc kubenswrapper[4954]: I1127 17:41:23.923954 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ae22fda2-42ce-4b9e-9daf-00e886b8449b-test-operator-ephemeral-temporary\") pod \"ae22fda2-42ce-4b9e-9daf-00e886b8449b\" (UID: \"ae22fda2-42ce-4b9e-9daf-00e886b8449b\") " Nov 27 17:41:23 crc kubenswrapper[4954]: I1127 17:41:23.923977 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ae22fda2-42ce-4b9e-9daf-00e886b8449b-openstack-config-secret\") pod \"ae22fda2-42ce-4b9e-9daf-00e886b8449b\" (UID: \"ae22fda2-42ce-4b9e-9daf-00e886b8449b\") " Nov 27 17:41:23 crc kubenswrapper[4954]: I1127 17:41:23.924052 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ae22fda2-42ce-4b9e-9daf-00e886b8449b-config-data\") pod \"ae22fda2-42ce-4b9e-9daf-00e886b8449b\" (UID: \"ae22fda2-42ce-4b9e-9daf-00e886b8449b\") " Nov 27 17:41:23 crc kubenswrapper[4954]: I1127 17:41:23.925774 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae22fda2-42ce-4b9e-9daf-00e886b8449b-test-operator-ephemeral-temporary" 
(OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "ae22fda2-42ce-4b9e-9daf-00e886b8449b" (UID: "ae22fda2-42ce-4b9e-9daf-00e886b8449b"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:41:23 crc kubenswrapper[4954]: I1127 17:41:23.926156 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae22fda2-42ce-4b9e-9daf-00e886b8449b-config-data" (OuterVolumeSpecName: "config-data") pod "ae22fda2-42ce-4b9e-9daf-00e886b8449b" (UID: "ae22fda2-42ce-4b9e-9daf-00e886b8449b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:41:23 crc kubenswrapper[4954]: I1127 17:41:23.929682 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae22fda2-42ce-4b9e-9daf-00e886b8449b-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "ae22fda2-42ce-4b9e-9daf-00e886b8449b" (UID: "ae22fda2-42ce-4b9e-9daf-00e886b8449b"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:41:23 crc kubenswrapper[4954]: I1127 17:41:23.930120 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae22fda2-42ce-4b9e-9daf-00e886b8449b-kube-api-access-bpp5r" (OuterVolumeSpecName: "kube-api-access-bpp5r") pod "ae22fda2-42ce-4b9e-9daf-00e886b8449b" (UID: "ae22fda2-42ce-4b9e-9daf-00e886b8449b"). InnerVolumeSpecName "kube-api-access-bpp5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:41:23 crc kubenswrapper[4954]: I1127 17:41:23.947090 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "test-operator-logs") pod "ae22fda2-42ce-4b9e-9daf-00e886b8449b" (UID: "ae22fda2-42ce-4b9e-9daf-00e886b8449b"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 27 17:41:23 crc kubenswrapper[4954]: I1127 17:41:23.958171 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae22fda2-42ce-4b9e-9daf-00e886b8449b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ae22fda2-42ce-4b9e-9daf-00e886b8449b" (UID: "ae22fda2-42ce-4b9e-9daf-00e886b8449b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:41:23 crc kubenswrapper[4954]: I1127 17:41:23.975651 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae22fda2-42ce-4b9e-9daf-00e886b8449b-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "ae22fda2-42ce-4b9e-9daf-00e886b8449b" (UID: "ae22fda2-42ce-4b9e-9daf-00e886b8449b"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:41:23 crc kubenswrapper[4954]: I1127 17:41:23.986983 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae22fda2-42ce-4b9e-9daf-00e886b8449b-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "ae22fda2-42ce-4b9e-9daf-00e886b8449b" (UID: "ae22fda2-42ce-4b9e-9daf-00e886b8449b"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:41:23 crc kubenswrapper[4954]: I1127 17:41:23.990789 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae22fda2-42ce-4b9e-9daf-00e886b8449b-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "ae22fda2-42ce-4b9e-9daf-00e886b8449b" (UID: "ae22fda2-42ce-4b9e-9daf-00e886b8449b"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:41:24 crc kubenswrapper[4954]: I1127 17:41:24.026869 4954 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ae22fda2-42ce-4b9e-9daf-00e886b8449b-ca-certs\") on node \"crc\" DevicePath \"\"" Nov 27 17:41:24 crc kubenswrapper[4954]: I1127 17:41:24.026933 4954 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ae22fda2-42ce-4b9e-9daf-00e886b8449b-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Nov 27 17:41:24 crc kubenswrapper[4954]: I1127 17:41:24.026946 4954 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ae22fda2-42ce-4b9e-9daf-00e886b8449b-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Nov 27 17:41:24 crc kubenswrapper[4954]: I1127 17:41:24.026960 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ae22fda2-42ce-4b9e-9daf-00e886b8449b-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 17:41:24 crc kubenswrapper[4954]: I1127 17:41:24.026973 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpp5r\" (UniqueName: \"kubernetes.io/projected/ae22fda2-42ce-4b9e-9daf-00e886b8449b-kube-api-access-bpp5r\") on node \"crc\" DevicePath \"\"" Nov 27 17:41:24 crc kubenswrapper[4954]: I1127 17:41:24.027015 4954 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Nov 27 17:41:24 crc kubenswrapper[4954]: I1127 17:41:24.027028 4954 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ae22fda2-42ce-4b9e-9daf-00e886b8449b-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Nov 27 17:41:24 crc kubenswrapper[4954]: I1127 17:41:24.027040 4954 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ae22fda2-42ce-4b9e-9daf-00e886b8449b-openstack-config\") on node \"crc\" DevicePath \"\"" Nov 27 17:41:24 crc kubenswrapper[4954]: I1127 17:41:24.027051 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ae22fda2-42ce-4b9e-9daf-00e886b8449b-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 17:41:24 crc kubenswrapper[4954]: I1127 17:41:24.048879 4954 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Nov 27 17:41:24 crc kubenswrapper[4954]: I1127 17:41:24.128508 4954 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Nov 27 17:41:24 crc kubenswrapper[4954]: I1127 17:41:24.360710 4954 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ae22fda2-42ce-4b9e-9daf-00e886b8449b","Type":"ContainerDied","Data":"240b1e9786ec018f883d01756f2cb8ac13a79e48a766c3daaa672c15351c95d2"} Nov 27 17:41:24 crc kubenswrapper[4954]: I1127 17:41:24.361041 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="240b1e9786ec018f883d01756f2cb8ac13a79e48a766c3daaa672c15351c95d2" Nov 27 17:41:24 crc kubenswrapper[4954]: I1127 17:41:24.360803 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Nov 27 17:41:24 crc kubenswrapper[4954]: E1127 17:41:24.570028 4954 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae22fda2_42ce_4b9e_9daf_00e886b8449b.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae22fda2_42ce_4b9e_9daf_00e886b8449b.slice/crio-240b1e9786ec018f883d01756f2cb8ac13a79e48a766c3daaa672c15351c95d2\": RecentStats: unable to find data in memory cache]" Nov 27 17:41:30 crc kubenswrapper[4954]: I1127 17:41:30.408245 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Nov 27 17:41:30 crc kubenswrapper[4954]: E1127 17:41:30.409698 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="235c6400-d862-42c9-a57d-81e4fff335be" containerName="extract-content" Nov 27 17:41:30 crc kubenswrapper[4954]: I1127 17:41:30.409722 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="235c6400-d862-42c9-a57d-81e4fff335be" containerName="extract-content" Nov 27 17:41:30 crc kubenswrapper[4954]: E1127 17:41:30.409745 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="235c6400-d862-42c9-a57d-81e4fff335be" containerName="extract-utilities" Nov 27 17:41:30 crc kubenswrapper[4954]: I1127 17:41:30.409757 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="235c6400-d862-42c9-a57d-81e4fff335be" containerName="extract-utilities" Nov 27 17:41:30 crc kubenswrapper[4954]: E1127 17:41:30.409806 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="235c6400-d862-42c9-a57d-81e4fff335be" containerName="registry-server" Nov 27 17:41:30 crc kubenswrapper[4954]: I1127 17:41:30.409817 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="235c6400-d862-42c9-a57d-81e4fff335be" containerName="registry-server" Nov 27 17:41:30 crc kubenswrapper[4954]: E1127 17:41:30.409832 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae22fda2-42ce-4b9e-9daf-00e886b8449b" containerName="tempest-tests-tempest-tests-runner" Nov 27 17:41:30 crc kubenswrapper[4954]: I1127 17:41:30.409843 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae22fda2-42ce-4b9e-9daf-00e886b8449b" containerName="tempest-tests-tempest-tests-runner" Nov 27 17:41:30 crc kubenswrapper[4954]: I1127 17:41:30.410141 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae22fda2-42ce-4b9e-9daf-00e886b8449b" containerName="tempest-tests-tempest-tests-runner" Nov 27 17:41:30 crc kubenswrapper[4954]: I1127 17:41:30.410183 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="235c6400-d862-42c9-a57d-81e4fff335be" containerName="registry-server" Nov 27 17:41:30 crc kubenswrapper[4954]: I1127 17:41:30.411233 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 27 17:41:30 crc kubenswrapper[4954]: I1127 17:41:30.415382 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-5s7zd" Nov 27 17:41:30 crc kubenswrapper[4954]: I1127 17:41:30.418565 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Nov 27 17:41:30 crc kubenswrapper[4954]: I1127 17:41:30.546106 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s77rc\" (UniqueName: \"kubernetes.io/projected/9305c47b-ee95-423d-b4dc-f8a5fbe9cd6c-kube-api-access-s77rc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9305c47b-ee95-423d-b4dc-f8a5fbe9cd6c\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 27 17:41:30 crc kubenswrapper[4954]: I1127 17:41:30.546283 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9305c47b-ee95-423d-b4dc-f8a5fbe9cd6c\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 27 17:41:30 crc kubenswrapper[4954]: I1127 17:41:30.648692 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s77rc\" (UniqueName: \"kubernetes.io/projected/9305c47b-ee95-423d-b4dc-f8a5fbe9cd6c-kube-api-access-s77rc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9305c47b-ee95-423d-b4dc-f8a5fbe9cd6c\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 27 17:41:30 crc kubenswrapper[4954]: I1127 17:41:30.648786 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9305c47b-ee95-423d-b4dc-f8a5fbe9cd6c\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 27 17:41:30 crc kubenswrapper[4954]: I1127 17:41:30.649956 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9305c47b-ee95-423d-b4dc-f8a5fbe9cd6c\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 27 17:41:30 crc kubenswrapper[4954]: I1127 17:41:30.678521 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9305c47b-ee95-423d-b4dc-f8a5fbe9cd6c\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 27 17:41:30 crc kubenswrapper[4954]: I1127 17:41:30.679276 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s77rc\" (UniqueName: \"kubernetes.io/projected/9305c47b-ee95-423d-b4dc-f8a5fbe9cd6c-kube-api-access-s77rc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9305c47b-ee95-423d-b4dc-f8a5fbe9cd6c\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 27 17:41:30 crc 
kubenswrapper[4954]: I1127 17:41:30.733382 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 27 17:41:31 crc kubenswrapper[4954]: I1127 17:41:31.147386 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Nov 27 17:41:31 crc kubenswrapper[4954]: I1127 17:41:31.423755 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"9305c47b-ee95-423d-b4dc-f8a5fbe9cd6c","Type":"ContainerStarted","Data":"6fd00fce53702466480620840c27f41d9006ff14b0c6bf753401acc76cdb8e47"} Nov 27 17:41:32 crc kubenswrapper[4954]: I1127 17:41:32.440547 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"9305c47b-ee95-423d-b4dc-f8a5fbe9cd6c","Type":"ContainerStarted","Data":"a630a98fcdafd6a30ff45087c2d450358bb20ea3c1e49104ed855453b5badba5"} Nov 27 17:41:32 crc kubenswrapper[4954]: I1127 17:41:32.468377 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.510664533 podStartE2EDuration="2.46835231s" podCreationTimestamp="2025-11-27 17:41:30 +0000 UTC" firstStartedPulling="2025-11-27 17:41:31.150605348 +0000 UTC m=+3803.168045658" lastFinishedPulling="2025-11-27 17:41:32.108293125 +0000 UTC m=+3804.125733435" observedRunningTime="2025-11-27 17:41:32.454059263 +0000 UTC m=+3804.471499563" watchObservedRunningTime="2025-11-27 17:41:32.46835231 +0000 UTC m=+3804.485792610" Nov 27 17:41:53 crc kubenswrapper[4954]: I1127 17:41:53.687958 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 17:41:53 crc kubenswrapper[4954]: I1127 17:41:53.688506 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 17:41:53 crc kubenswrapper[4954]: I1127 17:41:53.688601 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-699qq" Nov 27 17:41:53 crc kubenswrapper[4954]: I1127 17:41:53.689329 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"72d14939d267f3263e2283bed9a7423259124d321383864fb9b0e804d14acba8"} pod="openshift-machine-config-operator/machine-config-daemon-699qq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 27 17:41:53 crc kubenswrapper[4954]: I1127 17:41:53.689383 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" containerID="cri-o://72d14939d267f3263e2283bed9a7423259124d321383864fb9b0e804d14acba8" gracePeriod=600 Nov 27 17:41:53 crc kubenswrapper[4954]: E1127 17:41:53.812884 4954 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:41:54 crc kubenswrapper[4954]: I1127 17:41:54.630784 4954 generic.go:334] "Generic (PLEG): container finished" podID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerID="72d14939d267f3263e2283bed9a7423259124d321383864fb9b0e804d14acba8" exitCode=0 Nov 27 17:41:54 crc kubenswrapper[4954]: I1127 17:41:54.630868 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" event={"ID":"33a80574-7c60-4f19-985b-3ee313cb7bcd","Type":"ContainerDied","Data":"72d14939d267f3263e2283bed9a7423259124d321383864fb9b0e804d14acba8"} Nov 27 17:41:54 crc kubenswrapper[4954]: I1127 17:41:54.631171 4954 scope.go:117] "RemoveContainer" containerID="1f967779ed2a2b60156cf9c239c5283e4d72f05b387203f3a130ac9d15772cd9" Nov 27 17:41:54 crc kubenswrapper[4954]: I1127 17:41:54.631982 4954 scope.go:117] "RemoveContainer" containerID="72d14939d267f3263e2283bed9a7423259124d321383864fb9b0e804d14acba8" Nov 27 17:41:54 crc kubenswrapper[4954]: E1127 17:41:54.632301 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:41:55 crc kubenswrapper[4954]: I1127 17:41:55.599321 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pt6gj/must-gather-pgxzg"] Nov 27 17:41:55 crc kubenswrapper[4954]: I1127 17:41:55.601303 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pt6gj/must-gather-pgxzg" Nov 27 17:41:55 crc kubenswrapper[4954]: I1127 17:41:55.603081 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-pt6gj"/"default-dockercfg-4jgjg" Nov 27 17:41:55 crc kubenswrapper[4954]: I1127 17:41:55.603276 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-pt6gj"/"kube-root-ca.crt" Nov 27 17:41:55 crc kubenswrapper[4954]: I1127 17:41:55.603917 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-pt6gj"/"openshift-service-ca.crt" Nov 27 17:41:55 crc kubenswrapper[4954]: I1127 17:41:55.612905 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pt6gj/must-gather-pgxzg"] Nov 27 17:41:55 crc kubenswrapper[4954]: I1127 17:41:55.744705 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tlx4\" (UniqueName: \"kubernetes.io/projected/6f492292-1e0a-4fff-b47a-80d1da52652b-kube-api-access-6tlx4\") pod \"must-gather-pgxzg\" (UID: \"6f492292-1e0a-4fff-b47a-80d1da52652b\") " pod="openshift-must-gather-pt6gj/must-gather-pgxzg" Nov 27 17:41:55 crc kubenswrapper[4954]: I1127 17:41:55.745351 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6f492292-1e0a-4fff-b47a-80d1da52652b-must-gather-output\") pod \"must-gather-pgxzg\" (UID: \"6f492292-1e0a-4fff-b47a-80d1da52652b\") " pod="openshift-must-gather-pt6gj/must-gather-pgxzg" Nov 27 17:41:55 crc kubenswrapper[4954]: I1127 17:41:55.846973 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6f492292-1e0a-4fff-b47a-80d1da52652b-must-gather-output\") pod \"must-gather-pgxzg\" (UID: \"6f492292-1e0a-4fff-b47a-80d1da52652b\") " pod="openshift-must-gather-pt6gj/must-gather-pgxzg" Nov 27 17:41:55 crc kubenswrapper[4954]: I1127 17:41:55.847057 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tlx4\" (UniqueName: \"kubernetes.io/projected/6f492292-1e0a-4fff-b47a-80d1da52652b-kube-api-access-6tlx4\") pod \"must-gather-pgxzg\" (UID: \"6f492292-1e0a-4fff-b47a-80d1da52652b\") " pod="openshift-must-gather-pt6gj/must-gather-pgxzg" Nov 27 17:41:55 crc kubenswrapper[4954]: I1127 17:41:55.847523 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6f492292-1e0a-4fff-b47a-80d1da52652b-must-gather-output\") pod \"must-gather-pgxzg\" (UID: \"6f492292-1e0a-4fff-b47a-80d1da52652b\") " pod="openshift-must-gather-pt6gj/must-gather-pgxzg" Nov 27 17:41:55 crc kubenswrapper[4954]: I1127 17:41:55.868212 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tlx4\" (UniqueName: \"kubernetes.io/projected/6f492292-1e0a-4fff-b47a-80d1da52652b-kube-api-access-6tlx4\") pod \"must-gather-pgxzg\" (UID: \"6f492292-1e0a-4fff-b47a-80d1da52652b\") " pod="openshift-must-gather-pt6gj/must-gather-pgxzg" Nov 27 17:41:55 crc kubenswrapper[4954]: I1127 17:41:55.926179 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pt6gj/must-gather-pgxzg" Nov 27 17:41:56 crc kubenswrapper[4954]: I1127 17:41:56.474841 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pt6gj/must-gather-pgxzg"] Nov 27 17:41:56 crc kubenswrapper[4954]: I1127 17:41:56.659938 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pt6gj/must-gather-pgxzg" event={"ID":"6f492292-1e0a-4fff-b47a-80d1da52652b","Type":"ContainerStarted","Data":"e26c79e1496c8a11c5e23730de178d24758cc2183bc6b9a006d6f9c2e62de062"} Nov 27 17:42:02 crc kubenswrapper[4954]: I1127 17:42:02.720427 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pt6gj/must-gather-pgxzg" event={"ID":"6f492292-1e0a-4fff-b47a-80d1da52652b","Type":"ContainerStarted","Data":"f61c46064fd38a0e878c933dd53dc40ce869cadfd5bf8a6ec423ab05b9fdf5ff"} Nov 27 17:42:03 crc kubenswrapper[4954]: I1127 17:42:03.730334 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pt6gj/must-gather-pgxzg" event={"ID":"6f492292-1e0a-4fff-b47a-80d1da52652b","Type":"ContainerStarted","Data":"5deeeb6184b780f3ce1885448095aff6fd6a628db4daa5f6a1e9c0a487bc155f"} Nov 27 17:42:03 crc kubenswrapper[4954]: I1127 17:42:03.755561 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-pt6gj/must-gather-pgxzg" podStartSLOduration=3.001066826 podStartE2EDuration="8.755541503s" podCreationTimestamp="2025-11-27 17:41:55 +0000 UTC" firstStartedPulling="2025-11-27 17:41:56.482235425 +0000 UTC m=+3828.499675725" lastFinishedPulling="2025-11-27 17:42:02.236710102 +0000 UTC m=+3834.254150402" observedRunningTime="2025-11-27 17:42:03.7492879 +0000 UTC m=+3835.766728200" watchObservedRunningTime="2025-11-27 17:42:03.755541503 +0000 UTC m=+3835.772981803" Nov 27 17:42:06 crc kubenswrapper[4954]: I1127 17:42:06.110608 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pt6gj/crc-debug-svngd"] Nov 27 17:42:06 crc kubenswrapper[4954]: I1127 17:42:06.112505 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pt6gj/crc-debug-svngd" Nov 27 17:42:06 crc kubenswrapper[4954]: I1127 17:42:06.258523 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph58l\" (UniqueName: \"kubernetes.io/projected/80ce6116-8884-4604-8e4c-52bfcd16bb22-kube-api-access-ph58l\") pod \"crc-debug-svngd\" (UID: \"80ce6116-8884-4604-8e4c-52bfcd16bb22\") " pod="openshift-must-gather-pt6gj/crc-debug-svngd" Nov 27 17:42:06 crc kubenswrapper[4954]: I1127 17:42:06.258829 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/80ce6116-8884-4604-8e4c-52bfcd16bb22-host\") pod \"crc-debug-svngd\" (UID: \"80ce6116-8884-4604-8e4c-52bfcd16bb22\") " pod="openshift-must-gather-pt6gj/crc-debug-svngd" Nov 27 17:42:06 crc kubenswrapper[4954]: I1127 17:42:06.361612 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph58l\" (UniqueName: \"kubernetes.io/projected/80ce6116-8884-4604-8e4c-52bfcd16bb22-kube-api-access-ph58l\") pod \"crc-debug-svngd\" (UID: \"80ce6116-8884-4604-8e4c-52bfcd16bb22\") " pod="openshift-must-gather-pt6gj/crc-debug-svngd" Nov 27 17:42:06 crc kubenswrapper[4954]: I1127 17:42:06.361731 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/80ce6116-8884-4604-8e4c-52bfcd16bb22-host\") pod \"crc-debug-svngd\" (UID: \"80ce6116-8884-4604-8e4c-52bfcd16bb22\") " pod="openshift-must-gather-pt6gj/crc-debug-svngd" Nov 27 17:42:06 crc kubenswrapper[4954]: I1127 17:42:06.361912 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/80ce6116-8884-4604-8e4c-52bfcd16bb22-host\") pod \"crc-debug-svngd\" (UID: \"80ce6116-8884-4604-8e4c-52bfcd16bb22\") " pod="openshift-must-gather-pt6gj/crc-debug-svngd" Nov 27 17:42:06 crc kubenswrapper[4954]: I1127 17:42:06.388110 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph58l\" (UniqueName: \"kubernetes.io/projected/80ce6116-8884-4604-8e4c-52bfcd16bb22-kube-api-access-ph58l\") pod \"crc-debug-svngd\" (UID: \"80ce6116-8884-4604-8e4c-52bfcd16bb22\") " pod="openshift-must-gather-pt6gj/crc-debug-svngd" Nov 27 17:42:06 crc kubenswrapper[4954]: I1127 17:42:06.434050 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pt6gj/crc-debug-svngd" Nov 27 17:42:06 crc kubenswrapper[4954]: W1127 17:42:06.463503 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80ce6116_8884_4604_8e4c_52bfcd16bb22.slice/crio-3b7e5714451786f54833a2e9a8e7dfcf6c8ae867c30272099b24e48b9baaf677 WatchSource:0}: Error finding container 3b7e5714451786f54833a2e9a8e7dfcf6c8ae867c30272099b24e48b9baaf677: Status 404 returned error can't find the container with id 3b7e5714451786f54833a2e9a8e7dfcf6c8ae867c30272099b24e48b9baaf677 Nov 27 17:42:06 crc kubenswrapper[4954]: I1127 17:42:06.662255 4954 scope.go:117] "RemoveContainer" containerID="72d14939d267f3263e2283bed9a7423259124d321383864fb9b0e804d14acba8" Nov 27 17:42:06 crc kubenswrapper[4954]: E1127 17:42:06.662533 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:42:06 crc kubenswrapper[4954]: I1127 17:42:06.756148 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pt6gj/crc-debug-svngd" event={"ID":"80ce6116-8884-4604-8e4c-52bfcd16bb22","Type":"ContainerStarted","Data":"3b7e5714451786f54833a2e9a8e7dfcf6c8ae867c30272099b24e48b9baaf677"} Nov 27 17:42:18 crc kubenswrapper[4954]: I1127 17:42:18.862701 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pt6gj/crc-debug-svngd" event={"ID":"80ce6116-8884-4604-8e4c-52bfcd16bb22","Type":"ContainerStarted","Data":"fe03baad42b469642d2a07a7f9d2edf1395768eb53d85c0b9187e26304d82447"} Nov 27 17:42:18 crc kubenswrapper[4954]: I1127 17:42:18.888256 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-pt6gj/crc-debug-svngd" podStartSLOduration=1.012744517 podStartE2EDuration="12.888234982s" podCreationTimestamp="2025-11-27 17:42:06 +0000 UTC" firstStartedPulling="2025-11-27 17:42:06.466048477 +0000 UTC m=+3838.483488767" lastFinishedPulling="2025-11-27 17:42:18.341538932 +0000 UTC m=+3850.358979232" observedRunningTime="2025-11-27 17:42:18.878810944 +0000 UTC m=+3850.896251274" watchObservedRunningTime="2025-11-27 17:42:18.888234982 +0000 UTC m=+3850.905675292" Nov 27 17:42:20 crc kubenswrapper[4954]: I1127 17:42:20.662879 4954 scope.go:117] "RemoveContainer" containerID="72d14939d267f3263e2283bed9a7423259124d321383864fb9b0e804d14acba8" Nov 27 17:42:20 crc kubenswrapper[4954]: E1127 17:42:20.663763 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:42:34 crc kubenswrapper[4954]: I1127 17:42:34.662911 4954 scope.go:117] "RemoveContainer" containerID="72d14939d267f3263e2283bed9a7423259124d321383864fb9b0e804d14acba8" Nov 27 17:42:34 crc kubenswrapper[4954]: E1127 17:42:34.663729 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:42:49 crc kubenswrapper[4954]: I1127 17:42:49.662104 4954 scope.go:117] "RemoveContainer" containerID="72d14939d267f3263e2283bed9a7423259124d321383864fb9b0e804d14acba8" Nov 27 17:42:49 crc kubenswrapper[4954]: E1127 17:42:49.663916 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:42:59 crc kubenswrapper[4954]: I1127 17:42:59.224879 4954 generic.go:334] "Generic (PLEG): container finished" podID="80ce6116-8884-4604-8e4c-52bfcd16bb22" containerID="fe03baad42b469642d2a07a7f9d2edf1395768eb53d85c0b9187e26304d82447" exitCode=0 Nov 27 17:42:59 crc kubenswrapper[4954]: I1127 17:42:59.224966 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pt6gj/crc-debug-svngd" event={"ID":"80ce6116-8884-4604-8e4c-52bfcd16bb22","Type":"ContainerDied","Data":"fe03baad42b469642d2a07a7f9d2edf1395768eb53d85c0b9187e26304d82447"} Nov 27 17:43:00 crc kubenswrapper[4954]: I1127 17:43:00.338970 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pt6gj/crc-debug-svngd" Nov 27 17:43:00 crc kubenswrapper[4954]: I1127 17:43:00.369344 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pt6gj/crc-debug-svngd"] Nov 27 17:43:00 crc kubenswrapper[4954]: I1127 17:43:00.377094 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pt6gj/crc-debug-svngd"] Nov 27 17:43:00 crc kubenswrapper[4954]: I1127 17:43:00.440146 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/80ce6116-8884-4604-8e4c-52bfcd16bb22-host\") pod \"80ce6116-8884-4604-8e4c-52bfcd16bb22\" (UID: \"80ce6116-8884-4604-8e4c-52bfcd16bb22\") " Nov 27 17:43:00 crc kubenswrapper[4954]: I1127 17:43:00.440225 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ph58l\" (UniqueName: \"kubernetes.io/projected/80ce6116-8884-4604-8e4c-52bfcd16bb22-kube-api-access-ph58l\") pod \"80ce6116-8884-4604-8e4c-52bfcd16bb22\" (UID: \"80ce6116-8884-4604-8e4c-52bfcd16bb22\") " Nov 27 17:43:00 crc kubenswrapper[4954]: I1127 17:43:00.440257 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/80ce6116-8884-4604-8e4c-52bfcd16bb22-host" (OuterVolumeSpecName: "host") pod "80ce6116-8884-4604-8e4c-52bfcd16bb22" (UID: "80ce6116-8884-4604-8e4c-52bfcd16bb22"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 17:43:00 crc kubenswrapper[4954]: I1127 17:43:00.440755 4954 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/80ce6116-8884-4604-8e4c-52bfcd16bb22-host\") on node \"crc\" DevicePath \"\"" Nov 27 17:43:00 crc kubenswrapper[4954]: I1127 17:43:00.445861 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80ce6116-8884-4604-8e4c-52bfcd16bb22-kube-api-access-ph58l" (OuterVolumeSpecName: "kube-api-access-ph58l") pod "80ce6116-8884-4604-8e4c-52bfcd16bb22" (UID: "80ce6116-8884-4604-8e4c-52bfcd16bb22"). InnerVolumeSpecName "kube-api-access-ph58l". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:43:00 crc kubenswrapper[4954]: I1127 17:43:00.542985 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ph58l\" (UniqueName: \"kubernetes.io/projected/80ce6116-8884-4604-8e4c-52bfcd16bb22-kube-api-access-ph58l\") on node \"crc\" DevicePath \"\"" Nov 27 17:43:00 crc kubenswrapper[4954]: I1127 17:43:00.672735 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80ce6116-8884-4604-8e4c-52bfcd16bb22" path="/var/lib/kubelet/pods/80ce6116-8884-4604-8e4c-52bfcd16bb22/volumes" Nov 27 17:43:01 crc kubenswrapper[4954]: I1127 17:43:01.248222 4954 scope.go:117] "RemoveContainer" containerID="fe03baad42b469642d2a07a7f9d2edf1395768eb53d85c0b9187e26304d82447" Nov 27 17:43:01 crc kubenswrapper[4954]: I1127 17:43:01.248323 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pt6gj/crc-debug-svngd" Nov 27 17:43:01 crc kubenswrapper[4954]: I1127 17:43:01.529874 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pt6gj/crc-debug-bvqcm"] Nov 27 17:43:01 crc kubenswrapper[4954]: E1127 17:43:01.531815 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80ce6116-8884-4604-8e4c-52bfcd16bb22" containerName="container-00" Nov 27 17:43:01 crc kubenswrapper[4954]: I1127 17:43:01.531928 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="80ce6116-8884-4604-8e4c-52bfcd16bb22" containerName="container-00" Nov 27 17:43:01 crc kubenswrapper[4954]: I1127 17:43:01.532197 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="80ce6116-8884-4604-8e4c-52bfcd16bb22" containerName="container-00" Nov 27 17:43:01 crc kubenswrapper[4954]: I1127 17:43:01.532928 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pt6gj/crc-debug-bvqcm" Nov 27 17:43:01 crc kubenswrapper[4954]: I1127 17:43:01.661855 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/88427ef6-c01d-4a1e-adc1-ba6262d3693d-host\") pod \"crc-debug-bvqcm\" (UID: \"88427ef6-c01d-4a1e-adc1-ba6262d3693d\") " pod="openshift-must-gather-pt6gj/crc-debug-bvqcm" Nov 27 17:43:01 crc kubenswrapper[4954]: I1127 17:43:01.661971 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgrjc\" (UniqueName: \"kubernetes.io/projected/88427ef6-c01d-4a1e-adc1-ba6262d3693d-kube-api-access-tgrjc\") pod \"crc-debug-bvqcm\" (UID: \"88427ef6-c01d-4a1e-adc1-ba6262d3693d\") " pod="openshift-must-gather-pt6gj/crc-debug-bvqcm" Nov 27 17:43:01 crc kubenswrapper[4954]: I1127 17:43:01.662626 4954 scope.go:117] "RemoveContainer" containerID="72d14939d267f3263e2283bed9a7423259124d321383864fb9b0e804d14acba8" Nov 27 17:43:01 crc kubenswrapper[4954]: E1127 17:43:01.662904 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:43:01 crc kubenswrapper[4954]: I1127 17:43:01.762993 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgrjc\" (UniqueName: \"kubernetes.io/projected/88427ef6-c01d-4a1e-adc1-ba6262d3693d-kube-api-access-tgrjc\") pod \"crc-debug-bvqcm\" (UID: \"88427ef6-c01d-4a1e-adc1-ba6262d3693d\") " pod="openshift-must-gather-pt6gj/crc-debug-bvqcm" Nov 27 17:43:01 crc kubenswrapper[4954]: I1127 17:43:01.763110 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/88427ef6-c01d-4a1e-adc1-ba6262d3693d-host\") pod \"crc-debug-bvqcm\" (UID: \"88427ef6-c01d-4a1e-adc1-ba6262d3693d\") " pod="openshift-must-gather-pt6gj/crc-debug-bvqcm" Nov 27 17:43:01 crc kubenswrapper[4954]: I1127 17:43:01.765206 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/88427ef6-c01d-4a1e-adc1-ba6262d3693d-host\") pod \"crc-debug-bvqcm\" (UID: \"88427ef6-c01d-4a1e-adc1-ba6262d3693d\") " pod="openshift-must-gather-pt6gj/crc-debug-bvqcm" Nov 27 17:43:01 crc kubenswrapper[4954]: I1127 17:43:01.812699 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgrjc\" (UniqueName: \"kubernetes.io/projected/88427ef6-c01d-4a1e-adc1-ba6262d3693d-kube-api-access-tgrjc\") pod \"crc-debug-bvqcm\" (UID: \"88427ef6-c01d-4a1e-adc1-ba6262d3693d\") " pod="openshift-must-gather-pt6gj/crc-debug-bvqcm" Nov 27 17:43:01 crc kubenswrapper[4954]: I1127 17:43:01.851056 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pt6gj/crc-debug-bvqcm" Nov 27 17:43:02 crc kubenswrapper[4954]: I1127 17:43:02.262738 4954 generic.go:334] "Generic (PLEG): container finished" podID="88427ef6-c01d-4a1e-adc1-ba6262d3693d" containerID="4b2f370bf0be065e984e748dec0e54df9dbaeb912cdbb02c536d764cce3fc0b2" exitCode=0 Nov 27 17:43:02 crc kubenswrapper[4954]: I1127 17:43:02.262843 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pt6gj/crc-debug-bvqcm" event={"ID":"88427ef6-c01d-4a1e-adc1-ba6262d3693d","Type":"ContainerDied","Data":"4b2f370bf0be065e984e748dec0e54df9dbaeb912cdbb02c536d764cce3fc0b2"} Nov 27 17:43:02 crc kubenswrapper[4954]: I1127 17:43:02.263200 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pt6gj/crc-debug-bvqcm" event={"ID":"88427ef6-c01d-4a1e-adc1-ba6262d3693d","Type":"ContainerStarted","Data":"995c0ca49cc524089a44572e3745276df489f17104bb3df9acc829d9d4d4fd87"} Nov 27 17:43:02 crc kubenswrapper[4954]: I1127 17:43:02.807738 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pt6gj/crc-debug-bvqcm"] Nov 27 17:43:02 crc kubenswrapper[4954]: I1127 17:43:02.816858 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pt6gj/crc-debug-bvqcm"] Nov 27 17:43:03 crc kubenswrapper[4954]: I1127 17:43:03.372831 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pt6gj/crc-debug-bvqcm" Nov 27 17:43:03 crc kubenswrapper[4954]: I1127 17:43:03.497798 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgrjc\" (UniqueName: \"kubernetes.io/projected/88427ef6-c01d-4a1e-adc1-ba6262d3693d-kube-api-access-tgrjc\") pod \"88427ef6-c01d-4a1e-adc1-ba6262d3693d\" (UID: \"88427ef6-c01d-4a1e-adc1-ba6262d3693d\") " Nov 27 17:43:03 crc kubenswrapper[4954]: I1127 17:43:03.498146 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/88427ef6-c01d-4a1e-adc1-ba6262d3693d-host\") pod \"88427ef6-c01d-4a1e-adc1-ba6262d3693d\" (UID: \"88427ef6-c01d-4a1e-adc1-ba6262d3693d\") " Nov 27 17:43:03 crc kubenswrapper[4954]: I1127 17:43:03.498290 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88427ef6-c01d-4a1e-adc1-ba6262d3693d-host" (OuterVolumeSpecName: "host") pod "88427ef6-c01d-4a1e-adc1-ba6262d3693d" (UID: "88427ef6-c01d-4a1e-adc1-ba6262d3693d"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 17:43:03 crc kubenswrapper[4954]: I1127 17:43:03.498987 4954 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/88427ef6-c01d-4a1e-adc1-ba6262d3693d-host\") on node \"crc\" DevicePath \"\"" Nov 27 17:43:03 crc kubenswrapper[4954]: I1127 17:43:03.502862 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88427ef6-c01d-4a1e-adc1-ba6262d3693d-kube-api-access-tgrjc" (OuterVolumeSpecName: "kube-api-access-tgrjc") pod "88427ef6-c01d-4a1e-adc1-ba6262d3693d" (UID: "88427ef6-c01d-4a1e-adc1-ba6262d3693d"). InnerVolumeSpecName "kube-api-access-tgrjc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:43:03 crc kubenswrapper[4954]: I1127 17:43:03.600500 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgrjc\" (UniqueName: \"kubernetes.io/projected/88427ef6-c01d-4a1e-adc1-ba6262d3693d-kube-api-access-tgrjc\") on node \"crc\" DevicePath \"\"" Nov 27 17:43:03 crc kubenswrapper[4954]: I1127 17:43:03.788700 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fzxf9"] Nov 27 17:43:03 crc kubenswrapper[4954]: E1127 17:43:03.789197 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88427ef6-c01d-4a1e-adc1-ba6262d3693d" containerName="container-00" Nov 27 17:43:03 crc kubenswrapper[4954]: I1127 17:43:03.789217 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="88427ef6-c01d-4a1e-adc1-ba6262d3693d" containerName="container-00" Nov 27 17:43:03 crc kubenswrapper[4954]: I1127 17:43:03.789476 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="88427ef6-c01d-4a1e-adc1-ba6262d3693d" containerName="container-00" Nov 27 17:43:03 crc kubenswrapper[4954]: I1127 17:43:03.790881 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fzxf9" Nov 27 17:43:03 crc kubenswrapper[4954]: I1127 17:43:03.800162 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fzxf9"] Nov 27 17:43:03 crc kubenswrapper[4954]: I1127 17:43:03.905288 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crmn4\" (UniqueName: \"kubernetes.io/projected/3a462761-cdc5-4050-9e5c-586e5233f7e1-kube-api-access-crmn4\") pod \"community-operators-fzxf9\" (UID: \"3a462761-cdc5-4050-9e5c-586e5233f7e1\") " pod="openshift-marketplace/community-operators-fzxf9" Nov 27 17:43:03 crc kubenswrapper[4954]: I1127 17:43:03.905606 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a462761-cdc5-4050-9e5c-586e5233f7e1-catalog-content\") pod \"community-operators-fzxf9\" (UID: \"3a462761-cdc5-4050-9e5c-586e5233f7e1\") " pod="openshift-marketplace/community-operators-fzxf9" Nov 27 17:43:03 crc kubenswrapper[4954]: I1127 17:43:03.905678 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a462761-cdc5-4050-9e5c-586e5233f7e1-utilities\") pod \"community-operators-fzxf9\" (UID: \"3a462761-cdc5-4050-9e5c-586e5233f7e1\") " pod="openshift-marketplace/community-operators-fzxf9" Nov 27 17:43:03 crc kubenswrapper[4954]: I1127 17:43:03.986562 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jjjh5"] Nov 27 17:43:03 crc kubenswrapper[4954]: I1127 17:43:03.988541 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jjjh5" Nov 27 17:43:03 crc kubenswrapper[4954]: I1127 17:43:03.998354 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jjjh5"] Nov 27 17:43:04 crc kubenswrapper[4954]: I1127 17:43:04.006792 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crmn4\" (UniqueName: \"kubernetes.io/projected/3a462761-cdc5-4050-9e5c-586e5233f7e1-kube-api-access-crmn4\") pod \"community-operators-fzxf9\" (UID: \"3a462761-cdc5-4050-9e5c-586e5233f7e1\") " pod="openshift-marketplace/community-operators-fzxf9" Nov 27 17:43:04 crc kubenswrapper[4954]: I1127 17:43:04.006867 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a462761-cdc5-4050-9e5c-586e5233f7e1-catalog-content\") pod \"community-operators-fzxf9\" (UID: \"3a462761-cdc5-4050-9e5c-586e5233f7e1\") " pod="openshift-marketplace/community-operators-fzxf9" Nov 27 17:43:04 crc kubenswrapper[4954]: I1127 17:43:04.006920 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a462761-cdc5-4050-9e5c-586e5233f7e1-utilities\") pod \"community-operators-fzxf9\" (UID: \"3a462761-cdc5-4050-9e5c-586e5233f7e1\") " pod="openshift-marketplace/community-operators-fzxf9" Nov 27 17:43:04 crc kubenswrapper[4954]: I1127 17:43:04.007369 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a462761-cdc5-4050-9e5c-586e5233f7e1-utilities\") pod \"community-operators-fzxf9\" (UID: \"3a462761-cdc5-4050-9e5c-586e5233f7e1\") " pod="openshift-marketplace/community-operators-fzxf9" Nov 27 17:43:04 crc kubenswrapper[4954]: I1127 17:43:04.007842 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a462761-cdc5-4050-9e5c-586e5233f7e1-catalog-content\") pod \"community-operators-fzxf9\" (UID: \"3a462761-cdc5-4050-9e5c-586e5233f7e1\") " pod="openshift-marketplace/community-operators-fzxf9" Nov 27 17:43:04 crc kubenswrapper[4954]: I1127 17:43:04.018096 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pt6gj/crc-debug-zndqf"] Nov 27 17:43:04 crc kubenswrapper[4954]: I1127 17:43:04.019263 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pt6gj/crc-debug-zndqf" Nov 27 17:43:04 crc kubenswrapper[4954]: I1127 17:43:04.060355 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crmn4\" (UniqueName: \"kubernetes.io/projected/3a462761-cdc5-4050-9e5c-586e5233f7e1-kube-api-access-crmn4\") pod \"community-operators-fzxf9\" (UID: \"3a462761-cdc5-4050-9e5c-586e5233f7e1\") " pod="openshift-marketplace/community-operators-fzxf9" Nov 27 17:43:04 crc kubenswrapper[4954]: I1127 17:43:04.108626 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ca2d7a5-ab0c-4a4e-ad9d-e856be549017-utilities\") pod \"certified-operators-jjjh5\" (UID: \"8ca2d7a5-ab0c-4a4e-ad9d-e856be549017\") " pod="openshift-marketplace/certified-operators-jjjh5" Nov 27 17:43:04 crc kubenswrapper[4954]: I1127 17:43:04.108677 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-795n4\" (UniqueName: \"kubernetes.io/projected/ef6e4834-a3ef-4c61-b228-5a23a439395f-kube-api-access-795n4\") pod \"crc-debug-zndqf\" (UID: \"ef6e4834-a3ef-4c61-b228-5a23a439395f\") " pod="openshift-must-gather-pt6gj/crc-debug-zndqf" Nov 27 17:43:04 crc kubenswrapper[4954]: I1127 17:43:04.108714 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef6e4834-a3ef-4c61-b228-5a23a439395f-host\") pod \"crc-debug-zndqf\" (UID: \"ef6e4834-a3ef-4c61-b228-5a23a439395f\") " pod="openshift-must-gather-pt6gj/crc-debug-zndqf" Nov 27 17:43:04 crc kubenswrapper[4954]: I1127 17:43:04.108784 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctqp2\" (UniqueName: \"kubernetes.io/projected/8ca2d7a5-ab0c-4a4e-ad9d-e856be549017-kube-api-access-ctqp2\") pod \"certified-operators-jjjh5\" (UID: \"8ca2d7a5-ab0c-4a4e-ad9d-e856be549017\") " pod="openshift-marketplace/certified-operators-jjjh5" Nov 27 17:43:04 crc kubenswrapper[4954]: I1127 17:43:04.108847 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ca2d7a5-ab0c-4a4e-ad9d-e856be549017-catalog-content\") pod \"certified-operators-jjjh5\" (UID: \"8ca2d7a5-ab0c-4a4e-ad9d-e856be549017\") " pod="openshift-marketplace/certified-operators-jjjh5" Nov 27 17:43:04 crc kubenswrapper[4954]: I1127 17:43:04.117047 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fzxf9" Nov 27 17:43:04 crc kubenswrapper[4954]: I1127 17:43:04.211801 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ca2d7a5-ab0c-4a4e-ad9d-e856be549017-utilities\") pod \"certified-operators-jjjh5\" (UID: \"8ca2d7a5-ab0c-4a4e-ad9d-e856be549017\") " pod="openshift-marketplace/certified-operators-jjjh5" Nov 27 17:43:04 crc kubenswrapper[4954]: I1127 17:43:04.211852 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-795n4\" (UniqueName: \"kubernetes.io/projected/ef6e4834-a3ef-4c61-b228-5a23a439395f-kube-api-access-795n4\") pod \"crc-debug-zndqf\" (UID: \"ef6e4834-a3ef-4c61-b228-5a23a439395f\") " pod="openshift-must-gather-pt6gj/crc-debug-zndqf" Nov 27 17:43:04 crc kubenswrapper[4954]: I1127 17:43:04.211896 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef6e4834-a3ef-4c61-b228-5a23a439395f-host\") pod \"crc-debug-zndqf\" (UID: \"ef6e4834-a3ef-4c61-b228-5a23a439395f\") " pod="openshift-must-gather-pt6gj/crc-debug-zndqf" Nov 27 17:43:04 crc kubenswrapper[4954]: I1127 17:43:04.211988 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctqp2\" (UniqueName: \"kubernetes.io/projected/8ca2d7a5-ab0c-4a4e-ad9d-e856be549017-kube-api-access-ctqp2\") pod \"certified-operators-jjjh5\" (UID: \"8ca2d7a5-ab0c-4a4e-ad9d-e856be549017\") " pod="openshift-marketplace/certified-operators-jjjh5" Nov 27 17:43:04 crc kubenswrapper[4954]: I1127 17:43:04.212066 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ca2d7a5-ab0c-4a4e-ad9d-e856be549017-catalog-content\") pod \"certified-operators-jjjh5\" (UID: \"8ca2d7a5-ab0c-4a4e-ad9d-e856be549017\") " pod="openshift-marketplace/certified-operators-jjjh5" Nov 27 17:43:04 crc kubenswrapper[4954]: I1127 17:43:04.212066 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef6e4834-a3ef-4c61-b228-5a23a439395f-host\") pod \"crc-debug-zndqf\" (UID: \"ef6e4834-a3ef-4c61-b228-5a23a439395f\") " pod="openshift-must-gather-pt6gj/crc-debug-zndqf" Nov 27 17:43:04 crc kubenswrapper[4954]: I1127 17:43:04.212402 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ca2d7a5-ab0c-4a4e-ad9d-e856be549017-utilities\") pod \"certified-operators-jjjh5\" (UID: \"8ca2d7a5-ab0c-4a4e-ad9d-e856be549017\") " pod="openshift-marketplace/certified-operators-jjjh5" Nov 27 17:43:04 crc kubenswrapper[4954]: I1127 17:43:04.212668 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ca2d7a5-ab0c-4a4e-ad9d-e856be549017-catalog-content\") pod \"certified-operators-jjjh5\" (UID: \"8ca2d7a5-ab0c-4a4e-ad9d-e856be549017\") " pod="openshift-marketplace/certified-operators-jjjh5" Nov 27 17:43:04 crc kubenswrapper[4954]: I1127 17:43:04.228223 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctqp2\" (UniqueName: \"kubernetes.io/projected/8ca2d7a5-ab0c-4a4e-ad9d-e856be549017-kube-api-access-ctqp2\") pod \"certified-operators-jjjh5\" (UID: \"8ca2d7a5-ab0c-4a4e-ad9d-e856be549017\") " pod="openshift-marketplace/certified-operators-jjjh5" Nov 27 
17:43:04 crc kubenswrapper[4954]: I1127 17:43:04.231848 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-795n4\" (UniqueName: \"kubernetes.io/projected/ef6e4834-a3ef-4c61-b228-5a23a439395f-kube-api-access-795n4\") pod \"crc-debug-zndqf\" (UID: \"ef6e4834-a3ef-4c61-b228-5a23a439395f\") " pod="openshift-must-gather-pt6gj/crc-debug-zndqf" Nov 27 17:43:04 crc kubenswrapper[4954]: I1127 17:43:04.287224 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="995c0ca49cc524089a44572e3745276df489f17104bb3df9acc829d9d4d4fd87" Nov 27 17:43:04 crc kubenswrapper[4954]: I1127 17:43:04.287308 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pt6gj/crc-debug-bvqcm" Nov 27 17:43:04 crc kubenswrapper[4954]: I1127 17:43:04.308937 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jjjh5" Nov 27 17:43:04 crc kubenswrapper[4954]: I1127 17:43:04.383520 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pt6gj/crc-debug-zndqf" Nov 27 17:43:04 crc kubenswrapper[4954]: W1127 17:43:04.498803 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef6e4834_a3ef_4c61_b228_5a23a439395f.slice/crio-e7d5dae02d6bb552eb113d57475687667b392ff037d18327d74b3265a90466e5 WatchSource:0}: Error finding container e7d5dae02d6bb552eb113d57475687667b392ff037d18327d74b3265a90466e5: Status 404 returned error can't find the container with id e7d5dae02d6bb552eb113d57475687667b392ff037d18327d74b3265a90466e5 Nov 27 17:43:04 crc kubenswrapper[4954]: I1127 17:43:04.690017 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88427ef6-c01d-4a1e-adc1-ba6262d3693d" path="/var/lib/kubelet/pods/88427ef6-c01d-4a1e-adc1-ba6262d3693d/volumes" Nov 27 17:43:04 crc kubenswrapper[4954]: I1127 17:43:04.784931 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fzxf9"] Nov 27 17:43:04 crc kubenswrapper[4954]: I1127 17:43:04.894322 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jjjh5"] Nov 27 17:43:04 crc kubenswrapper[4954]: W1127 17:43:04.911521 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ca2d7a5_ab0c_4a4e_ad9d_e856be549017.slice/crio-b9a05a577bcae03e8638a1d8824e6c482913f817aa932b7be5a798f915ae4b0e WatchSource:0}: Error finding container b9a05a577bcae03e8638a1d8824e6c482913f817aa932b7be5a798f915ae4b0e: Status 404 returned error can't find the container with id b9a05a577bcae03e8638a1d8824e6c482913f817aa932b7be5a798f915ae4b0e Nov 27 17:43:05 crc kubenswrapper[4954]: I1127 17:43:05.309232 4954 generic.go:334] "Generic (PLEG): container finished" podID="3a462761-cdc5-4050-9e5c-586e5233f7e1" containerID="b2b3c81303eb7162bc2a549b492c7d2eeee4c9ac4c34cf776a535ee6ad748d68" exitCode=0 Nov 27 17:43:05 crc kubenswrapper[4954]: I1127 17:43:05.309324 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fzxf9" event={"ID":"3a462761-cdc5-4050-9e5c-586e5233f7e1","Type":"ContainerDied","Data":"b2b3c81303eb7162bc2a549b492c7d2eeee4c9ac4c34cf776a535ee6ad748d68"} Nov 27 17:43:05 crc kubenswrapper[4954]: I1127 17:43:05.311129 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-fzxf9" event={"ID":"3a462761-cdc5-4050-9e5c-586e5233f7e1","Type":"ContainerStarted","Data":"9a3d809c889ec2769cabe2884eb5acb3bad410a199c9bee779f593f3d00cbf31"} Nov 27 17:43:05 crc kubenswrapper[4954]: I1127 17:43:05.314114 4954 generic.go:334] "Generic (PLEG): container finished" podID="8ca2d7a5-ab0c-4a4e-ad9d-e856be549017" containerID="33d4857c952ea09911545667b78d27170bd36d594c5b51d1dad4ab0f0a55565d" exitCode=0 Nov 27 17:43:05 crc kubenswrapper[4954]: I1127 17:43:05.314338 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjjh5" event={"ID":"8ca2d7a5-ab0c-4a4e-ad9d-e856be549017","Type":"ContainerDied","Data":"33d4857c952ea09911545667b78d27170bd36d594c5b51d1dad4ab0f0a55565d"} Nov 27 17:43:05 crc kubenswrapper[4954]: I1127 17:43:05.314379 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjjh5" event={"ID":"8ca2d7a5-ab0c-4a4e-ad9d-e856be549017","Type":"ContainerStarted","Data":"b9a05a577bcae03e8638a1d8824e6c482913f817aa932b7be5a798f915ae4b0e"} Nov 27 17:43:05 crc kubenswrapper[4954]: I1127 17:43:05.319086 4954 generic.go:334] "Generic (PLEG): container finished" podID="ef6e4834-a3ef-4c61-b228-5a23a439395f" containerID="8fa0e5e7061e3f7189507a3734baf59c9aeed9d2444d096d53b419a2213bd139" exitCode=0 Nov 27 17:43:05 crc kubenswrapper[4954]: I1127 17:43:05.319151 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pt6gj/crc-debug-zndqf" event={"ID":"ef6e4834-a3ef-4c61-b228-5a23a439395f","Type":"ContainerDied","Data":"8fa0e5e7061e3f7189507a3734baf59c9aeed9d2444d096d53b419a2213bd139"} Nov 27 17:43:05 crc kubenswrapper[4954]: I1127 17:43:05.319682 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pt6gj/crc-debug-zndqf" event={"ID":"ef6e4834-a3ef-4c61-b228-5a23a439395f","Type":"ContainerStarted","Data":"e7d5dae02d6bb552eb113d57475687667b392ff037d18327d74b3265a90466e5"} Nov 27 17:43:05 crc kubenswrapper[4954]: I1127 17:43:05.393278 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pt6gj/crc-debug-zndqf"] Nov 27 17:43:05 crc kubenswrapper[4954]: I1127 17:43:05.402108 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pt6gj/crc-debug-zndqf"] Nov 27 17:43:06 crc kubenswrapper[4954]: I1127 17:43:06.343303 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fzxf9" event={"ID":"3a462761-cdc5-4050-9e5c-586e5233f7e1","Type":"ContainerStarted","Data":"faf3ba6338e9d6f86b2c46fb462049c2bafd1588960e898560df93e8ba8cc299"} Nov 27 17:43:06 crc kubenswrapper[4954]: I1127 17:43:06.451269 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pt6gj/crc-debug-zndqf" Nov 27 17:43:06 crc kubenswrapper[4954]: I1127 17:43:06.585267 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-795n4\" (UniqueName: \"kubernetes.io/projected/ef6e4834-a3ef-4c61-b228-5a23a439395f-kube-api-access-795n4\") pod \"ef6e4834-a3ef-4c61-b228-5a23a439395f\" (UID: \"ef6e4834-a3ef-4c61-b228-5a23a439395f\") " Nov 27 17:43:06 crc kubenswrapper[4954]: I1127 17:43:06.585741 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef6e4834-a3ef-4c61-b228-5a23a439395f-host\") pod \"ef6e4834-a3ef-4c61-b228-5a23a439395f\" (UID: \"ef6e4834-a3ef-4c61-b228-5a23a439395f\") " Nov 27 17:43:06 crc kubenswrapper[4954]: I1127 17:43:06.585913 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef6e4834-a3ef-4c61-b228-5a23a439395f-host" (OuterVolumeSpecName: "host") pod "ef6e4834-a3ef-4c61-b228-5a23a439395f" (UID: "ef6e4834-a3ef-4c61-b228-5a23a439395f"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 17:43:06 crc kubenswrapper[4954]: I1127 17:43:06.586449 4954 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef6e4834-a3ef-4c61-b228-5a23a439395f-host\") on node \"crc\" DevicePath \"\"" Nov 27 17:43:06 crc kubenswrapper[4954]: I1127 17:43:06.591182 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef6e4834-a3ef-4c61-b228-5a23a439395f-kube-api-access-795n4" (OuterVolumeSpecName: "kube-api-access-795n4") pod "ef6e4834-a3ef-4c61-b228-5a23a439395f" (UID: "ef6e4834-a3ef-4c61-b228-5a23a439395f"). InnerVolumeSpecName "kube-api-access-795n4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:43:06 crc kubenswrapper[4954]: I1127 17:43:06.673646 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef6e4834-a3ef-4c61-b228-5a23a439395f" path="/var/lib/kubelet/pods/ef6e4834-a3ef-4c61-b228-5a23a439395f/volumes" Nov 27 17:43:06 crc kubenswrapper[4954]: I1127 17:43:06.687809 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-795n4\" (UniqueName: \"kubernetes.io/projected/ef6e4834-a3ef-4c61-b228-5a23a439395f-kube-api-access-795n4\") on node \"crc\" DevicePath \"\"" Nov 27 17:43:07 crc kubenswrapper[4954]: I1127 17:43:07.355014 4954 generic.go:334] "Generic (PLEG): container finished" podID="3a462761-cdc5-4050-9e5c-586e5233f7e1" containerID="faf3ba6338e9d6f86b2c46fb462049c2bafd1588960e898560df93e8ba8cc299" exitCode=0 Nov 27 17:43:07 crc kubenswrapper[4954]: I1127 17:43:07.355334 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fzxf9" event={"ID":"3a462761-cdc5-4050-9e5c-586e5233f7e1","Type":"ContainerDied","Data":"faf3ba6338e9d6f86b2c46fb462049c2bafd1588960e898560df93e8ba8cc299"} Nov 27 17:43:07 crc kubenswrapper[4954]: I1127 17:43:07.358328 4954 scope.go:117] "RemoveContainer" containerID="8fa0e5e7061e3f7189507a3734baf59c9aeed9d2444d096d53b419a2213bd139" Nov 27 17:43:07 crc kubenswrapper[4954]: I1127 17:43:07.358396 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pt6gj/crc-debug-zndqf" Nov 27 17:43:08 crc kubenswrapper[4954]: I1127 17:43:08.368351 4954 generic.go:334] "Generic (PLEG): container finished" podID="8ca2d7a5-ab0c-4a4e-ad9d-e856be549017" containerID="c0c06fd00c2d00744147d749dea9e48861c611e4b12d8ceab3933e981784305b" exitCode=0 Nov 27 17:43:08 crc kubenswrapper[4954]: I1127 17:43:08.368446 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjjh5" event={"ID":"8ca2d7a5-ab0c-4a4e-ad9d-e856be549017","Type":"ContainerDied","Data":"c0c06fd00c2d00744147d749dea9e48861c611e4b12d8ceab3933e981784305b"} Nov 27 17:43:09 crc kubenswrapper[4954]: I1127 17:43:09.383684 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fzxf9" event={"ID":"3a462761-cdc5-4050-9e5c-586e5233f7e1","Type":"ContainerStarted","Data":"10bc5022436deed16965ab427ff160a15f0c441fdc9ce9e2fbe37d8609f6216f"} Nov 27 17:43:09 crc kubenswrapper[4954]: I1127 17:43:09.386567 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjjh5" event={"ID":"8ca2d7a5-ab0c-4a4e-ad9d-e856be549017","Type":"ContainerStarted","Data":"7be58a4c411c92b97a09556d09e83325800b64f3460f60179dde0491afdb06cf"} Nov 27 17:43:09 crc kubenswrapper[4954]: I1127 17:43:09.407179 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fzxf9" podStartSLOduration=3.671561327 podStartE2EDuration="6.407158561s" podCreationTimestamp="2025-11-27 17:43:03 +0000 UTC" firstStartedPulling="2025-11-27 17:43:05.311488628 +0000 UTC m=+3897.328928958" lastFinishedPulling="2025-11-27 17:43:08.047085892 +0000 UTC m=+3900.064526192" observedRunningTime="2025-11-27 17:43:09.400676543 +0000 UTC m=+3901.418116853" watchObservedRunningTime="2025-11-27 17:43:09.407158561 +0000 UTC m=+3901.424598861" Nov 27 17:43:09 crc kubenswrapper[4954]: I1127 17:43:09.419500 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jjjh5" podStartSLOduration=2.637910756 podStartE2EDuration="6.41948354s" podCreationTimestamp="2025-11-27 17:43:03 +0000 UTC" firstStartedPulling="2025-11-27 17:43:05.315560147 +0000 UTC m=+3897.333000447" lastFinishedPulling="2025-11-27 17:43:09.097132931 +0000 UTC m=+3901.114573231" observedRunningTime="2025-11-27 17:43:09.417080921 +0000 UTC m=+3901.434521221" watchObservedRunningTime="2025-11-27 17:43:09.41948354 +0000 UTC m=+3901.436923840" Nov 27 17:43:13 crc kubenswrapper[4954]: I1127 17:43:13.662486 4954 scope.go:117] "RemoveContainer" containerID="72d14939d267f3263e2283bed9a7423259124d321383864fb9b0e804d14acba8" Nov 27 17:43:13 crc kubenswrapper[4954]: E1127 17:43:13.663313 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:43:14 crc kubenswrapper[4954]: I1127 17:43:14.118023 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fzxf9" Nov 27 17:43:14 crc kubenswrapper[4954]: I1127 17:43:14.118088 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/community-operators-fzxf9" Nov 27 17:43:14 crc kubenswrapper[4954]: I1127 17:43:14.309367 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jjjh5" Nov 27 17:43:14 crc kubenswrapper[4954]: I1127 17:43:14.309423 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jjjh5" Nov 27 17:43:15 crc kubenswrapper[4954]: I1127 17:43:15.173361 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-fzxf9" podUID="3a462761-cdc5-4050-9e5c-586e5233f7e1" containerName="registry-server" probeResult="failure" output=< Nov 27 17:43:15 crc kubenswrapper[4954]: timeout: failed to connect service ":50051" within 1s Nov 27 17:43:15 crc kubenswrapper[4954]: > Nov 27 17:43:15 crc kubenswrapper[4954]: I1127 17:43:15.352447 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-jjjh5" podUID="8ca2d7a5-ab0c-4a4e-ad9d-e856be549017" containerName="registry-server" probeResult="failure" output=< Nov 27 17:43:15 crc kubenswrapper[4954]: timeout: failed to connect service ":50051" within 1s Nov 27 17:43:15 crc kubenswrapper[4954]: > Nov 27 17:43:22 crc kubenswrapper[4954]: I1127 17:43:22.810810 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7c4fd9778-zrzw7_3e0b062d-ff7b-4acc-8857-f463ec1bc195/barbican-api/0.log" Nov 27 17:43:22 crc kubenswrapper[4954]: I1127 17:43:22.925928 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7c4fd9778-zrzw7_3e0b062d-ff7b-4acc-8857-f463ec1bc195/barbican-api-log/0.log" Nov 27 17:43:23 crc kubenswrapper[4954]: I1127 17:43:23.053465 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-798f5f6896-mswxw_e09487f3-5539-4df4-8b9b-6da0b0b741de/barbican-keystone-listener/0.log" Nov 27 17:43:23 crc kubenswrapper[4954]: I1127 17:43:23.135174 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-798f5f6896-mswxw_e09487f3-5539-4df4-8b9b-6da0b0b741de/barbican-keystone-listener-log/0.log" Nov 27 17:43:23 crc kubenswrapper[4954]: I1127 17:43:23.230999 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7bd6cd4c89-x6dht_dc83f9b6-fbea-4463-8127-08590404f021/barbican-worker/0.log" Nov 27 17:43:23 crc kubenswrapper[4954]: I1127 17:43:23.293345 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7bd6cd4c89-x6dht_dc83f9b6-fbea-4463-8127-08590404f021/barbican-worker-log/0.log" Nov 27 17:43:23 crc kubenswrapper[4954]: I1127 17:43:23.355890 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-bz8ns_98ad0395-6bb9-46b3-a81b-3f4b1c2dad04/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:43:23 crc kubenswrapper[4954]: I1127 17:43:23.563805 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_61397bb6-588c-4c10-bd06-c7010f737605/ceilometer-notification-agent/0.log" Nov 27 17:43:23 crc kubenswrapper[4954]: I1127 17:43:23.600499 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_61397bb6-588c-4c10-bd06-c7010f737605/ceilometer-central-agent/0.log" Nov 27 17:43:23 crc kubenswrapper[4954]: I1127 17:43:23.617447 4954 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_61397bb6-588c-4c10-bd06-c7010f737605/proxy-httpd/0.log" Nov 27 17:43:23 crc kubenswrapper[4954]: I1127 17:43:23.707254 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_61397bb6-588c-4c10-bd06-c7010f737605/sg-core/0.log" Nov 27 17:43:23 crc kubenswrapper[4954]: I1127 17:43:23.814801 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_9d6609b2-5156-4d39-b4fd-05cb39b98915/cinder-api-log/0.log" Nov 27 17:43:23 crc kubenswrapper[4954]: I1127 17:43:23.848590 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_9d6609b2-5156-4d39-b4fd-05cb39b98915/cinder-api/0.log" Nov 27 17:43:24 crc kubenswrapper[4954]: I1127 17:43:24.020285 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_fa6f325f-3f75-4d35-9ffa-3298dc1a936e/cinder-scheduler/0.log" Nov 27 17:43:24 crc kubenswrapper[4954]: I1127 17:43:24.044547 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_fa6f325f-3f75-4d35-9ffa-3298dc1a936e/probe/0.log" Nov 27 17:43:24 crc kubenswrapper[4954]: I1127 17:43:24.163837 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fzxf9" Nov 27 17:43:24 crc kubenswrapper[4954]: I1127 17:43:24.227706 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fzxf9" Nov 27 17:43:24 crc kubenswrapper[4954]: I1127 17:43:24.243008 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-862k7_745fc0e0-ebc3-4a97-8858-148da2dbb20d/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:43:24 crc kubenswrapper[4954]: I1127 17:43:24.296949 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-g8nkp_5e3f28f3-6e95-438e-ba6e-587578b29bf9/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:43:24 crc kubenswrapper[4954]: I1127 17:43:24.355547 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jjjh5" Nov 27 17:43:24 crc kubenswrapper[4954]: I1127 17:43:24.405734 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jjjh5" Nov 27 17:43:24 crc kubenswrapper[4954]: I1127 17:43:24.420386 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-7klpn_b4e436ab-fb96-4213-be44-d08f62fa30ef/init/0.log" Nov 27 17:43:24 crc kubenswrapper[4954]: I1127 17:43:24.675967 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-7klpn_b4e436ab-fb96-4213-be44-d08f62fa30ef/init/0.log" Nov 27 17:43:24 crc kubenswrapper[4954]: I1127 17:43:24.692294 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-p9x6k_59b766b5-12a6-4e9c-b627-3d7705a04afc/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:43:24 crc kubenswrapper[4954]: I1127 17:43:24.710890 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-7klpn_b4e436ab-fb96-4213-be44-d08f62fa30ef/dnsmasq-dns/0.log" Nov 27 17:43:24 crc kubenswrapper[4954]: I1127 17:43:24.940965 4954 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_1301cc13-44b9-4a6e-b82d-cbea335ebc9a/glance-httpd/0.log" Nov 27 17:43:24 crc kubenswrapper[4954]: I1127 17:43:24.949842 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_1301cc13-44b9-4a6e-b82d-cbea335ebc9a/glance-log/0.log" Nov 27 17:43:25 crc kubenswrapper[4954]: I1127 17:43:25.128985 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07/glance-log/0.log" Nov 27 17:43:25 crc kubenswrapper[4954]: I1127 17:43:25.190321 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07/glance-httpd/0.log" Nov 27 17:43:25 crc kubenswrapper[4954]: I1127 17:43:25.311239 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-b5c6d8894-l7bzv_11ddebaa-610a-410a-a161-a5b89d87eb75/horizon/0.log" Nov 27 17:43:25 crc kubenswrapper[4954]: I1127 17:43:25.437328 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq_dbb8e909-5f3f-4076-b549-d489f37cd8e3/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:43:25 crc kubenswrapper[4954]: I1127 17:43:25.650389 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-b5c6d8894-l7bzv_11ddebaa-610a-410a-a161-a5b89d87eb75/horizon-log/0.log" Nov 27 17:43:25 crc kubenswrapper[4954]: I1127 17:43:25.662939 4954 scope.go:117] "RemoveContainer" containerID="72d14939d267f3263e2283bed9a7423259124d321383864fb9b0e804d14acba8" Nov 27 17:43:25 crc kubenswrapper[4954]: E1127 17:43:25.663211 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:43:25 crc kubenswrapper[4954]: I1127 17:43:25.712463 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-5sdh6_d7832bff-0ac7-4654-8277-92b9d5c04aa0/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:43:25 crc kubenswrapper[4954]: I1127 17:43:25.967465 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-68775c76df-2ppbs_a541738e-915f-413b-9b84-d57553ebc170/keystone-api/0.log" Nov 27 17:43:25 crc kubenswrapper[4954]: I1127 17:43:25.998069 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29404381-h5mc2_fa450761-82d0-4005-aee7-bcb56c03a5fd/keystone-cron/0.log" Nov 27 17:43:26 crc kubenswrapper[4954]: I1127 17:43:26.102403 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_1ba0b816-c965-4474-b923-73f572cdc1ab/kube-state-metrics/0.log" Nov 27 17:43:26 crc kubenswrapper[4954]: I1127 17:43:26.261995 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-flwcw_6d34dbe8-0864-4b92-bd50-5bdd57209a74/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:43:26 crc kubenswrapper[4954]: I1127 17:43:26.716188 4954 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-f6cfb75df-7gbdb_3c0fe668-ab8d-4bad-acdd-da6d230de548/neutron-api/0.log" Nov 27 17:43:26 crc kubenswrapper[4954]: I1127 17:43:26.761938 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-f6cfb75df-7gbdb_3c0fe668-ab8d-4bad-acdd-da6d230de548/neutron-httpd/0.log" Nov 27 17:43:26 crc kubenswrapper[4954]: I1127 17:43:26.793168 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp_5ea501ba-5c0c-4392-a64b-695c832dbb89/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:43:27 crc kubenswrapper[4954]: I1127 17:43:27.400197 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_6662243a-d2bd-4571-8e27-6b923a367942/nova-api-log/0.log" Nov 27 17:43:27 crc kubenswrapper[4954]: I1127 17:43:27.435688 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_b181e3d6-4f0e-40f1-ac14-96bcbb17622a/nova-cell0-conductor-conductor/0.log" Nov 27 17:43:27 crc kubenswrapper[4954]: I1127 17:43:27.640373 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_025a86e8-034b-4eef-8f20-14141598f0b4/nova-cell1-conductor-conductor/0.log" Nov 27 17:43:27 crc kubenswrapper[4954]: I1127 17:43:27.660623 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_6662243a-d2bd-4571-8e27-6b923a367942/nova-api-api/0.log" Nov 27 17:43:27 crc kubenswrapper[4954]: I1127 17:43:27.722667 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_a8c31305-69a0-477a-958f-d91daa9fe501/nova-cell1-novncproxy-novncproxy/0.log" Nov 27 17:43:27 crc kubenswrapper[4954]: I1127 17:43:27.898547 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fzxf9"] Nov 27 17:43:27 crc kubenswrapper[4954]: I1127 17:43:27.899189 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fzxf9" podUID="3a462761-cdc5-4050-9e5c-586e5233f7e1" containerName="registry-server" containerID="cri-o://10bc5022436deed16965ab427ff160a15f0c441fdc9ce9e2fbe37d8609f6216f" gracePeriod=2 Nov 27 17:43:28 crc kubenswrapper[4954]: I1127 17:43:28.030494 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-hq64s_7ab77d00-245a-41d2-a223-1caff56f23da/nova-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:43:28 crc kubenswrapper[4954]: I1127 17:43:28.106346 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jjjh5"] Nov 27 17:43:28 crc kubenswrapper[4954]: I1127 17:43:28.106608 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jjjh5" podUID="8ca2d7a5-ab0c-4a4e-ad9d-e856be549017" containerName="registry-server" containerID="cri-o://7be58a4c411c92b97a09556d09e83325800b64f3460f60179dde0491afdb06cf" gracePeriod=2 Nov 27 17:43:28 crc kubenswrapper[4954]: I1127 17:43:28.107906 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_51632054-40fc-42a7-b633-e1e35143689f/nova-metadata-log/0.log" Nov 27 17:43:28 crc kubenswrapper[4954]: I1127 17:43:28.461553 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fzxf9" Nov 27 17:43:28 crc kubenswrapper[4954]: I1127 17:43:28.473365 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac/mysql-bootstrap/0.log" Nov 27 17:43:28 crc kubenswrapper[4954]: I1127 17:43:28.521131 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a462761-cdc5-4050-9e5c-586e5233f7e1-catalog-content\") pod \"3a462761-cdc5-4050-9e5c-586e5233f7e1\" (UID: \"3a462761-cdc5-4050-9e5c-586e5233f7e1\") " Nov 27 17:43:28 crc kubenswrapper[4954]: I1127 17:43:28.521400 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crmn4\" (UniqueName: \"kubernetes.io/projected/3a462761-cdc5-4050-9e5c-586e5233f7e1-kube-api-access-crmn4\") pod \"3a462761-cdc5-4050-9e5c-586e5233f7e1\" (UID: \"3a462761-cdc5-4050-9e5c-586e5233f7e1\") " Nov 27 17:43:28 crc kubenswrapper[4954]: I1127 17:43:28.521474 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a462761-cdc5-4050-9e5c-586e5233f7e1-utilities\") pod \"3a462761-cdc5-4050-9e5c-586e5233f7e1\" (UID: \"3a462761-cdc5-4050-9e5c-586e5233f7e1\") " Nov 27 17:43:28 crc kubenswrapper[4954]: I1127 17:43:28.522486 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a462761-cdc5-4050-9e5c-586e5233f7e1-utilities" (OuterVolumeSpecName: "utilities") pod "3a462761-cdc5-4050-9e5c-586e5233f7e1" (UID: "3a462761-cdc5-4050-9e5c-586e5233f7e1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:43:28 crc kubenswrapper[4954]: I1127 17:43:28.535891 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a462761-cdc5-4050-9e5c-586e5233f7e1-kube-api-access-crmn4" (OuterVolumeSpecName: "kube-api-access-crmn4") pod "3a462761-cdc5-4050-9e5c-586e5233f7e1" (UID: "3a462761-cdc5-4050-9e5c-586e5233f7e1"). InnerVolumeSpecName "kube-api-access-crmn4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:43:28 crc kubenswrapper[4954]: I1127 17:43:28.566905 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a462761-cdc5-4050-9e5c-586e5233f7e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3a462761-cdc5-4050-9e5c-586e5233f7e1" (UID: "3a462761-cdc5-4050-9e5c-586e5233f7e1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:43:28 crc kubenswrapper[4954]: I1127 17:43:28.575556 4954 generic.go:334] "Generic (PLEG): container finished" podID="8ca2d7a5-ab0c-4a4e-ad9d-e856be549017" containerID="7be58a4c411c92b97a09556d09e83325800b64f3460f60179dde0491afdb06cf" exitCode=0 Nov 27 17:43:28 crc kubenswrapper[4954]: I1127 17:43:28.575635 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjjh5" event={"ID":"8ca2d7a5-ab0c-4a4e-ad9d-e856be549017","Type":"ContainerDied","Data":"7be58a4c411c92b97a09556d09e83325800b64f3460f60179dde0491afdb06cf"} Nov 27 17:43:28 crc kubenswrapper[4954]: I1127 17:43:28.593567 4954 generic.go:334] "Generic (PLEG): container finished" podID="3a462761-cdc5-4050-9e5c-586e5233f7e1" containerID="10bc5022436deed16965ab427ff160a15f0c441fdc9ce9e2fbe37d8609f6216f" exitCode=0 Nov 27 17:43:28 crc kubenswrapper[4954]: I1127 17:43:28.593833 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fzxf9" event={"ID":"3a462761-cdc5-4050-9e5c-586e5233f7e1","Type":"ContainerDied","Data":"10bc5022436deed16965ab427ff160a15f0c441fdc9ce9e2fbe37d8609f6216f"} Nov 27 17:43:28 crc kubenswrapper[4954]: I1127 17:43:28.593897 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fzxf9" Nov 27 17:43:28 crc kubenswrapper[4954]: I1127 17:43:28.593902 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fzxf9" event={"ID":"3a462761-cdc5-4050-9e5c-586e5233f7e1","Type":"ContainerDied","Data":"9a3d809c889ec2769cabe2884eb5acb3bad410a199c9bee779f593f3d00cbf31"} Nov 27 17:43:28 crc kubenswrapper[4954]: I1127 17:43:28.593928 4954 scope.go:117] "RemoveContainer" containerID="10bc5022436deed16965ab427ff160a15f0c441fdc9ce9e2fbe37d8609f6216f" Nov 27 17:43:28 crc kubenswrapper[4954]: I1127 17:43:28.618548 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jjjh5" Nov 27 17:43:28 crc kubenswrapper[4954]: I1127 17:43:28.623984 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crmn4\" (UniqueName: \"kubernetes.io/projected/3a462761-cdc5-4050-9e5c-586e5233f7e1-kube-api-access-crmn4\") on node \"crc\" DevicePath \"\"" Nov 27 17:43:28 crc kubenswrapper[4954]: I1127 17:43:28.624017 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a462761-cdc5-4050-9e5c-586e5233f7e1-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 17:43:28 crc kubenswrapper[4954]: I1127 17:43:28.624027 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a462761-cdc5-4050-9e5c-586e5233f7e1-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 17:43:28 crc kubenswrapper[4954]: I1127 17:43:28.632103 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_99c33ed6-9c2c-4eb0-be67-68c19d5479a7/nova-scheduler-scheduler/0.log" Nov 27 17:43:28 crc kubenswrapper[4954]: I1127 17:43:28.640993 4954 scope.go:117] "RemoveContainer" containerID="faf3ba6338e9d6f86b2c46fb462049c2bafd1588960e898560df93e8ba8cc299" Nov 27 17:43:28 crc kubenswrapper[4954]: I1127 17:43:28.692646 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fzxf9"] Nov 27 17:43:28 crc kubenswrapper[4954]: I1127 17:43:28.695417 4954 scope.go:117] "RemoveContainer" containerID="b2b3c81303eb7162bc2a549b492c7d2eeee4c9ac4c34cf776a535ee6ad748d68" Nov 27 17:43:28 crc kubenswrapper[4954]: I1127 17:43:28.716627 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac/mysql-bootstrap/0.log" Nov 27 17:43:28 crc kubenswrapper[4954]: I1127 17:43:28.727397 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ca2d7a5-ab0c-4a4e-ad9d-e856be549017-utilities\") pod \"8ca2d7a5-ab0c-4a4e-ad9d-e856be549017\" (UID: \"8ca2d7a5-ab0c-4a4e-ad9d-e856be549017\") " Nov 27 17:43:28 crc kubenswrapper[4954]: I1127 17:43:28.727600 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctqp2\" (UniqueName: \"kubernetes.io/projected/8ca2d7a5-ab0c-4a4e-ad9d-e856be549017-kube-api-access-ctqp2\") pod \"8ca2d7a5-ab0c-4a4e-ad9d-e856be549017\" (UID: \"8ca2d7a5-ab0c-4a4e-ad9d-e856be549017\") " Nov 27 17:43:28 crc kubenswrapper[4954]: I1127 17:43:28.727608 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fzxf9"] Nov 27 17:43:28 crc kubenswrapper[4954]: I1127 17:43:28.727752 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ca2d7a5-ab0c-4a4e-ad9d-e856be549017-catalog-content\") pod \"8ca2d7a5-ab0c-4a4e-ad9d-e856be549017\" (UID: \"8ca2d7a5-ab0c-4a4e-ad9d-e856be549017\") " Nov 27 17:43:28 crc kubenswrapper[4954]: I1127 17:43:28.728151 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ca2d7a5-ab0c-4a4e-ad9d-e856be549017-utilities" (OuterVolumeSpecName: "utilities") pod "8ca2d7a5-ab0c-4a4e-ad9d-e856be549017" (UID: "8ca2d7a5-ab0c-4a4e-ad9d-e856be549017"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:43:28 crc kubenswrapper[4954]: I1127 17:43:28.735747 4954 scope.go:117] "RemoveContainer" containerID="10bc5022436deed16965ab427ff160a15f0c441fdc9ce9e2fbe37d8609f6216f" Nov 27 17:43:28 crc kubenswrapper[4954]: I1127 17:43:28.735781 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ca2d7a5-ab0c-4a4e-ad9d-e856be549017-kube-api-access-ctqp2" (OuterVolumeSpecName: "kube-api-access-ctqp2") pod "8ca2d7a5-ab0c-4a4e-ad9d-e856be549017" (UID: "8ca2d7a5-ab0c-4a4e-ad9d-e856be549017"). InnerVolumeSpecName "kube-api-access-ctqp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:43:28 crc kubenswrapper[4954]: E1127 17:43:28.746903 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10bc5022436deed16965ab427ff160a15f0c441fdc9ce9e2fbe37d8609f6216f\": container with ID starting with 10bc5022436deed16965ab427ff160a15f0c441fdc9ce9e2fbe37d8609f6216f not found: ID does not exist" containerID="10bc5022436deed16965ab427ff160a15f0c441fdc9ce9e2fbe37d8609f6216f" Nov 27 17:43:28 crc kubenswrapper[4954]: I1127 17:43:28.746955 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10bc5022436deed16965ab427ff160a15f0c441fdc9ce9e2fbe37d8609f6216f"} err="failed to get container status \"10bc5022436deed16965ab427ff160a15f0c441fdc9ce9e2fbe37d8609f6216f\": rpc error: code = NotFound desc = could not find container \"10bc5022436deed16965ab427ff160a15f0c441fdc9ce9e2fbe37d8609f6216f\": container with ID starting with 10bc5022436deed16965ab427ff160a15f0c441fdc9ce9e2fbe37d8609f6216f not found: ID does not exist" Nov 27 17:43:28 crc kubenswrapper[4954]: I1127 17:43:28.746984 4954 scope.go:117] "RemoveContainer" containerID="faf3ba6338e9d6f86b2c46fb462049c2bafd1588960e898560df93e8ba8cc299" Nov 27 17:43:28 crc kubenswrapper[4954]: E1127 17:43:28.747415 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"faf3ba6338e9d6f86b2c46fb462049c2bafd1588960e898560df93e8ba8cc299\": container with ID starting with faf3ba6338e9d6f86b2c46fb462049c2bafd1588960e898560df93e8ba8cc299 not found: ID does not exist" containerID="faf3ba6338e9d6f86b2c46fb462049c2bafd1588960e898560df93e8ba8cc299" Nov 27 17:43:28 crc kubenswrapper[4954]: I1127 17:43:28.747441 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"faf3ba6338e9d6f86b2c46fb462049c2bafd1588960e898560df93e8ba8cc299"} err="failed to get container status \"faf3ba6338e9d6f86b2c46fb462049c2bafd1588960e898560df93e8ba8cc299\": rpc error: code = NotFound desc = could not find container \"faf3ba6338e9d6f86b2c46fb462049c2bafd1588960e898560df93e8ba8cc299\": container with ID starting with faf3ba6338e9d6f86b2c46fb462049c2bafd1588960e898560df93e8ba8cc299 not found: ID does not exist" Nov 27 17:43:28 crc kubenswrapper[4954]: I1127 17:43:28.747456 4954 scope.go:117] "RemoveContainer" containerID="b2b3c81303eb7162bc2a549b492c7d2eeee4c9ac4c34cf776a535ee6ad748d68" Nov 27 17:43:28 crc kubenswrapper[4954]: E1127 17:43:28.747862 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2b3c81303eb7162bc2a549b492c7d2eeee4c9ac4c34cf776a535ee6ad748d68\": container with ID starting with b2b3c81303eb7162bc2a549b492c7d2eeee4c9ac4c34cf776a535ee6ad748d68 not found: ID does not 
exist" containerID="b2b3c81303eb7162bc2a549b492c7d2eeee4c9ac4c34cf776a535ee6ad748d68" Nov 27 17:43:28 crc kubenswrapper[4954]: I1127 17:43:28.747894 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2b3c81303eb7162bc2a549b492c7d2eeee4c9ac4c34cf776a535ee6ad748d68"} err="failed to get container status \"b2b3c81303eb7162bc2a549b492c7d2eeee4c9ac4c34cf776a535ee6ad748d68\": rpc error: code = NotFound desc = could not find container \"b2b3c81303eb7162bc2a549b492c7d2eeee4c9ac4c34cf776a535ee6ad748d68\": container with ID starting with b2b3c81303eb7162bc2a549b492c7d2eeee4c9ac4c34cf776a535ee6ad748d68 not found: ID does not exist" Nov 27 17:43:28 crc kubenswrapper[4954]: I1127 17:43:28.820859 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ca2d7a5-ab0c-4a4e-ad9d-e856be549017-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8ca2d7a5-ab0c-4a4e-ad9d-e856be549017" (UID: "8ca2d7a5-ab0c-4a4e-ad9d-e856be549017"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:43:28 crc kubenswrapper[4954]: I1127 17:43:28.824175 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac/galera/0.log" Nov 27 17:43:28 crc kubenswrapper[4954]: I1127 17:43:28.831641 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctqp2\" (UniqueName: \"kubernetes.io/projected/8ca2d7a5-ab0c-4a4e-ad9d-e856be549017-kube-api-access-ctqp2\") on node \"crc\" DevicePath \"\"" Nov 27 17:43:28 crc kubenswrapper[4954]: I1127 17:43:28.831673 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ca2d7a5-ab0c-4a4e-ad9d-e856be549017-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 17:43:28 crc kubenswrapper[4954]: I1127 17:43:28.831683 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ca2d7a5-ab0c-4a4e-ad9d-e856be549017-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 17:43:28 crc kubenswrapper[4954]: I1127 17:43:28.973975 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_591d8033-08c2-4048-b24e-34508babfbad/mysql-bootstrap/0.log" Nov 27 17:43:29 crc kubenswrapper[4954]: I1127 17:43:29.183608 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_591d8033-08c2-4048-b24e-34508babfbad/galera/0.log" Nov 27 17:43:29 crc kubenswrapper[4954]: I1127 17:43:29.208971 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_591d8033-08c2-4048-b24e-34508babfbad/mysql-bootstrap/0.log" Nov 27 17:43:29 crc kubenswrapper[4954]: I1127 17:43:29.425151 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_871d6a1f-a817-45c5-a3f5-3f0e47ef9bf3/openstackclient/0.log" Nov 27 17:43:29 crc kubenswrapper[4954]: I1127 17:43:29.477461 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-wnm29_49863f24-1603-49e2-835c-31ced01d9f7f/openstack-network-exporter/0.log" Nov 27 17:43:29 crc kubenswrapper[4954]: I1127 17:43:29.609566 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjjh5" 
event={"ID":"8ca2d7a5-ab0c-4a4e-ad9d-e856be549017","Type":"ContainerDied","Data":"b9a05a577bcae03e8638a1d8824e6c482913f817aa932b7be5a798f915ae4b0e"} Nov 27 17:43:29 crc kubenswrapper[4954]: I1127 17:43:29.609648 4954 scope.go:117] "RemoveContainer" containerID="7be58a4c411c92b97a09556d09e83325800b64f3460f60179dde0491afdb06cf" Nov 27 17:43:29 crc kubenswrapper[4954]: I1127 17:43:29.609667 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jjjh5" Nov 27 17:43:29 crc kubenswrapper[4954]: I1127 17:43:29.646053 4954 scope.go:117] "RemoveContainer" containerID="c0c06fd00c2d00744147d749dea9e48861c611e4b12d8ceab3933e981784305b" Nov 27 17:43:29 crc kubenswrapper[4954]: I1127 17:43:29.656502 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jjjh5"] Nov 27 17:43:29 crc kubenswrapper[4954]: I1127 17:43:29.666233 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jjjh5"] Nov 27 17:43:29 crc kubenswrapper[4954]: I1127 17:43:29.690190 4954 scope.go:117] "RemoveContainer" containerID="33d4857c952ea09911545667b78d27170bd36d594c5b51d1dad4ab0f0a55565d" Nov 27 17:43:29 crc kubenswrapper[4954]: I1127 17:43:29.744261 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_51632054-40fc-42a7-b633-e1e35143689f/nova-metadata-metadata/0.log" Nov 27 17:43:29 crc kubenswrapper[4954]: I1127 17:43:29.758816 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-btgpk_abad518f-43af-457b-add5-c0291513ad71/ovsdb-server-init/0.log" Nov 27 17:43:29 crc kubenswrapper[4954]: I1127 17:43:29.878290 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-btgpk_abad518f-43af-457b-add5-c0291513ad71/ovsdb-server-init/0.log" Nov 27 17:43:29 crc kubenswrapper[4954]: I1127 17:43:29.902697 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-btgpk_abad518f-43af-457b-add5-c0291513ad71/ovs-vswitchd/0.log" Nov 27 17:43:29 crc kubenswrapper[4954]: I1127 17:43:29.913145 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-btgpk_abad518f-43af-457b-add5-c0291513ad71/ovsdb-server/0.log" Nov 27 17:43:30 crc kubenswrapper[4954]: I1127 17:43:30.138418 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-wzf94_3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:43:30 crc kubenswrapper[4954]: I1127 17:43:30.170167 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-s7sc8_2a98905f-a2dd-4eb2-9a4f-437eb3626871/ovn-controller/0.log" Nov 27 17:43:30 crc kubenswrapper[4954]: I1127 17:43:30.351036 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_d11a38a9-30c1-44d2-81ca-965f0dfbde96/openstack-network-exporter/0.log" Nov 27 17:43:30 crc kubenswrapper[4954]: I1127 17:43:30.383891 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_d11a38a9-30c1-44d2-81ca-965f0dfbde96/ovn-northd/0.log" Nov 27 17:43:30 crc kubenswrapper[4954]: I1127 17:43:30.496221 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_585214bf-1a7b-426d-b1a6-d26e69e0116f/openstack-network-exporter/0.log" Nov 27 17:43:30 crc kubenswrapper[4954]: I1127 17:43:30.568181 4954 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_585214bf-1a7b-426d-b1a6-d26e69e0116f/ovsdbserver-nb/0.log" Nov 27 17:43:30 crc kubenswrapper[4954]: I1127 17:43:30.672852 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_6f8f5ac1-9978-4d8a-b12d-f902e9cb316c/openstack-network-exporter/0.log" Nov 27 17:43:30 crc kubenswrapper[4954]: I1127 17:43:30.679415 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a462761-cdc5-4050-9e5c-586e5233f7e1" path="/var/lib/kubelet/pods/3a462761-cdc5-4050-9e5c-586e5233f7e1/volumes" Nov 27 17:43:30 crc kubenswrapper[4954]: I1127 17:43:30.680228 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ca2d7a5-ab0c-4a4e-ad9d-e856be549017" path="/var/lib/kubelet/pods/8ca2d7a5-ab0c-4a4e-ad9d-e856be549017/volumes" Nov 27 17:43:30 crc kubenswrapper[4954]: I1127 17:43:30.794097 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_6f8f5ac1-9978-4d8a-b12d-f902e9cb316c/ovsdbserver-sb/0.log" Nov 27 17:43:30 crc kubenswrapper[4954]: I1127 17:43:30.910019 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-d8d4694bd-z9zk4_a4617263-6b9f-4f0c-af69-9d589143eb12/placement-api/0.log" Nov 27 17:43:30 crc kubenswrapper[4954]: I1127 17:43:30.995787 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-d8d4694bd-z9zk4_a4617263-6b9f-4f0c-af69-9d589143eb12/placement-log/0.log" Nov 27 17:43:31 crc kubenswrapper[4954]: I1127 17:43:31.101473 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f048cd15-3583-44fd-a9ca-1288e89f29b3/setup-container/0.log" Nov 27 17:43:31 crc kubenswrapper[4954]: I1127 17:43:31.272801 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f048cd15-3583-44fd-a9ca-1288e89f29b3/setup-container/0.log" Nov 27 17:43:31 crc kubenswrapper[4954]: I1127 17:43:31.281658 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f048cd15-3583-44fd-a9ca-1288e89f29b3/rabbitmq/0.log" Nov 27 17:43:31 crc kubenswrapper[4954]: I1127 17:43:31.335045 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7e3c0607-0f08-4188-9995-c0a2a253fdc5/setup-container/0.log" Nov 27 17:43:31 crc kubenswrapper[4954]: I1127 17:43:31.681040 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7e3c0607-0f08-4188-9995-c0a2a253fdc5/setup-container/0.log" Nov 27 17:43:31 crc kubenswrapper[4954]: I1127 17:43:31.733860 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-gpwcf_6e2def23-1765-4015-b698-c2b8516a6f18/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:43:31 crc kubenswrapper[4954]: I1127 17:43:31.744857 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7e3c0607-0f08-4188-9995-c0a2a253fdc5/rabbitmq/0.log" Nov 27 17:43:31 crc kubenswrapper[4954]: I1127 17:43:31.911887 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-qvnb8_39bece64-6033-4ca3-846d-6718f68f1f6d/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:43:32 crc kubenswrapper[4954]: I1127 17:43:32.007938 4954 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-jwhw6_d294865e-7999-4e81-818f-3a5db24b7f01/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:43:32 crc kubenswrapper[4954]: I1127 17:43:32.245452 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-6l884_cfb3cf23-1ad0-47ac-af59-8b8ae7e79678/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:43:32 crc kubenswrapper[4954]: I1127 17:43:32.271563 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-xgjv5_694335d5-113f-4c2b-ab58-22fc7b866e46/ssh-known-hosts-edpm-deployment/0.log" Nov 27 17:43:32 crc kubenswrapper[4954]: I1127 17:43:32.576015 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-85cf58799f-l72lc_9de053dc-d10c-4999-9019-f7221fb9e237/proxy-server/0.log" Nov 27 17:43:32 crc kubenswrapper[4954]: I1127 17:43:32.677492 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-85cf58799f-l72lc_9de053dc-d10c-4999-9019-f7221fb9e237/proxy-httpd/0.log" Nov 27 17:43:32 crc kubenswrapper[4954]: I1127 17:43:32.704651 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-cbrqw_794c6bdd-2ec7-458f-99ed-23383a740479/swift-ring-rebalance/0.log" Nov 27 17:43:32 crc kubenswrapper[4954]: I1127 17:43:32.771197 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cff965fb-87ef-40a5-9dff-7d10d74cc09c/account-auditor/0.log" Nov 27 17:43:32 crc kubenswrapper[4954]: I1127 17:43:32.899071 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cff965fb-87ef-40a5-9dff-7d10d74cc09c/account-reaper/0.log" Nov 27 17:43:32 crc kubenswrapper[4954]: I1127 17:43:32.926798 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cff965fb-87ef-40a5-9dff-7d10d74cc09c/account-replicator/0.log" Nov 27 17:43:32 crc kubenswrapper[4954]: I1127 17:43:32.996213 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cff965fb-87ef-40a5-9dff-7d10d74cc09c/account-server/0.log" Nov 27 17:43:33 crc kubenswrapper[4954]: I1127 17:43:33.115479 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cff965fb-87ef-40a5-9dff-7d10d74cc09c/container-auditor/0.log" Nov 27 17:43:33 crc kubenswrapper[4954]: I1127 17:43:33.152235 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cff965fb-87ef-40a5-9dff-7d10d74cc09c/container-server/0.log" Nov 27 17:43:33 crc kubenswrapper[4954]: I1127 17:43:33.171090 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cff965fb-87ef-40a5-9dff-7d10d74cc09c/container-replicator/0.log" Nov 27 17:43:33 crc kubenswrapper[4954]: I1127 17:43:33.202909 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cff965fb-87ef-40a5-9dff-7d10d74cc09c/container-updater/0.log" Nov 27 17:43:33 crc kubenswrapper[4954]: I1127 17:43:33.326733 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cff965fb-87ef-40a5-9dff-7d10d74cc09c/object-expirer/0.log" Nov 27 17:43:33 crc kubenswrapper[4954]: I1127 17:43:33.417523 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cff965fb-87ef-40a5-9dff-7d10d74cc09c/object-server/0.log" Nov 27 17:43:33 crc kubenswrapper[4954]: 
I1127 17:43:33.452546 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cff965fb-87ef-40a5-9dff-7d10d74cc09c/object-auditor/0.log" Nov 27 17:43:33 crc kubenswrapper[4954]: I1127 17:43:33.484042 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cff965fb-87ef-40a5-9dff-7d10d74cc09c/object-replicator/0.log" Nov 27 17:43:33 crc kubenswrapper[4954]: I1127 17:43:33.576070 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cff965fb-87ef-40a5-9dff-7d10d74cc09c/object-updater/0.log" Nov 27 17:43:33 crc kubenswrapper[4954]: I1127 17:43:33.639423 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cff965fb-87ef-40a5-9dff-7d10d74cc09c/rsync/0.log" Nov 27 17:43:33 crc kubenswrapper[4954]: I1127 17:43:33.661437 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cff965fb-87ef-40a5-9dff-7d10d74cc09c/swift-recon-cron/0.log" Nov 27 17:43:33 crc kubenswrapper[4954]: I1127 17:43:33.948344 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-r476v_200fb5dd-f5ad-4f82-8a9c-e8e378075448/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:43:34 crc kubenswrapper[4954]: I1127 17:43:34.018785 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_ae22fda2-42ce-4b9e-9daf-00e886b8449b/tempest-tests-tempest-tests-runner/0.log" Nov 27 17:43:34 crc kubenswrapper[4954]: I1127 17:43:34.148724 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_9305c47b-ee95-423d-b4dc-f8a5fbe9cd6c/test-operator-logs-container/0.log" Nov 27 17:43:34 crc kubenswrapper[4954]: I1127 17:43:34.213808 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-b66qm_655b8641-7aaf-4f45-b8a0-b23fbbfa3abd/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:43:39 crc kubenswrapper[4954]: I1127 17:43:39.662083 4954 scope.go:117] "RemoveContainer" containerID="72d14939d267f3263e2283bed9a7423259124d321383864fb9b0e804d14acba8" Nov 27 17:43:39 crc kubenswrapper[4954]: E1127 17:43:39.663006 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:43:43 crc kubenswrapper[4954]: I1127 17:43:43.577036 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_808630a2-42dd-48c9-a004-749515cb771b/memcached/0.log" Nov 27 17:43:44 crc kubenswrapper[4954]: I1127 17:43:44.052736 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/infra-operator-controller-manager-57548d458d-4vpsc" podUID="736ef0f4-e471-4acd-8569-2a6d6d260f67" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.76:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 27 17:43:44 crc kubenswrapper[4954]: I1127 17:43:44.052792 4954 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack-operators/openstack-operator-controller-operator-757f5977c4-9sxch" podUID="67df4fc6-9215-4441-955b-d7d740c5db1e" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.57:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 27 17:43:44 crc kubenswrapper[4954]: I1127 17:43:44.052857 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/infra-operator-controller-manager-57548d458d-4vpsc" podUID="736ef0f4-e471-4acd-8569-2a6d6d260f67" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.76:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 27 17:43:44 crc kubenswrapper[4954]: I1127 17:43:44.053537 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-controller-operator-757f5977c4-9sxch" podUID="67df4fc6-9215-4441-955b-d7d740c5db1e" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.57:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 27 17:43:54 crc kubenswrapper[4954]: I1127 17:43:54.665820 4954 scope.go:117] "RemoveContainer" containerID="72d14939d267f3263e2283bed9a7423259124d321383864fb9b0e804d14acba8" Nov 27 17:43:54 crc kubenswrapper[4954]: E1127 17:43:54.666755 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:44:03 crc kubenswrapper[4954]: I1127 17:44:03.811485 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7b3267e84ac43849283543a2f97dd6a38e4585c72e47c859f911726b07krfj8_4f0a14af-754e-4601-aadc-77e1a310c088/util/0.log" Nov 27 17:44:04 crc kubenswrapper[4954]: I1127 17:44:04.003117 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7b3267e84ac43849283543a2f97dd6a38e4585c72e47c859f911726b07krfj8_4f0a14af-754e-4601-aadc-77e1a310c088/pull/0.log" Nov 27 17:44:04 crc kubenswrapper[4954]: I1127 17:44:04.003297 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7b3267e84ac43849283543a2f97dd6a38e4585c72e47c859f911726b07krfj8_4f0a14af-754e-4601-aadc-77e1a310c088/util/0.log" Nov 27 17:44:04 crc kubenswrapper[4954]: I1127 17:44:04.050137 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7b3267e84ac43849283543a2f97dd6a38e4585c72e47c859f911726b07krfj8_4f0a14af-754e-4601-aadc-77e1a310c088/pull/0.log" Nov 27 17:44:04 crc kubenswrapper[4954]: I1127 17:44:04.190598 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7b3267e84ac43849283543a2f97dd6a38e4585c72e47c859f911726b07krfj8_4f0a14af-754e-4601-aadc-77e1a310c088/pull/0.log" Nov 27 17:44:04 crc kubenswrapper[4954]: I1127 17:44:04.210989 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7b3267e84ac43849283543a2f97dd6a38e4585c72e47c859f911726b07krfj8_4f0a14af-754e-4601-aadc-77e1a310c088/extract/0.log" Nov 27 17:44:04 crc kubenswrapper[4954]: I1127 17:44:04.234424 4954 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_7b3267e84ac43849283543a2f97dd6a38e4585c72e47c859f911726b07krfj8_4f0a14af-754e-4601-aadc-77e1a310c088/util/0.log" Nov 27 17:44:04 crc kubenswrapper[4954]: I1127 17:44:04.399059 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7b64f4fb85-nz28b_50ec526e-d6db-45fa-8b99-bd795b4c3690/kube-rbac-proxy/0.log" Nov 27 17:44:04 crc kubenswrapper[4954]: I1127 17:44:04.434000 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7b64f4fb85-nz28b_50ec526e-d6db-45fa-8b99-bd795b4c3690/manager/0.log" Nov 27 17:44:04 crc kubenswrapper[4954]: I1127 17:44:04.608049 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6b7f75547b-4rg5t_11ca1308-8c7a-4a3d-a283-2533abc54c25/kube-rbac-proxy/0.log" Nov 27 17:44:04 crc kubenswrapper[4954]: I1127 17:44:04.638698 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6b7f75547b-4rg5t_11ca1308-8c7a-4a3d-a283-2533abc54c25/manager/0.log" Nov 27 17:44:04 crc kubenswrapper[4954]: I1127 17:44:04.732086 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-955677c94-dzjch_a09ff3fd-b10f-421c-a3a5-aa7dc4dcff95/kube-rbac-proxy/0.log" Nov 27 17:44:04 crc kubenswrapper[4954]: I1127 17:44:04.829202 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-955677c94-dzjch_a09ff3fd-b10f-421c-a3a5-aa7dc4dcff95/manager/0.log" Nov 27 17:44:04 crc kubenswrapper[4954]: I1127 17:44:04.885449 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-589cbd6b5b-8ghg2_7c8dd8cc-7be7-41f9-ac93-139dc9e83274/kube-rbac-proxy/0.log" Nov 27 17:44:04 crc kubenswrapper[4954]: I1127 17:44:04.998711 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-589cbd6b5b-8ghg2_7c8dd8cc-7be7-41f9-ac93-139dc9e83274/manager/0.log" Nov 27 17:44:05 crc kubenswrapper[4954]: I1127 17:44:05.080860 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5b77f656f-zlt7m_5884eab6-e3c0-45de-b93d-73392533b780/kube-rbac-proxy/0.log" Nov 27 17:44:05 crc kubenswrapper[4954]: I1127 17:44:05.151327 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5b77f656f-zlt7m_5884eab6-e3c0-45de-b93d-73392533b780/manager/0.log" Nov 27 17:44:05 crc kubenswrapper[4954]: I1127 17:44:05.241429 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5d494799bf-nnj6l_c7c7b69c-1d63-4d4b-ac0b-ad2be204cf8a/kube-rbac-proxy/0.log" Nov 27 17:44:05 crc kubenswrapper[4954]: I1127 17:44:05.324821 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5d494799bf-nnj6l_c7c7b69c-1d63-4d4b-ac0b-ad2be204cf8a/manager/0.log" Nov 27 17:44:05 crc kubenswrapper[4954]: I1127 17:44:05.379684 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-4vpsc_736ef0f4-e471-4acd-8569-2a6d6d260f67/kube-rbac-proxy/0.log" Nov 27 17:44:05 crc kubenswrapper[4954]: I1127 17:44:05.515027 4954 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-67cb4dc6d4-42dmk_9366c02a-e022-47e4-86c2-35d1e9a54cf4/kube-rbac-proxy/0.log" Nov 27 17:44:05 crc kubenswrapper[4954]: I1127 17:44:05.586352 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-67cb4dc6d4-42dmk_9366c02a-e022-47e4-86c2-35d1e9a54cf4/manager/0.log" Nov 27 17:44:05 crc kubenswrapper[4954]: I1127 17:44:05.612709 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-4vpsc_736ef0f4-e471-4acd-8569-2a6d6d260f67/manager/0.log" Nov 27 17:44:05 crc kubenswrapper[4954]: I1127 17:44:05.755721 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7b4567c7cf-bw2j9_cc869191-7d3d-4192-bf48-a48625bff6ff/kube-rbac-proxy/0.log" Nov 27 17:44:05 crc kubenswrapper[4954]: I1127 17:44:05.789595 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7b4567c7cf-bw2j9_cc869191-7d3d-4192-bf48-a48625bff6ff/manager/0.log" Nov 27 17:44:05 crc kubenswrapper[4954]: I1127 17:44:05.978187 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5d499bf58b-2jpwm_6dbcc715-b375-4776-87ff-4c5ecad80975/manager/0.log" Nov 27 17:44:05 crc kubenswrapper[4954]: I1127 17:44:05.980939 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5d499bf58b-2jpwm_6dbcc715-b375-4776-87ff-4c5ecad80975/kube-rbac-proxy/0.log" Nov 27 17:44:06 crc kubenswrapper[4954]: I1127 17:44:06.123979 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-66f4dd4bc7-5vqr2_ff3108ae-4629-448b-80d3-949e631c60d8/kube-rbac-proxy/0.log" Nov 27 17:44:06 crc kubenswrapper[4954]: I1127 17:44:06.290303 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-66f4dd4bc7-5vqr2_ff3108ae-4629-448b-80d3-949e631c60d8/manager/0.log" Nov 27 17:44:06 crc kubenswrapper[4954]: I1127 17:44:06.293979 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6fdcddb789-4g8kb_56f35029-dbcb-437a-94ed-3eac63c5145c/kube-rbac-proxy/0.log" Nov 27 17:44:06 crc kubenswrapper[4954]: I1127 17:44:06.396748 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6fdcddb789-4g8kb_56f35029-dbcb-437a-94ed-3eac63c5145c/manager/0.log" Nov 27 17:44:07 crc kubenswrapper[4954]: I1127 17:44:07.276102 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79556f57fc-p55vw_b4fb4c16-8870-494e-a075-ee70d251da46/kube-rbac-proxy/0.log" Nov 27 17:44:07 crc kubenswrapper[4954]: I1127 17:44:07.379605 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79556f57fc-p55vw_b4fb4c16-8870-494e-a075-ee70d251da46/manager/0.log" Nov 27 17:44:07 crc kubenswrapper[4954]: I1127 17:44:07.466055 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-64cdc6ff96-9pwxb_770db406-d44c-490f-8409-f5b3e8f66145/kube-rbac-proxy/0.log" Nov 27 17:44:07 crc kubenswrapper[4954]: I1127 
17:44:07.467293 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-64cdc6ff96-9pwxb_770db406-d44c-490f-8409-f5b3e8f66145/manager/0.log" Nov 27 17:44:07 crc kubenswrapper[4954]: I1127 17:44:07.541966 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5fcdb54b6bqfwjb_cbeef148-5a6f-4738-83f0-eae93d81bae3/kube-rbac-proxy/0.log" Nov 27 17:44:07 crc kubenswrapper[4954]: I1127 17:44:07.636473 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5fcdb54b6bqfwjb_cbeef148-5a6f-4738-83f0-eae93d81bae3/manager/0.log" Nov 27 17:44:07 crc kubenswrapper[4954]: I1127 17:44:07.955232 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-xhxgn_e300d0a8-a678-4065-bbcd-a886791e9e1a/registry-server/0.log" Nov 27 17:44:07 crc kubenswrapper[4954]: I1127 17:44:07.961458 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-757f5977c4-9sxch_67df4fc6-9215-4441-955b-d7d740c5db1e/operator/0.log" Nov 27 17:44:08 crc kubenswrapper[4954]: I1127 17:44:08.067128 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-56897c768d-mln9c_7dcd119b-9cb2-48ab-ac2f-2f0b10d5b2f0/kube-rbac-proxy/0.log" Nov 27 17:44:08 crc kubenswrapper[4954]: I1127 17:44:08.681810 4954 scope.go:117] "RemoveContainer" containerID="72d14939d267f3263e2283bed9a7423259124d321383864fb9b0e804d14acba8" Nov 27 17:44:08 crc kubenswrapper[4954]: E1127 17:44:08.683062 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:44:08 crc kubenswrapper[4954]: I1127 17:44:08.791077 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-57988cc5b5-nv8bz_73b53349-7e1d-499f-918e-e25598787e70/manager/0.log" Nov 27 17:44:08 crc kubenswrapper[4954]: I1127 17:44:08.829867 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-57988cc5b5-nv8bz_73b53349-7e1d-499f-918e-e25598787e70/kube-rbac-proxy/0.log" Nov 27 17:44:08 crc kubenswrapper[4954]: I1127 17:44:08.839128 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-56897c768d-mln9c_7dcd119b-9cb2-48ab-ac2f-2f0b10d5b2f0/manager/0.log" Nov 27 17:44:09 crc kubenswrapper[4954]: I1127 17:44:09.006841 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-556d4f4767-6wqxx_7eefae7c-fef6-47b3-8f89-4856b6ae1980/manager/0.log" Nov 27 17:44:09 crc kubenswrapper[4954]: I1127 17:44:09.064633 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-xvk89_8fad5f5d-c6a2-497f-8524-1ae501d6a444/operator/0.log" Nov 27 17:44:09 crc kubenswrapper[4954]: I1127 17:44:09.074281 4954 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_swift-operator-controller-manager-d77b94747-mmr72_376db7a5-650f-4327-8e03-2f2be98969a0/kube-rbac-proxy/0.log" Nov 27 17:44:09 crc kubenswrapper[4954]: I1127 17:44:09.227074 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-d77b94747-mmr72_376db7a5-650f-4327-8e03-2f2be98969a0/manager/0.log" Nov 27 17:44:09 crc kubenswrapper[4954]: I1127 17:44:09.309275 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-wr8t4_523e3a36-bc9e-4698-af7d-e7ecd3b7a740/kube-rbac-proxy/0.log" Nov 27 17:44:09 crc kubenswrapper[4954]: I1127 17:44:09.329037 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-wr8t4_523e3a36-bc9e-4698-af7d-e7ecd3b7a740/manager/0.log" Nov 27 17:44:09 crc kubenswrapper[4954]: I1127 17:44:09.399820 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd6c7f4c8-7dmz6_146450d6-91cc-4600-9712-449fcf5328b2/kube-rbac-proxy/0.log" Nov 27 17:44:09 crc kubenswrapper[4954]: I1127 17:44:09.457824 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd6c7f4c8-7dmz6_146450d6-91cc-4600-9712-449fcf5328b2/manager/0.log" Nov 27 17:44:09 crc kubenswrapper[4954]: I1127 17:44:09.521744 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-656dcb59d4-wg8x7_6a00b9f9-d61f-411d-897d-496d8c8b3501/manager/0.log" Nov 27 17:44:09 crc kubenswrapper[4954]: I1127 17:44:09.526260 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-656dcb59d4-wg8x7_6a00b9f9-d61f-411d-897d-496d8c8b3501/kube-rbac-proxy/0.log" Nov 27 17:44:20 crc kubenswrapper[4954]: I1127 17:44:20.662753 4954 scope.go:117] "RemoveContainer" containerID="72d14939d267f3263e2283bed9a7423259124d321383864fb9b0e804d14acba8" Nov 27 17:44:20 crc kubenswrapper[4954]: E1127 17:44:20.663530 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:44:28 crc kubenswrapper[4954]: I1127 17:44:28.961699 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-zmv7j_4f13cd59-b0f9-4562-a20b-d3d8f4bca5bb/control-plane-machine-set-operator/0.log" Nov 27 17:44:29 crc kubenswrapper[4954]: I1127 17:44:29.081957 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-h48pg_daf9759f-1f7d-4613-b734-a39f4552222e/kube-rbac-proxy/0.log" Nov 27 17:44:29 crc kubenswrapper[4954]: I1127 17:44:29.118237 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-h48pg_daf9759f-1f7d-4613-b734-a39f4552222e/machine-api-operator/0.log" Nov 27 17:44:33 crc kubenswrapper[4954]: I1127 17:44:33.662565 4954 scope.go:117] "RemoveContainer" 
containerID="72d14939d267f3263e2283bed9a7423259124d321383864fb9b0e804d14acba8" Nov 27 17:44:33 crc kubenswrapper[4954]: E1127 17:44:33.663485 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:44:40 crc kubenswrapper[4954]: I1127 17:44:40.029941 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-2t966_51fb16a6-3c9e-4cca-a603-8b71f0b91ee1/cert-manager-controller/0.log" Nov 27 17:44:40 crc kubenswrapper[4954]: I1127 17:44:40.195548 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-96dn8_f6f24261-8d7e-454f-8d20-2a35f12114c6/cert-manager-cainjector/0.log" Nov 27 17:44:40 crc kubenswrapper[4954]: I1127 17:44:40.207682 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-ghjsr_04065317-2688-429e-8362-970a4f083d14/cert-manager-webhook/0.log" Nov 27 17:44:48 crc kubenswrapper[4954]: I1127 17:44:48.668196 4954 scope.go:117] "RemoveContainer" containerID="72d14939d267f3263e2283bed9a7423259124d321383864fb9b0e804d14acba8" Nov 27 17:44:48 crc kubenswrapper[4954]: E1127 17:44:48.669005 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:44:51 crc kubenswrapper[4954]: I1127 17:44:51.866699 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-g5fhs_88437c38-051a-4331-bfd9-1b5356e88818/nmstate-console-plugin/0.log" Nov 27 17:44:52 crc kubenswrapper[4954]: I1127 17:44:52.113350 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-m4dwz_80ecd4a6-6bf2-4533-ab69-a5a12b747d81/nmstate-handler/0.log" Nov 27 17:44:52 crc kubenswrapper[4954]: I1127 17:44:52.202261 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-bkn4s_e9b96f60-bef6-430b-8f44-d5e602d140ee/nmstate-metrics/0.log" Nov 27 17:44:52 crc kubenswrapper[4954]: I1127 17:44:52.212706 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-bkn4s_e9b96f60-bef6-430b-8f44-d5e602d140ee/kube-rbac-proxy/0.log" Nov 27 17:44:52 crc kubenswrapper[4954]: I1127 17:44:52.303004 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-qx7sx_742e2266-3aa1-4c59-958e-8200fea0b45c/nmstate-operator/0.log" Nov 27 17:44:52 crc kubenswrapper[4954]: I1127 17:44:52.441473 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-89lrn_391ad61e-fdf4-41bf-b3eb-a8950896debb/nmstate-webhook/0.log" Nov 27 17:45:00 crc kubenswrapper[4954]: I1127 17:45:00.278049 4954 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29404425-jjrc2"] Nov 27 17:45:00 crc kubenswrapper[4954]: E1127 17:45:00.279066 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ca2d7a5-ab0c-4a4e-ad9d-e856be549017" containerName="registry-server" Nov 27 17:45:00 crc kubenswrapper[4954]: I1127 17:45:00.279083 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ca2d7a5-ab0c-4a4e-ad9d-e856be549017" containerName="registry-server" Nov 27 17:45:00 crc kubenswrapper[4954]: E1127 17:45:00.279097 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a462761-cdc5-4050-9e5c-586e5233f7e1" containerName="extract-utilities" Nov 27 17:45:00 crc kubenswrapper[4954]: I1127 17:45:00.279596 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a462761-cdc5-4050-9e5c-586e5233f7e1" containerName="extract-utilities" Nov 27 17:45:00 crc kubenswrapper[4954]: E1127 17:45:00.279618 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a462761-cdc5-4050-9e5c-586e5233f7e1" containerName="extract-content" Nov 27 17:45:00 crc kubenswrapper[4954]: I1127 17:45:00.279627 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a462761-cdc5-4050-9e5c-586e5233f7e1" containerName="extract-content" Nov 27 17:45:00 crc kubenswrapper[4954]: E1127 17:45:00.279649 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ca2d7a5-ab0c-4a4e-ad9d-e856be549017" containerName="extract-utilities" Nov 27 17:45:00 crc kubenswrapper[4954]: I1127 17:45:00.279657 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ca2d7a5-ab0c-4a4e-ad9d-e856be549017" containerName="extract-utilities" Nov 27 17:45:00 crc kubenswrapper[4954]: E1127 17:45:00.279673 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a462761-cdc5-4050-9e5c-586e5233f7e1" containerName="registry-server" Nov 27 17:45:00 crc kubenswrapper[4954]: I1127 17:45:00.279680 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a462761-cdc5-4050-9e5c-586e5233f7e1" containerName="registry-server" Nov 27 17:45:00 crc kubenswrapper[4954]: E1127 17:45:00.279695 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef6e4834-a3ef-4c61-b228-5a23a439395f" containerName="container-00" Nov 27 17:45:00 crc kubenswrapper[4954]: I1127 17:45:00.279702 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef6e4834-a3ef-4c61-b228-5a23a439395f" containerName="container-00" Nov 27 17:45:00 crc kubenswrapper[4954]: E1127 17:45:00.279711 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ca2d7a5-ab0c-4a4e-ad9d-e856be549017" containerName="extract-content" Nov 27 17:45:00 crc kubenswrapper[4954]: I1127 17:45:00.279718 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ca2d7a5-ab0c-4a4e-ad9d-e856be549017" containerName="extract-content" Nov 27 17:45:00 crc kubenswrapper[4954]: I1127 17:45:00.279966 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ca2d7a5-ab0c-4a4e-ad9d-e856be549017" containerName="registry-server" Nov 27 17:45:00 crc kubenswrapper[4954]: I1127 17:45:00.280004 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef6e4834-a3ef-4c61-b228-5a23a439395f" containerName="container-00" Nov 27 17:45:00 crc kubenswrapper[4954]: I1127 17:45:00.280020 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a462761-cdc5-4050-9e5c-586e5233f7e1" containerName="registry-server" Nov 27 17:45:00 crc kubenswrapper[4954]: I1127 17:45:00.280793 4954 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404425-jjrc2" Nov 27 17:45:00 crc kubenswrapper[4954]: I1127 17:45:00.286212 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 27 17:45:00 crc kubenswrapper[4954]: I1127 17:45:00.286544 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 27 17:45:00 crc kubenswrapper[4954]: I1127 17:45:00.303118 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404425-jjrc2"] Nov 27 17:45:00 crc kubenswrapper[4954]: I1127 17:45:00.436199 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b47c99d-5acb-4846-9bdb-653c6703a676-config-volume\") pod \"collect-profiles-29404425-jjrc2\" (UID: \"7b47c99d-5acb-4846-9bdb-653c6703a676\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404425-jjrc2" Nov 27 17:45:00 crc kubenswrapper[4954]: I1127 17:45:00.437411 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpzgb\" (UniqueName: \"kubernetes.io/projected/7b47c99d-5acb-4846-9bdb-653c6703a676-kube-api-access-dpzgb\") pod \"collect-profiles-29404425-jjrc2\" (UID: \"7b47c99d-5acb-4846-9bdb-653c6703a676\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404425-jjrc2" Nov 27 17:45:00 crc kubenswrapper[4954]: I1127 17:45:00.437573 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7b47c99d-5acb-4846-9bdb-653c6703a676-secret-volume\") pod \"collect-profiles-29404425-jjrc2\" (UID: \"7b47c99d-5acb-4846-9bdb-653c6703a676\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404425-jjrc2" Nov 27 17:45:00 crc kubenswrapper[4954]: I1127 17:45:00.539490 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpzgb\" (UniqueName: \"kubernetes.io/projected/7b47c99d-5acb-4846-9bdb-653c6703a676-kube-api-access-dpzgb\") pod \"collect-profiles-29404425-jjrc2\" (UID: \"7b47c99d-5acb-4846-9bdb-653c6703a676\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404425-jjrc2" Nov 27 17:45:00 crc kubenswrapper[4954]: I1127 17:45:00.539599 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7b47c99d-5acb-4846-9bdb-653c6703a676-secret-volume\") pod \"collect-profiles-29404425-jjrc2\" (UID: \"7b47c99d-5acb-4846-9bdb-653c6703a676\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404425-jjrc2" Nov 27 17:45:00 crc kubenswrapper[4954]: I1127 17:45:00.539703 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b47c99d-5acb-4846-9bdb-653c6703a676-config-volume\") pod \"collect-profiles-29404425-jjrc2\" (UID: \"7b47c99d-5acb-4846-9bdb-653c6703a676\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404425-jjrc2"
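
The collect-profiles-29404425-jjrc2 pod being assembled above belongs to OLM's collect-profiles CronJob. The numeric suffix of a CronJob-spawned Job is its scheduled time expressed in minutes since the Unix epoch, which lines up with the 17:45:00 "SyncLoop ADD" that opened this burst; a quick check (the suffix variable is just for illustration):

```python
from datetime import datetime, timezone

suffix = 29404425  # from collect-profiles-29404425-jjrc2
print(datetime.fromtimestamp(suffix * 60, tz=timezone.utc))
# -> 2025-11-27 17:45:00+00:00, matching the SyncLoop ADD timestamp above
```
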
\"kubernetes.io/configmap/7b47c99d-5acb-4846-9bdb-653c6703a676-config-volume\") pod \"collect-profiles-29404425-jjrc2\" (UID: \"7b47c99d-5acb-4846-9bdb-653c6703a676\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404425-jjrc2" Nov 27 17:45:00 crc kubenswrapper[4954]: I1127 17:45:00.546178 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7b47c99d-5acb-4846-9bdb-653c6703a676-secret-volume\") pod \"collect-profiles-29404425-jjrc2\" (UID: \"7b47c99d-5acb-4846-9bdb-653c6703a676\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404425-jjrc2" Nov 27 17:45:00 crc kubenswrapper[4954]: I1127 17:45:00.557180 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpzgb\" (UniqueName: \"kubernetes.io/projected/7b47c99d-5acb-4846-9bdb-653c6703a676-kube-api-access-dpzgb\") pod \"collect-profiles-29404425-jjrc2\" (UID: \"7b47c99d-5acb-4846-9bdb-653c6703a676\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404425-jjrc2" Nov 27 17:45:00 crc kubenswrapper[4954]: I1127 17:45:00.602485 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404425-jjrc2" Nov 27 17:45:01 crc kubenswrapper[4954]: I1127 17:45:01.097098 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404425-jjrc2"] Nov 27 17:45:01 crc kubenswrapper[4954]: I1127 17:45:01.460169 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404425-jjrc2" event={"ID":"7b47c99d-5acb-4846-9bdb-653c6703a676","Type":"ContainerStarted","Data":"ad5609ac63533b189d4f08f215b0f402a42e7a23c36d2ab4b5c32845fcaf8725"} Nov 27 17:45:01 crc kubenswrapper[4954]: I1127 17:45:01.461288 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404425-jjrc2" event={"ID":"7b47c99d-5acb-4846-9bdb-653c6703a676","Type":"ContainerStarted","Data":"d83326c505acb918108c7641abbadce6c58f8f9d376e94fc6dba6dbfef2a2b73"} Nov 27 17:45:01 crc kubenswrapper[4954]: I1127 17:45:01.484885 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29404425-jjrc2" podStartSLOduration=1.484864157 podStartE2EDuration="1.484864157s" podCreationTimestamp="2025-11-27 17:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 17:45:01.47715986 +0000 UTC m=+4013.494600160" watchObservedRunningTime="2025-11-27 17:45:01.484864157 +0000 UTC m=+4013.502304457" Nov 27 17:45:02 crc kubenswrapper[4954]: I1127 17:45:02.472742 4954 generic.go:334] "Generic (PLEG): container finished" podID="7b47c99d-5acb-4846-9bdb-653c6703a676" containerID="ad5609ac63533b189d4f08f215b0f402a42e7a23c36d2ab4b5c32845fcaf8725" exitCode=0 Nov 27 17:45:02 crc kubenswrapper[4954]: I1127 17:45:02.472781 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404425-jjrc2" event={"ID":"7b47c99d-5acb-4846-9bdb-653c6703a676","Type":"ContainerDied","Data":"ad5609ac63533b189d4f08f215b0f402a42e7a23c36d2ab4b5c32845fcaf8725"} Nov 27 17:45:03 crc kubenswrapper[4954]: I1127 17:45:03.661901 4954 scope.go:117] "RemoveContainer" 
containerID="72d14939d267f3263e2283bed9a7423259124d321383864fb9b0e804d14acba8" Nov 27 17:45:03 crc kubenswrapper[4954]: E1127 17:45:03.662480 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:45:03 crc kubenswrapper[4954]: I1127 17:45:03.863652 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404425-jjrc2" Nov 27 17:45:04 crc kubenswrapper[4954]: I1127 17:45:04.016062 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7b47c99d-5acb-4846-9bdb-653c6703a676-secret-volume\") pod \"7b47c99d-5acb-4846-9bdb-653c6703a676\" (UID: \"7b47c99d-5acb-4846-9bdb-653c6703a676\") " Nov 27 17:45:04 crc kubenswrapper[4954]: I1127 17:45:04.016145 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b47c99d-5acb-4846-9bdb-653c6703a676-config-volume\") pod \"7b47c99d-5acb-4846-9bdb-653c6703a676\" (UID: \"7b47c99d-5acb-4846-9bdb-653c6703a676\") " Nov 27 17:45:04 crc kubenswrapper[4954]: I1127 17:45:04.016422 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpzgb\" (UniqueName: \"kubernetes.io/projected/7b47c99d-5acb-4846-9bdb-653c6703a676-kube-api-access-dpzgb\") pod \"7b47c99d-5acb-4846-9bdb-653c6703a676\" (UID: \"7b47c99d-5acb-4846-9bdb-653c6703a676\") " Nov 27 17:45:04 crc kubenswrapper[4954]: I1127 17:45:04.016788 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b47c99d-5acb-4846-9bdb-653c6703a676-config-volume" (OuterVolumeSpecName: "config-volume") pod "7b47c99d-5acb-4846-9bdb-653c6703a676" (UID: "7b47c99d-5acb-4846-9bdb-653c6703a676"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:45:04 crc kubenswrapper[4954]: I1127 17:45:04.017394 4954 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b47c99d-5acb-4846-9bdb-653c6703a676-config-volume\") on node \"crc\" DevicePath \"\"" Nov 27 17:45:04 crc kubenswrapper[4954]: I1127 17:45:04.021650 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b47c99d-5acb-4846-9bdb-653c6703a676-kube-api-access-dpzgb" (OuterVolumeSpecName: "kube-api-access-dpzgb") pod "7b47c99d-5acb-4846-9bdb-653c6703a676" (UID: "7b47c99d-5acb-4846-9bdb-653c6703a676"). InnerVolumeSpecName "kube-api-access-dpzgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:45:04 crc kubenswrapper[4954]: I1127 17:45:04.022560 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b47c99d-5acb-4846-9bdb-653c6703a676-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7b47c99d-5acb-4846-9bdb-653c6703a676" (UID: "7b47c99d-5acb-4846-9bdb-653c6703a676"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:45:04 crc kubenswrapper[4954]: I1127 17:45:04.119201 4954 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7b47c99d-5acb-4846-9bdb-653c6703a676-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 27 17:45:04 crc kubenswrapper[4954]: I1127 17:45:04.119242 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpzgb\" (UniqueName: \"kubernetes.io/projected/7b47c99d-5acb-4846-9bdb-653c6703a676-kube-api-access-dpzgb\") on node \"crc\" DevicePath \"\"" Nov 27 17:45:04 crc kubenswrapper[4954]: I1127 17:45:04.490976 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404425-jjrc2" event={"ID":"7b47c99d-5acb-4846-9bdb-653c6703a676","Type":"ContainerDied","Data":"d83326c505acb918108c7641abbadce6c58f8f9d376e94fc6dba6dbfef2a2b73"} Nov 27 17:45:04 crc kubenswrapper[4954]: I1127 17:45:04.491016 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d83326c505acb918108c7641abbadce6c58f8f9d376e94fc6dba6dbfef2a2b73" Nov 27 17:45:04 crc kubenswrapper[4954]: I1127 17:45:04.491040 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404425-jjrc2" Nov 27 17:45:04 crc kubenswrapper[4954]: I1127 17:45:04.555023 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404380-tn87q"] Nov 27 17:45:04 crc kubenswrapper[4954]: I1127 17:45:04.563125 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404380-tn87q"] Nov 27 17:45:04 crc kubenswrapper[4954]: I1127 17:45:04.671917 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35048694-881a-428c-b2c8-27e53edd4e5b" path="/var/lib/kubelet/pods/35048694-881a-428c-b2c8-27e53edd4e5b/volumes" Nov 27 17:45:07 crc kubenswrapper[4954]: I1127 17:45:07.506644 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-l6mbj_33df22a6-6a0f-445c-8a77-ad9cfb09d3d4/kube-rbac-proxy/0.log" Nov 27 17:45:07 crc kubenswrapper[4954]: I1127 17:45:07.556104 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-l6mbj_33df22a6-6a0f-445c-8a77-ad9cfb09d3d4/controller/0.log" Nov 27 17:45:07 crc kubenswrapper[4954]: I1127 17:45:07.707663 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9psn7_7cde0cd2-0d4c-411e-b857-8488be2e2f0f/cp-frr-files/0.log" Nov 27 17:45:07 crc kubenswrapper[4954]: I1127 17:45:07.821888 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9psn7_7cde0cd2-0d4c-411e-b857-8488be2e2f0f/cp-frr-files/0.log" Nov 27 17:45:07 crc kubenswrapper[4954]: I1127 17:45:07.841028 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9psn7_7cde0cd2-0d4c-411e-b857-8488be2e2f0f/cp-reloader/0.log" Nov 27 17:45:07 crc kubenswrapper[4954]: I1127 17:45:07.855265 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9psn7_7cde0cd2-0d4c-411e-b857-8488be2e2f0f/cp-metrics/0.log" Nov 27 17:45:07 crc kubenswrapper[4954]: I1127 17:45:07.930203 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9psn7_7cde0cd2-0d4c-411e-b857-8488be2e2f0f/cp-reloader/0.log" Nov 27 17:45:08 crc 
kubenswrapper[4954]: I1127 17:45:08.027494 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9psn7_7cde0cd2-0d4c-411e-b857-8488be2e2f0f/cp-frr-files/0.log" Nov 27 17:45:08 crc kubenswrapper[4954]: I1127 17:45:08.103966 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9psn7_7cde0cd2-0d4c-411e-b857-8488be2e2f0f/cp-reloader/0.log" Nov 27 17:45:08 crc kubenswrapper[4954]: I1127 17:45:08.104702 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9psn7_7cde0cd2-0d4c-411e-b857-8488be2e2f0f/cp-metrics/0.log" Nov 27 17:45:08 crc kubenswrapper[4954]: I1127 17:45:08.156406 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9psn7_7cde0cd2-0d4c-411e-b857-8488be2e2f0f/cp-metrics/0.log" Nov 27 17:45:08 crc kubenswrapper[4954]: I1127 17:45:08.307502 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9psn7_7cde0cd2-0d4c-411e-b857-8488be2e2f0f/cp-reloader/0.log" Nov 27 17:45:08 crc kubenswrapper[4954]: I1127 17:45:08.327412 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9psn7_7cde0cd2-0d4c-411e-b857-8488be2e2f0f/cp-metrics/0.log" Nov 27 17:45:08 crc kubenswrapper[4954]: I1127 17:45:08.336032 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9psn7_7cde0cd2-0d4c-411e-b857-8488be2e2f0f/controller/0.log" Nov 27 17:45:08 crc kubenswrapper[4954]: I1127 17:45:08.359935 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9psn7_7cde0cd2-0d4c-411e-b857-8488be2e2f0f/cp-frr-files/0.log" Nov 27 17:45:08 crc kubenswrapper[4954]: I1127 17:45:08.497466 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9psn7_7cde0cd2-0d4c-411e-b857-8488be2e2f0f/frr-metrics/0.log" Nov 27 17:45:08 crc kubenswrapper[4954]: I1127 17:45:08.525197 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9psn7_7cde0cd2-0d4c-411e-b857-8488be2e2f0f/kube-rbac-proxy/0.log" Nov 27 17:45:08 crc kubenswrapper[4954]: I1127 17:45:08.530087 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9psn7_7cde0cd2-0d4c-411e-b857-8488be2e2f0f/kube-rbac-proxy-frr/0.log" Nov 27 17:45:08 crc kubenswrapper[4954]: I1127 17:45:08.712208 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-f6v24_6792e473-15c3-405b-8c32-007e421b40c6/frr-k8s-webhook-server/0.log" Nov 27 17:45:08 crc kubenswrapper[4954]: I1127 17:45:08.713637 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9psn7_7cde0cd2-0d4c-411e-b857-8488be2e2f0f/reloader/0.log" Nov 27 17:45:08 crc kubenswrapper[4954]: I1127 17:45:08.941515 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5d9ff7464f-4f4jv_52ee24fe-968b-440d-8884-5772e253c8b4/manager/0.log" Nov 27 17:45:09 crc kubenswrapper[4954]: I1127 17:45:09.111154 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6c64f4dc9b-sq87v_0b1812ac-de14-42bf-acbf-d6a68650bb93/webhook-server/0.log" Nov 27 17:45:09 crc kubenswrapper[4954]: I1127 17:45:09.180260 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-ql5zn_008cad91-d45f-4942-9e82-239acf3fb8ed/kube-rbac-proxy/0.log" Nov 27 17:45:09 crc kubenswrapper[4954]: I1127 17:45:09.801366 
4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-ql5zn_008cad91-d45f-4942-9e82-239acf3fb8ed/speaker/0.log" Nov 27 17:45:09 crc kubenswrapper[4954]: I1127 17:45:09.945872 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9psn7_7cde0cd2-0d4c-411e-b857-8488be2e2f0f/frr/0.log" Nov 27 17:45:17 crc kubenswrapper[4954]: I1127 17:45:17.662851 4954 scope.go:117] "RemoveContainer" containerID="72d14939d267f3263e2283bed9a7423259124d321383864fb9b0e804d14acba8" Nov 27 17:45:17 crc kubenswrapper[4954]: E1127 17:45:17.663626 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:45:20 crc kubenswrapper[4954]: I1127 17:45:20.702166 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwh9gp_bc86b0e3-7ca2-40a1-b559-e74733db90f0/util/0.log" Nov 27 17:45:20 crc kubenswrapper[4954]: I1127 17:45:20.910969 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwh9gp_bc86b0e3-7ca2-40a1-b559-e74733db90f0/util/0.log" Nov 27 17:45:20 crc kubenswrapper[4954]: I1127 17:45:20.912831 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwh9gp_bc86b0e3-7ca2-40a1-b559-e74733db90f0/pull/0.log" Nov 27 17:45:20 crc kubenswrapper[4954]: I1127 17:45:20.921256 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwh9gp_bc86b0e3-7ca2-40a1-b559-e74733db90f0/pull/0.log" Nov 27 17:45:21 crc kubenswrapper[4954]: I1127 17:45:21.104155 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwh9gp_bc86b0e3-7ca2-40a1-b559-e74733db90f0/pull/0.log" Nov 27 17:45:21 crc kubenswrapper[4954]: I1127 17:45:21.120664 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwh9gp_bc86b0e3-7ca2-40a1-b559-e74733db90f0/extract/0.log" Nov 27 17:45:21 crc kubenswrapper[4954]: I1127 17:45:21.162569 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwh9gp_bc86b0e3-7ca2-40a1-b559-e74733db90f0/util/0.log" Nov 27 17:45:21 crc kubenswrapper[4954]: I1127 17:45:21.289944 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qtpvj_76711bd9-a588-4492-9d26-0d80376444db/util/0.log" Nov 27 17:45:21 crc kubenswrapper[4954]: I1127 17:45:21.423121 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qtpvj_76711bd9-a588-4492-9d26-0d80376444db/util/0.log" Nov 27 17:45:21 crc kubenswrapper[4954]: I1127 17:45:21.437557 4954 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qtpvj_76711bd9-a588-4492-9d26-0d80376444db/pull/0.log" Nov 27 17:45:21 crc kubenswrapper[4954]: I1127 17:45:21.437810 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qtpvj_76711bd9-a588-4492-9d26-0d80376444db/pull/0.log" Nov 27 17:45:21 crc kubenswrapper[4954]: I1127 17:45:21.581304 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qtpvj_76711bd9-a588-4492-9d26-0d80376444db/util/0.log" Nov 27 17:45:21 crc kubenswrapper[4954]: I1127 17:45:21.616460 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qtpvj_76711bd9-a588-4492-9d26-0d80376444db/extract/0.log" Nov 27 17:45:21 crc kubenswrapper[4954]: I1127 17:45:21.623787 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qtpvj_76711bd9-a588-4492-9d26-0d80376444db/pull/0.log" Nov 27 17:45:21 crc kubenswrapper[4954]: I1127 17:45:21.766910 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9vxhf_a06676d3-037c-4529-926c-0624a5e647ee/extract-utilities/0.log" Nov 27 17:45:21 crc kubenswrapper[4954]: I1127 17:45:21.936258 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9vxhf_a06676d3-037c-4529-926c-0624a5e647ee/extract-utilities/0.log" Nov 27 17:45:21 crc kubenswrapper[4954]: I1127 17:45:21.957497 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9vxhf_a06676d3-037c-4529-926c-0624a5e647ee/extract-content/0.log" Nov 27 17:45:21 crc kubenswrapper[4954]: I1127 17:45:21.991682 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9vxhf_a06676d3-037c-4529-926c-0624a5e647ee/extract-content/0.log" Nov 27 17:45:22 crc kubenswrapper[4954]: I1127 17:45:22.127667 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9vxhf_a06676d3-037c-4529-926c-0624a5e647ee/extract-utilities/0.log" Nov 27 17:45:22 crc kubenswrapper[4954]: I1127 17:45:22.148830 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9vxhf_a06676d3-037c-4529-926c-0624a5e647ee/extract-content/0.log" Nov 27 17:45:22 crc kubenswrapper[4954]: I1127 17:45:22.305824 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xj2hb_56ec19b6-189a-4163-ae87-1c95809ad7d3/extract-utilities/0.log" Nov 27 17:45:22 crc kubenswrapper[4954]: I1127 17:45:22.496649 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xj2hb_56ec19b6-189a-4163-ae87-1c95809ad7d3/extract-content/0.log" Nov 27 17:45:22 crc kubenswrapper[4954]: I1127 17:45:22.504834 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xj2hb_56ec19b6-189a-4163-ae87-1c95809ad7d3/extract-utilities/0.log" Nov 27 17:45:22 crc kubenswrapper[4954]: I1127 17:45:22.532388 4954 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-xj2hb_56ec19b6-189a-4163-ae87-1c95809ad7d3/extract-content/0.log" Nov 27 17:45:22 crc kubenswrapper[4954]: I1127 17:45:22.658991 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9vxhf_a06676d3-037c-4529-926c-0624a5e647ee/registry-server/0.log" Nov 27 17:45:22 crc kubenswrapper[4954]: I1127 17:45:22.732799 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xj2hb_56ec19b6-189a-4163-ae87-1c95809ad7d3/extract-content/0.log" Nov 27 17:45:22 crc kubenswrapper[4954]: I1127 17:45:22.755130 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xj2hb_56ec19b6-189a-4163-ae87-1c95809ad7d3/extract-utilities/0.log" Nov 27 17:45:22 crc kubenswrapper[4954]: I1127 17:45:22.949341 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-txfqr_8215930a-947b-45d7-9c4e-9d867d3f234e/marketplace-operator/0.log" Nov 27 17:45:23 crc kubenswrapper[4954]: I1127 17:45:23.030969 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r5pxl_84522a03-6ce9-4c9d-b5ee-786ec39f6555/extract-utilities/0.log" Nov 27 17:45:23 crc kubenswrapper[4954]: I1127 17:45:23.311600 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r5pxl_84522a03-6ce9-4c9d-b5ee-786ec39f6555/extract-utilities/0.log" Nov 27 17:45:23 crc kubenswrapper[4954]: I1127 17:45:23.354260 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r5pxl_84522a03-6ce9-4c9d-b5ee-786ec39f6555/extract-content/0.log" Nov 27 17:45:23 crc kubenswrapper[4954]: I1127 17:45:23.361474 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xj2hb_56ec19b6-189a-4163-ae87-1c95809ad7d3/registry-server/0.log" Nov 27 17:45:23 crc kubenswrapper[4954]: I1127 17:45:23.375914 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r5pxl_84522a03-6ce9-4c9d-b5ee-786ec39f6555/extract-content/0.log" Nov 27 17:45:23 crc kubenswrapper[4954]: I1127 17:45:23.505853 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r5pxl_84522a03-6ce9-4c9d-b5ee-786ec39f6555/extract-utilities/0.log" Nov 27 17:45:23 crc kubenswrapper[4954]: I1127 17:45:23.526302 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r5pxl_84522a03-6ce9-4c9d-b5ee-786ec39f6555/extract-content/0.log" Nov 27 17:45:23 crc kubenswrapper[4954]: I1127 17:45:23.674490 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r5pxl_84522a03-6ce9-4c9d-b5ee-786ec39f6555/registry-server/0.log" Nov 27 17:45:23 crc kubenswrapper[4954]: I1127 17:45:23.716718 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2t8bp_7b582b55-6fc1-4a38-a30e-b192d35acdcc/extract-utilities/0.log" Nov 27 17:45:24 crc kubenswrapper[4954]: I1127 17:45:24.439448 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2t8bp_7b582b55-6fc1-4a38-a30e-b192d35acdcc/extract-content/0.log" Nov 27 17:45:24 crc kubenswrapper[4954]: I1127 17:45:24.441297 4954 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-2t8bp_7b582b55-6fc1-4a38-a30e-b192d35acdcc/extract-content/0.log" Nov 27 17:45:24 crc kubenswrapper[4954]: I1127 17:45:24.454867 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2t8bp_7b582b55-6fc1-4a38-a30e-b192d35acdcc/extract-utilities/0.log" Nov 27 17:45:24 crc kubenswrapper[4954]: I1127 17:45:24.596148 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2t8bp_7b582b55-6fc1-4a38-a30e-b192d35acdcc/extract-utilities/0.log" Nov 27 17:45:24 crc kubenswrapper[4954]: I1127 17:45:24.629941 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2t8bp_7b582b55-6fc1-4a38-a30e-b192d35acdcc/extract-content/0.log" Nov 27 17:45:25 crc kubenswrapper[4954]: I1127 17:45:25.085732 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2t8bp_7b582b55-6fc1-4a38-a30e-b192d35acdcc/registry-server/0.log" Nov 27 17:45:30 crc kubenswrapper[4954]: I1127 17:45:30.664473 4954 scope.go:117] "RemoveContainer" containerID="72d14939d267f3263e2283bed9a7423259124d321383864fb9b0e804d14acba8" Nov 27 17:45:30 crc kubenswrapper[4954]: E1127 17:45:30.665340 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:45:42 crc kubenswrapper[4954]: I1127 17:45:42.086845 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-zlt7m" podUID="5884eab6-e3c0-45de-b93d-73392533b780" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.74:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 27 17:45:45 crc kubenswrapper[4954]: I1127 17:45:45.662871 4954 scope.go:117] "RemoveContainer" containerID="72d14939d267f3263e2283bed9a7423259124d321383864fb9b0e804d14acba8" Nov 27 17:45:45 crc kubenswrapper[4954]: E1127 17:45:45.663781 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:45:47 crc kubenswrapper[4954]: I1127 17:45:47.563035 4954 scope.go:117] "RemoveContainer" containerID="83afc27e906573031be7f63761aad1869f0937bdd327f7633071b47865e2aab7" Nov 27 17:45:58 crc kubenswrapper[4954]: I1127 17:45:58.670105 4954 scope.go:117] "RemoveContainer" containerID="72d14939d267f3263e2283bed9a7423259124d321383864fb9b0e804d14acba8" Nov 27 17:45:58 crc kubenswrapper[4954]: E1127 17:45:58.670842 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:46:10 crc kubenswrapper[4954]: I1127 17:46:10.662196 4954 scope.go:117] "RemoveContainer" containerID="72d14939d267f3263e2283bed9a7423259124d321383864fb9b0e804d14acba8" Nov 27 17:46:10 crc kubenswrapper[4954]: E1127 17:46:10.663119 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:46:24 crc kubenswrapper[4954]: I1127 17:46:24.662549 4954 scope.go:117] "RemoveContainer" containerID="72d14939d267f3263e2283bed9a7423259124d321383864fb9b0e804d14acba8" Nov 27 17:46:24 crc kubenswrapper[4954]: E1127 17:46:24.663369 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:46:39 crc kubenswrapper[4954]: I1127 17:46:39.663192 4954 scope.go:117] "RemoveContainer" containerID="72d14939d267f3263e2283bed9a7423259124d321383864fb9b0e804d14acba8" Nov 27 17:46:39 crc kubenswrapper[4954]: E1127 17:46:39.663975 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:46:52 crc kubenswrapper[4954]: I1127 17:46:52.662679 4954 scope.go:117] "RemoveContainer" containerID="72d14939d267f3263e2283bed9a7423259124d321383864fb9b0e804d14acba8" Nov 27 17:46:52 crc kubenswrapper[4954]: E1127 17:46:52.663365 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:47:03 crc kubenswrapper[4954]: I1127 17:47:03.586702 4954 generic.go:334] "Generic (PLEG): container finished" podID="6f492292-1e0a-4fff-b47a-80d1da52652b" containerID="f61c46064fd38a0e878c933dd53dc40ce869cadfd5bf8a6ec423ab05b9fdf5ff" exitCode=0 Nov 27 17:47:03 crc kubenswrapper[4954]: I1127 17:47:03.586786 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pt6gj/must-gather-pgxzg" event={"ID":"6f492292-1e0a-4fff-b47a-80d1da52652b","Type":"ContainerDied","Data":"f61c46064fd38a0e878c933dd53dc40ce869cadfd5bf8a6ec423ab05b9fdf5ff"} Nov 27 17:47:03 crc kubenswrapper[4954]: I1127 17:47:03.588002 4954 scope.go:117] "RemoveContainer" 
containerID="f61c46064fd38a0e878c933dd53dc40ce869cadfd5bf8a6ec423ab05b9fdf5ff" Nov 27 17:47:03 crc kubenswrapper[4954]: I1127 17:47:03.833476 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pt6gj_must-gather-pgxzg_6f492292-1e0a-4fff-b47a-80d1da52652b/gather/0.log" Nov 27 17:47:06 crc kubenswrapper[4954]: I1127 17:47:06.662389 4954 scope.go:117] "RemoveContainer" containerID="72d14939d267f3263e2283bed9a7423259124d321383864fb9b0e804d14acba8" Nov 27 17:47:07 crc kubenswrapper[4954]: I1127 17:47:07.630458 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" event={"ID":"33a80574-7c60-4f19-985b-3ee313cb7bcd","Type":"ContainerStarted","Data":"a3c1b4c7a1565f160e5b62bddd964a7c2407cb7c03f79c69bde2c49cf255237d"} Nov 27 17:47:11 crc kubenswrapper[4954]: I1127 17:47:11.936418 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pt6gj/must-gather-pgxzg"] Nov 27 17:47:11 crc kubenswrapper[4954]: I1127 17:47:11.937935 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-pt6gj/must-gather-pgxzg" podUID="6f492292-1e0a-4fff-b47a-80d1da52652b" containerName="copy" containerID="cri-o://5deeeb6184b780f3ce1885448095aff6fd6a628db4daa5f6a1e9c0a487bc155f" gracePeriod=2 Nov 27 17:47:11 crc kubenswrapper[4954]: I1127 17:47:11.949992 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pt6gj/must-gather-pgxzg"] Nov 27 17:47:12 crc kubenswrapper[4954]: I1127 17:47:12.488386 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pt6gj_must-gather-pgxzg_6f492292-1e0a-4fff-b47a-80d1da52652b/copy/0.log" Nov 27 17:47:12 crc kubenswrapper[4954]: I1127 17:47:12.489204 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pt6gj/must-gather-pgxzg" Nov 27 17:47:12 crc kubenswrapper[4954]: I1127 17:47:12.591433 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6f492292-1e0a-4fff-b47a-80d1da52652b-must-gather-output\") pod \"6f492292-1e0a-4fff-b47a-80d1da52652b\" (UID: \"6f492292-1e0a-4fff-b47a-80d1da52652b\") " Nov 27 17:47:12 crc kubenswrapper[4954]: I1127 17:47:12.591779 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tlx4\" (UniqueName: \"kubernetes.io/projected/6f492292-1e0a-4fff-b47a-80d1da52652b-kube-api-access-6tlx4\") pod \"6f492292-1e0a-4fff-b47a-80d1da52652b\" (UID: \"6f492292-1e0a-4fff-b47a-80d1da52652b\") " Nov 27 17:47:12 crc kubenswrapper[4954]: I1127 17:47:12.599881 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f492292-1e0a-4fff-b47a-80d1da52652b-kube-api-access-6tlx4" (OuterVolumeSpecName: "kube-api-access-6tlx4") pod "6f492292-1e0a-4fff-b47a-80d1da52652b" (UID: "6f492292-1e0a-4fff-b47a-80d1da52652b"). InnerVolumeSpecName "kube-api-access-6tlx4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:47:12 crc kubenswrapper[4954]: I1127 17:47:12.686896 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pt6gj_must-gather-pgxzg_6f492292-1e0a-4fff-b47a-80d1da52652b/copy/0.log" Nov 27 17:47:12 crc kubenswrapper[4954]: I1127 17:47:12.687292 4954 generic.go:334] "Generic (PLEG): container finished" podID="6f492292-1e0a-4fff-b47a-80d1da52652b" containerID="5deeeb6184b780f3ce1885448095aff6fd6a628db4daa5f6a1e9c0a487bc155f" exitCode=143 Nov 27 17:47:12 crc kubenswrapper[4954]: I1127 17:47:12.687337 4954 scope.go:117] "RemoveContainer" containerID="5deeeb6184b780f3ce1885448095aff6fd6a628db4daa5f6a1e9c0a487bc155f" Nov 27 17:47:12 crc kubenswrapper[4954]: I1127 17:47:12.687457 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pt6gj/must-gather-pgxzg" Nov 27 17:47:12 crc kubenswrapper[4954]: I1127 17:47:12.695771 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tlx4\" (UniqueName: \"kubernetes.io/projected/6f492292-1e0a-4fff-b47a-80d1da52652b-kube-api-access-6tlx4\") on node \"crc\" DevicePath \"\"" Nov 27 17:47:12 crc kubenswrapper[4954]: I1127 17:47:12.706086 4954 scope.go:117] "RemoveContainer" containerID="f61c46064fd38a0e878c933dd53dc40ce869cadfd5bf8a6ec423ab05b9fdf5ff" Nov 27 17:47:12 crc kubenswrapper[4954]: I1127 17:47:12.752784 4954 scope.go:117] "RemoveContainer" containerID="5deeeb6184b780f3ce1885448095aff6fd6a628db4daa5f6a1e9c0a487bc155f" Nov 27 17:47:12 crc kubenswrapper[4954]: E1127 17:47:12.753250 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5deeeb6184b780f3ce1885448095aff6fd6a628db4daa5f6a1e9c0a487bc155f\": container with ID starting with 5deeeb6184b780f3ce1885448095aff6fd6a628db4daa5f6a1e9c0a487bc155f not found: ID does not exist" containerID="5deeeb6184b780f3ce1885448095aff6fd6a628db4daa5f6a1e9c0a487bc155f" Nov 27 17:47:12 crc kubenswrapper[4954]: I1127 17:47:12.753379 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5deeeb6184b780f3ce1885448095aff6fd6a628db4daa5f6a1e9c0a487bc155f"} err="failed to get container status \"5deeeb6184b780f3ce1885448095aff6fd6a628db4daa5f6a1e9c0a487bc155f\": rpc error: code = NotFound desc = could not find container \"5deeeb6184b780f3ce1885448095aff6fd6a628db4daa5f6a1e9c0a487bc155f\": container with ID starting with 5deeeb6184b780f3ce1885448095aff6fd6a628db4daa5f6a1e9c0a487bc155f not found: ID does not exist" Nov 27 17:47:12 crc kubenswrapper[4954]: I1127 17:47:12.753492 4954 scope.go:117] "RemoveContainer" containerID="f61c46064fd38a0e878c933dd53dc40ce869cadfd5bf8a6ec423ab05b9fdf5ff" Nov 27 17:47:12 crc kubenswrapper[4954]: E1127 17:47:12.753854 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f61c46064fd38a0e878c933dd53dc40ce869cadfd5bf8a6ec423ab05b9fdf5ff\": container with ID starting with f61c46064fd38a0e878c933dd53dc40ce869cadfd5bf8a6ec423ab05b9fdf5ff not found: ID does not exist" containerID="f61c46064fd38a0e878c933dd53dc40ce869cadfd5bf8a6ec423ab05b9fdf5ff" Nov 27 17:47:12 crc kubenswrapper[4954]: I1127 17:47:12.753890 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f61c46064fd38a0e878c933dd53dc40ce869cadfd5bf8a6ec423ab05b9fdf5ff"} err="failed to get container status 
\"f61c46064fd38a0e878c933dd53dc40ce869cadfd5bf8a6ec423ab05b9fdf5ff\": rpc error: code = NotFound desc = could not find container \"f61c46064fd38a0e878c933dd53dc40ce869cadfd5bf8a6ec423ab05b9fdf5ff\": container with ID starting with f61c46064fd38a0e878c933dd53dc40ce869cadfd5bf8a6ec423ab05b9fdf5ff not found: ID does not exist" Nov 27 17:47:12 crc kubenswrapper[4954]: I1127 17:47:12.766343 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f492292-1e0a-4fff-b47a-80d1da52652b-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "6f492292-1e0a-4fff-b47a-80d1da52652b" (UID: "6f492292-1e0a-4fff-b47a-80d1da52652b"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:47:12 crc kubenswrapper[4954]: I1127 17:47:12.797888 4954 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6f492292-1e0a-4fff-b47a-80d1da52652b-must-gather-output\") on node \"crc\" DevicePath \"\"" Nov 27 17:47:14 crc kubenswrapper[4954]: I1127 17:47:14.673220 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f492292-1e0a-4fff-b47a-80d1da52652b" path="/var/lib/kubelet/pods/6f492292-1e0a-4fff-b47a-80d1da52652b/volumes" Nov 27 17:49:23 crc kubenswrapper[4954]: I1127 17:49:23.689852 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 17:49:23 crc kubenswrapper[4954]: I1127 17:49:23.690791 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 17:49:33 crc kubenswrapper[4954]: I1127 17:49:33.852888 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9lfwr"] Nov 27 17:49:33 crc kubenswrapper[4954]: E1127 17:49:33.854094 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b47c99d-5acb-4846-9bdb-653c6703a676" containerName="collect-profiles" Nov 27 17:49:33 crc kubenswrapper[4954]: I1127 17:49:33.854114 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b47c99d-5acb-4846-9bdb-653c6703a676" containerName="collect-profiles" Nov 27 17:49:33 crc kubenswrapper[4954]: E1127 17:49:33.854145 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f492292-1e0a-4fff-b47a-80d1da52652b" containerName="gather" Nov 27 17:49:33 crc kubenswrapper[4954]: I1127 17:49:33.854153 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f492292-1e0a-4fff-b47a-80d1da52652b" containerName="gather" Nov 27 17:49:33 crc kubenswrapper[4954]: E1127 17:49:33.854173 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f492292-1e0a-4fff-b47a-80d1da52652b" containerName="copy" Nov 27 17:49:33 crc kubenswrapper[4954]: I1127 17:49:33.854185 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f492292-1e0a-4fff-b47a-80d1da52652b" containerName="copy" Nov 27 17:49:33 crc kubenswrapper[4954]: I1127 17:49:33.854545 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f492292-1e0a-4fff-b47a-80d1da52652b" 
containerName="copy" Nov 27 17:49:33 crc kubenswrapper[4954]: I1127 17:49:33.854597 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b47c99d-5acb-4846-9bdb-653c6703a676" containerName="collect-profiles" Nov 27 17:49:33 crc kubenswrapper[4954]: I1127 17:49:33.854615 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f492292-1e0a-4fff-b47a-80d1da52652b" containerName="gather" Nov 27 17:49:33 crc kubenswrapper[4954]: I1127 17:49:33.856433 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9lfwr" Nov 27 17:49:33 crc kubenswrapper[4954]: I1127 17:49:33.869029 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9lfwr"] Nov 27 17:49:33 crc kubenswrapper[4954]: I1127 17:49:33.965696 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7cx8\" (UniqueName: \"kubernetes.io/projected/84be2651-e33f-46d6-b4e0-fb26c1223b4f-kube-api-access-z7cx8\") pod \"redhat-operators-9lfwr\" (UID: \"84be2651-e33f-46d6-b4e0-fb26c1223b4f\") " pod="openshift-marketplace/redhat-operators-9lfwr" Nov 27 17:49:33 crc kubenswrapper[4954]: I1127 17:49:33.966135 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84be2651-e33f-46d6-b4e0-fb26c1223b4f-utilities\") pod \"redhat-operators-9lfwr\" (UID: \"84be2651-e33f-46d6-b4e0-fb26c1223b4f\") " pod="openshift-marketplace/redhat-operators-9lfwr" Nov 27 17:49:33 crc kubenswrapper[4954]: I1127 17:49:33.966230 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84be2651-e33f-46d6-b4e0-fb26c1223b4f-catalog-content\") pod \"redhat-operators-9lfwr\" (UID: \"84be2651-e33f-46d6-b4e0-fb26c1223b4f\") " pod="openshift-marketplace/redhat-operators-9lfwr" Nov 27 17:49:34 crc kubenswrapper[4954]: I1127 17:49:34.045828 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qb25h"] Nov 27 17:49:34 crc kubenswrapper[4954]: I1127 17:49:34.048340 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qb25h" Nov 27 17:49:34 crc kubenswrapper[4954]: I1127 17:49:34.055270 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qb25h"] Nov 27 17:49:34 crc kubenswrapper[4954]: I1127 17:49:34.067778 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84be2651-e33f-46d6-b4e0-fb26c1223b4f-catalog-content\") pod \"redhat-operators-9lfwr\" (UID: \"84be2651-e33f-46d6-b4e0-fb26c1223b4f\") " pod="openshift-marketplace/redhat-operators-9lfwr" Nov 27 17:49:34 crc kubenswrapper[4954]: I1127 17:49:34.068001 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7cx8\" (UniqueName: \"kubernetes.io/projected/84be2651-e33f-46d6-b4e0-fb26c1223b4f-kube-api-access-z7cx8\") pod \"redhat-operators-9lfwr\" (UID: \"84be2651-e33f-46d6-b4e0-fb26c1223b4f\") " pod="openshift-marketplace/redhat-operators-9lfwr" Nov 27 17:49:34 crc kubenswrapper[4954]: I1127 17:49:34.068035 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84be2651-e33f-46d6-b4e0-fb26c1223b4f-utilities\") pod \"redhat-operators-9lfwr\" (UID: \"84be2651-e33f-46d6-b4e0-fb26c1223b4f\") " pod="openshift-marketplace/redhat-operators-9lfwr" Nov 27 17:49:34 crc kubenswrapper[4954]: I1127 17:49:34.068850 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84be2651-e33f-46d6-b4e0-fb26c1223b4f-catalog-content\") pod \"redhat-operators-9lfwr\" (UID: \"84be2651-e33f-46d6-b4e0-fb26c1223b4f\") " pod="openshift-marketplace/redhat-operators-9lfwr" Nov 27 17:49:34 crc kubenswrapper[4954]: I1127 17:49:34.069570 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84be2651-e33f-46d6-b4e0-fb26c1223b4f-utilities\") pod \"redhat-operators-9lfwr\" (UID: \"84be2651-e33f-46d6-b4e0-fb26c1223b4f\") " pod="openshift-marketplace/redhat-operators-9lfwr" Nov 27 17:49:34 crc kubenswrapper[4954]: I1127 17:49:34.090098 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7cx8\" (UniqueName: \"kubernetes.io/projected/84be2651-e33f-46d6-b4e0-fb26c1223b4f-kube-api-access-z7cx8\") pod \"redhat-operators-9lfwr\" (UID: \"84be2651-e33f-46d6-b4e0-fb26c1223b4f\") " pod="openshift-marketplace/redhat-operators-9lfwr" Nov 27 17:49:34 crc kubenswrapper[4954]: I1127 17:49:34.169414 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60-catalog-content\") pod \"redhat-marketplace-qb25h\" (UID: \"5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60\") " pod="openshift-marketplace/redhat-marketplace-qb25h" Nov 27 17:49:34 crc kubenswrapper[4954]: I1127 17:49:34.169518 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skwf8\" (UniqueName: \"kubernetes.io/projected/5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60-kube-api-access-skwf8\") pod \"redhat-marketplace-qb25h\" (UID: \"5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60\") " pod="openshift-marketplace/redhat-marketplace-qb25h" Nov 27 17:49:34 crc kubenswrapper[4954]: I1127 17:49:34.169575 4954 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60-utilities\") pod \"redhat-marketplace-qb25h\" (UID: \"5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60\") " pod="openshift-marketplace/redhat-marketplace-qb25h" Nov 27 17:49:34 crc kubenswrapper[4954]: I1127 17:49:34.218867 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9lfwr" Nov 27 17:49:34 crc kubenswrapper[4954]: I1127 17:49:34.271056 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skwf8\" (UniqueName: \"kubernetes.io/projected/5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60-kube-api-access-skwf8\") pod \"redhat-marketplace-qb25h\" (UID: \"5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60\") " pod="openshift-marketplace/redhat-marketplace-qb25h" Nov 27 17:49:34 crc kubenswrapper[4954]: I1127 17:49:34.271128 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60-utilities\") pod \"redhat-marketplace-qb25h\" (UID: \"5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60\") " pod="openshift-marketplace/redhat-marketplace-qb25h" Nov 27 17:49:34 crc kubenswrapper[4954]: I1127 17:49:34.271718 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60-utilities\") pod \"redhat-marketplace-qb25h\" (UID: \"5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60\") " pod="openshift-marketplace/redhat-marketplace-qb25h" Nov 27 17:49:34 crc kubenswrapper[4954]: I1127 17:49:34.271902 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60-catalog-content\") pod \"redhat-marketplace-qb25h\" (UID: \"5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60\") " pod="openshift-marketplace/redhat-marketplace-qb25h" Nov 27 17:49:34 crc kubenswrapper[4954]: I1127 17:49:34.272220 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60-catalog-content\") pod \"redhat-marketplace-qb25h\" (UID: \"5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60\") " pod="openshift-marketplace/redhat-marketplace-qb25h" Nov 27 17:49:34 crc kubenswrapper[4954]: I1127 17:49:34.294706 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skwf8\" (UniqueName: \"kubernetes.io/projected/5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60-kube-api-access-skwf8\") pod \"redhat-marketplace-qb25h\" (UID: \"5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60\") " pod="openshift-marketplace/redhat-marketplace-qb25h" Nov 27 17:49:34 crc kubenswrapper[4954]: I1127 17:49:34.382627 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qb25h" Nov 27 17:49:34 crc kubenswrapper[4954]: I1127 17:49:34.749529 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9lfwr"] Nov 27 17:49:34 crc kubenswrapper[4954]: W1127 17:49:34.942843 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e9cc38a_bd9a_4f94_b08f_ccbb8a04ad60.slice/crio-84b25668a582acb66f88cf47e7a47bf75e054568cc6f3b51fa8208a3f5f31bbe WatchSource:0}: Error finding container 84b25668a582acb66f88cf47e7a47bf75e054568cc6f3b51fa8208a3f5f31bbe: Status 404 returned error can't find the container with id 84b25668a582acb66f88cf47e7a47bf75e054568cc6f3b51fa8208a3f5f31bbe Nov 27 17:49:34 crc kubenswrapper[4954]: I1127 17:49:34.943036 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qb25h"] Nov 27 17:49:34 crc kubenswrapper[4954]: I1127 17:49:34.960213 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9lfwr" event={"ID":"84be2651-e33f-46d6-b4e0-fb26c1223b4f","Type":"ContainerStarted","Data":"9135c9ba0062a27b312d704bfe91e6338919347bffdda6af24837ba7e918f273"} Nov 27 17:49:34 crc kubenswrapper[4954]: I1127 17:49:34.960270 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9lfwr" event={"ID":"84be2651-e33f-46d6-b4e0-fb26c1223b4f","Type":"ContainerStarted","Data":"a25e8f60394792e885e2ea2e7a48dfcd4b54bcb2f97ab4532d8d92b9aec63460"} Nov 27 17:49:34 crc kubenswrapper[4954]: I1127 17:49:34.964614 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qb25h" event={"ID":"5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60","Type":"ContainerStarted","Data":"84b25668a582acb66f88cf47e7a47bf75e054568cc6f3b51fa8208a3f5f31bbe"} Nov 27 17:49:35 crc kubenswrapper[4954]: I1127 17:49:35.975825 4954 generic.go:334] "Generic (PLEG): container finished" podID="5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60" containerID="7b525d5f60d8e18981a1cc7cf833b0def9ea5fcd7594fd525b092c43a67a6bb8" exitCode=0 Nov 27 17:49:35 crc kubenswrapper[4954]: I1127 17:49:35.975902 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qb25h" event={"ID":"5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60","Type":"ContainerDied","Data":"7b525d5f60d8e18981a1cc7cf833b0def9ea5fcd7594fd525b092c43a67a6bb8"} Nov 27 17:49:35 crc kubenswrapper[4954]: I1127 17:49:35.977338 4954 generic.go:334] "Generic (PLEG): container finished" podID="84be2651-e33f-46d6-b4e0-fb26c1223b4f" containerID="9135c9ba0062a27b312d704bfe91e6338919347bffdda6af24837ba7e918f273" exitCode=0 Nov 27 17:49:35 crc kubenswrapper[4954]: I1127 17:49:35.977367 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9lfwr" event={"ID":"84be2651-e33f-46d6-b4e0-fb26c1223b4f","Type":"ContainerDied","Data":"9135c9ba0062a27b312d704bfe91e6338919347bffdda6af24837ba7e918f273"} Nov 27 17:49:35 crc kubenswrapper[4954]: I1127 17:49:35.978684 4954 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 27 17:49:36 crc kubenswrapper[4954]: I1127 17:49:36.988199 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qb25h" 
event={"ID":"5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60","Type":"ContainerStarted","Data":"f6fa708bc5eaaa634d87b6c16268e051d4e93cba5613a1da05eff86f9ed96d1f"} Nov 27 17:49:38 crc kubenswrapper[4954]: I1127 17:49:38.000673 4954 generic.go:334] "Generic (PLEG): container finished" podID="5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60" containerID="f6fa708bc5eaaa634d87b6c16268e051d4e93cba5613a1da05eff86f9ed96d1f" exitCode=0 Nov 27 17:49:38 crc kubenswrapper[4954]: I1127 17:49:38.000776 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qb25h" event={"ID":"5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60","Type":"ContainerDied","Data":"f6fa708bc5eaaa634d87b6c16268e051d4e93cba5613a1da05eff86f9ed96d1f"} Nov 27 17:49:38 crc kubenswrapper[4954]: I1127 17:49:38.004360 4954 generic.go:334] "Generic (PLEG): container finished" podID="84be2651-e33f-46d6-b4e0-fb26c1223b4f" containerID="5b22eb51a3ab124442b3069a9d68d1ac1bb05c6e36c09adca25b9490fc19243f" exitCode=0 Nov 27 17:49:38 crc kubenswrapper[4954]: I1127 17:49:38.004396 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9lfwr" event={"ID":"84be2651-e33f-46d6-b4e0-fb26c1223b4f","Type":"ContainerDied","Data":"5b22eb51a3ab124442b3069a9d68d1ac1bb05c6e36c09adca25b9490fc19243f"} Nov 27 17:49:39 crc kubenswrapper[4954]: I1127 17:49:39.018676 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qb25h" event={"ID":"5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60","Type":"ContainerStarted","Data":"b051dd32e977b3716abcdd073c448ad82df6d2942f59d54f0b71b55686986e61"} Nov 27 17:49:39 crc kubenswrapper[4954]: I1127 17:49:39.023868 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9lfwr" event={"ID":"84be2651-e33f-46d6-b4e0-fb26c1223b4f","Type":"ContainerStarted","Data":"1013ca5fa85175129a1262bc6ce90ef7c086dd8ca86f140fba7eebccdba09680"} Nov 27 17:49:39 crc kubenswrapper[4954]: I1127 17:49:39.069874 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qb25h" podStartSLOduration=2.579938703 podStartE2EDuration="5.069850488s" podCreationTimestamp="2025-11-27 17:49:34 +0000 UTC" firstStartedPulling="2025-11-27 17:49:35.978381047 +0000 UTC m=+4287.995821347" lastFinishedPulling="2025-11-27 17:49:38.468292832 +0000 UTC m=+4290.485733132" observedRunningTime="2025-11-27 17:49:39.061067814 +0000 UTC m=+4291.078508134" watchObservedRunningTime="2025-11-27 17:49:39.069850488 +0000 UTC m=+4291.087290788" Nov 27 17:49:39 crc kubenswrapper[4954]: I1127 17:49:39.104264 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9lfwr" podStartSLOduration=3.5442502449999997 podStartE2EDuration="6.104248533s" podCreationTimestamp="2025-11-27 17:49:33 +0000 UTC" firstStartedPulling="2025-11-27 17:49:35.981458861 +0000 UTC m=+4287.998899161" lastFinishedPulling="2025-11-27 17:49:38.541457159 +0000 UTC m=+4290.558897449" observedRunningTime="2025-11-27 17:49:39.102672675 +0000 UTC m=+4291.120112975" watchObservedRunningTime="2025-11-27 17:49:39.104248533 +0000 UTC m=+4291.121688833" Nov 27 17:49:44 crc kubenswrapper[4954]: I1127 17:49:44.219689 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9lfwr" Nov 27 17:49:44 crc kubenswrapper[4954]: I1127 17:49:44.220065 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-9lfwr" Nov 27 17:49:44 crc kubenswrapper[4954]: I1127 17:49:44.288798 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9lfwr" Nov 27 17:49:44 crc kubenswrapper[4954]: I1127 17:49:44.383765 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qb25h" Nov 27 17:49:44 crc kubenswrapper[4954]: I1127 17:49:44.383834 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qb25h" Nov 27 17:49:45 crc kubenswrapper[4954]: I1127 17:49:45.106505 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qb25h" Nov 27 17:49:45 crc kubenswrapper[4954]: I1127 17:49:45.129569 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9lfwr" Nov 27 17:49:45 crc kubenswrapper[4954]: I1127 17:49:45.158332 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qb25h" Nov 27 17:49:46 crc kubenswrapper[4954]: I1127 17:49:46.042953 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9lfwr"] Nov 27 17:49:47 crc kubenswrapper[4954]: I1127 17:49:47.096477 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9lfwr" podUID="84be2651-e33f-46d6-b4e0-fb26c1223b4f" containerName="registry-server" containerID="cri-o://1013ca5fa85175129a1262bc6ce90ef7c086dd8ca86f140fba7eebccdba09680" gracePeriod=2 Nov 27 17:49:47 crc kubenswrapper[4954]: E1127 17:49:47.285128 4954 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84be2651_e33f_46d6_b4e0_fb26c1223b4f.slice/crio-1013ca5fa85175129a1262bc6ce90ef7c086dd8ca86f140fba7eebccdba09680.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84be2651_e33f_46d6_b4e0_fb26c1223b4f.slice/crio-conmon-1013ca5fa85175129a1262bc6ce90ef7c086dd8ca86f140fba7eebccdba09680.scope\": RecentStats: unable to find data in memory cache]" Nov 27 17:49:47 crc kubenswrapper[4954]: I1127 17:49:47.437875 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qb25h"] Nov 27 17:49:47 crc kubenswrapper[4954]: I1127 17:49:47.438438 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qb25h" podUID="5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60" containerName="registry-server" containerID="cri-o://b051dd32e977b3716abcdd073c448ad82df6d2942f59d54f0b71b55686986e61" gracePeriod=2 Nov 27 17:49:47 crc kubenswrapper[4954]: I1127 17:49:47.707800 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9lfwr" Nov 27 17:49:47 crc kubenswrapper[4954]: I1127 17:49:47.733295 4954 scope.go:117] "RemoveContainer" containerID="4b2f370bf0be065e984e748dec0e54df9dbaeb912cdbb02c536d764cce3fc0b2" Nov 27 17:49:47 crc kubenswrapper[4954]: I1127 17:49:47.836738 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qb25h" Nov 27 17:49:47 crc kubenswrapper[4954]: I1127 17:49:47.883917 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7cx8\" (UniqueName: \"kubernetes.io/projected/84be2651-e33f-46d6-b4e0-fb26c1223b4f-kube-api-access-z7cx8\") pod \"84be2651-e33f-46d6-b4e0-fb26c1223b4f\" (UID: \"84be2651-e33f-46d6-b4e0-fb26c1223b4f\") " Nov 27 17:49:47 crc kubenswrapper[4954]: I1127 17:49:47.884034 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84be2651-e33f-46d6-b4e0-fb26c1223b4f-catalog-content\") pod \"84be2651-e33f-46d6-b4e0-fb26c1223b4f\" (UID: \"84be2651-e33f-46d6-b4e0-fb26c1223b4f\") " Nov 27 17:49:47 crc kubenswrapper[4954]: I1127 17:49:47.884079 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84be2651-e33f-46d6-b4e0-fb26c1223b4f-utilities\") pod \"84be2651-e33f-46d6-b4e0-fb26c1223b4f\" (UID: \"84be2651-e33f-46d6-b4e0-fb26c1223b4f\") " Nov 27 17:49:47 crc kubenswrapper[4954]: I1127 17:49:47.885149 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84be2651-e33f-46d6-b4e0-fb26c1223b4f-utilities" (OuterVolumeSpecName: "utilities") pod "84be2651-e33f-46d6-b4e0-fb26c1223b4f" (UID: "84be2651-e33f-46d6-b4e0-fb26c1223b4f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:49:47 crc kubenswrapper[4954]: I1127 17:49:47.891902 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84be2651-e33f-46d6-b4e0-fb26c1223b4f-kube-api-access-z7cx8" (OuterVolumeSpecName: "kube-api-access-z7cx8") pod "84be2651-e33f-46d6-b4e0-fb26c1223b4f" (UID: "84be2651-e33f-46d6-b4e0-fb26c1223b4f"). InnerVolumeSpecName "kube-api-access-z7cx8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:49:47 crc kubenswrapper[4954]: I1127 17:49:47.986042 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60-utilities\") pod \"5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60\" (UID: \"5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60\") " Nov 27 17:49:47 crc kubenswrapper[4954]: I1127 17:49:47.986116 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skwf8\" (UniqueName: \"kubernetes.io/projected/5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60-kube-api-access-skwf8\") pod \"5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60\" (UID: \"5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60\") " Nov 27 17:49:47 crc kubenswrapper[4954]: I1127 17:49:47.986310 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60-catalog-content\") pod \"5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60\" (UID: \"5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60\") " Nov 27 17:49:47 crc kubenswrapper[4954]: I1127 17:49:47.986681 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7cx8\" (UniqueName: \"kubernetes.io/projected/84be2651-e33f-46d6-b4e0-fb26c1223b4f-kube-api-access-z7cx8\") on node \"crc\" DevicePath \"\"" Nov 27 17:49:47 crc kubenswrapper[4954]: I1127 17:49:47.986696 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84be2651-e33f-46d6-b4e0-fb26c1223b4f-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 17:49:47 crc kubenswrapper[4954]: I1127 17:49:47.987667 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60-utilities" (OuterVolumeSpecName: "utilities") pod "5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60" (UID: "5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:49:47 crc kubenswrapper[4954]: I1127 17:49:47.989241 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60-kube-api-access-skwf8" (OuterVolumeSpecName: "kube-api-access-skwf8") pod "5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60" (UID: "5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60"). InnerVolumeSpecName "kube-api-access-skwf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:49:48 crc kubenswrapper[4954]: I1127 17:49:48.004798 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60" (UID: "5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:49:48 crc kubenswrapper[4954]: I1127 17:49:48.089732 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skwf8\" (UniqueName: \"kubernetes.io/projected/5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60-kube-api-access-skwf8\") on node \"crc\" DevicePath \"\"" Nov 27 17:49:48 crc kubenswrapper[4954]: I1127 17:49:48.089788 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 17:49:48 crc kubenswrapper[4954]: I1127 17:49:48.089802 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 17:49:48 crc kubenswrapper[4954]: I1127 17:49:48.111500 4954 generic.go:334] "Generic (PLEG): container finished" podID="5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60" containerID="b051dd32e977b3716abcdd073c448ad82df6d2942f59d54f0b71b55686986e61" exitCode=0 Nov 27 17:49:48 crc kubenswrapper[4954]: I1127 17:49:48.111622 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qb25h" event={"ID":"5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60","Type":"ContainerDied","Data":"b051dd32e977b3716abcdd073c448ad82df6d2942f59d54f0b71b55686986e61"} Nov 27 17:49:48 crc kubenswrapper[4954]: I1127 17:49:48.111685 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qb25h" Nov 27 17:49:48 crc kubenswrapper[4954]: I1127 17:49:48.111713 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qb25h" event={"ID":"5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60","Type":"ContainerDied","Data":"84b25668a582acb66f88cf47e7a47bf75e054568cc6f3b51fa8208a3f5f31bbe"} Nov 27 17:49:48 crc kubenswrapper[4954]: I1127 17:49:48.111742 4954 scope.go:117] "RemoveContainer" containerID="b051dd32e977b3716abcdd073c448ad82df6d2942f59d54f0b71b55686986e61" Nov 27 17:49:48 crc kubenswrapper[4954]: I1127 17:49:48.120986 4954 generic.go:334] "Generic (PLEG): container finished" podID="84be2651-e33f-46d6-b4e0-fb26c1223b4f" containerID="1013ca5fa85175129a1262bc6ce90ef7c086dd8ca86f140fba7eebccdba09680" exitCode=0 Nov 27 17:49:48 crc kubenswrapper[4954]: I1127 17:49:48.121068 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9lfwr" Nov 27 17:49:48 crc kubenswrapper[4954]: I1127 17:49:48.121093 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9lfwr" event={"ID":"84be2651-e33f-46d6-b4e0-fb26c1223b4f","Type":"ContainerDied","Data":"1013ca5fa85175129a1262bc6ce90ef7c086dd8ca86f140fba7eebccdba09680"} Nov 27 17:49:48 crc kubenswrapper[4954]: I1127 17:49:48.121461 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9lfwr" event={"ID":"84be2651-e33f-46d6-b4e0-fb26c1223b4f","Type":"ContainerDied","Data":"a25e8f60394792e885e2ea2e7a48dfcd4b54bcb2f97ab4532d8d92b9aec63460"} Nov 27 17:49:48 crc kubenswrapper[4954]: I1127 17:49:48.143102 4954 scope.go:117] "RemoveContainer" containerID="f6fa708bc5eaaa634d87b6c16268e051d4e93cba5613a1da05eff86f9ed96d1f" Nov 27 17:49:48 crc kubenswrapper[4954]: I1127 17:49:48.161994 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qb25h"] Nov 27 17:49:48 crc kubenswrapper[4954]: I1127 17:49:48.171208 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qb25h"] Nov 27 17:49:48 crc kubenswrapper[4954]: I1127 17:49:48.172875 4954 scope.go:117] "RemoveContainer" containerID="7b525d5f60d8e18981a1cc7cf833b0def9ea5fcd7594fd525b092c43a67a6bb8" Nov 27 17:49:48 crc kubenswrapper[4954]: I1127 17:49:48.195105 4954 scope.go:117] "RemoveContainer" containerID="b051dd32e977b3716abcdd073c448ad82df6d2942f59d54f0b71b55686986e61" Nov 27 17:49:48 crc kubenswrapper[4954]: E1127 17:49:48.195785 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b051dd32e977b3716abcdd073c448ad82df6d2942f59d54f0b71b55686986e61\": container with ID starting with b051dd32e977b3716abcdd073c448ad82df6d2942f59d54f0b71b55686986e61 not found: ID does not exist" containerID="b051dd32e977b3716abcdd073c448ad82df6d2942f59d54f0b71b55686986e61" Nov 27 17:49:48 crc kubenswrapper[4954]: I1127 17:49:48.195892 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b051dd32e977b3716abcdd073c448ad82df6d2942f59d54f0b71b55686986e61"} err="failed to get container status \"b051dd32e977b3716abcdd073c448ad82df6d2942f59d54f0b71b55686986e61\": rpc error: code = NotFound desc = could not find container \"b051dd32e977b3716abcdd073c448ad82df6d2942f59d54f0b71b55686986e61\": container with ID starting with b051dd32e977b3716abcdd073c448ad82df6d2942f59d54f0b71b55686986e61 not found: ID does not exist" Nov 27 17:49:48 crc kubenswrapper[4954]: I1127 17:49:48.195929 4954 scope.go:117] "RemoveContainer" containerID="f6fa708bc5eaaa634d87b6c16268e051d4e93cba5613a1da05eff86f9ed96d1f" Nov 27 17:49:48 crc kubenswrapper[4954]: E1127 17:49:48.196621 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6fa708bc5eaaa634d87b6c16268e051d4e93cba5613a1da05eff86f9ed96d1f\": container with ID starting with f6fa708bc5eaaa634d87b6c16268e051d4e93cba5613a1da05eff86f9ed96d1f not found: ID does not exist" containerID="f6fa708bc5eaaa634d87b6c16268e051d4e93cba5613a1da05eff86f9ed96d1f" Nov 27 17:49:48 crc kubenswrapper[4954]: I1127 17:49:48.196710 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6fa708bc5eaaa634d87b6c16268e051d4e93cba5613a1da05eff86f9ed96d1f"} err="failed to get container 
status \"f6fa708bc5eaaa634d87b6c16268e051d4e93cba5613a1da05eff86f9ed96d1f\": rpc error: code = NotFound desc = could not find container \"f6fa708bc5eaaa634d87b6c16268e051d4e93cba5613a1da05eff86f9ed96d1f\": container with ID starting with f6fa708bc5eaaa634d87b6c16268e051d4e93cba5613a1da05eff86f9ed96d1f not found: ID does not exist" Nov 27 17:49:48 crc kubenswrapper[4954]: I1127 17:49:48.196763 4954 scope.go:117] "RemoveContainer" containerID="7b525d5f60d8e18981a1cc7cf833b0def9ea5fcd7594fd525b092c43a67a6bb8" Nov 27 17:49:48 crc kubenswrapper[4954]: E1127 17:49:48.197212 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b525d5f60d8e18981a1cc7cf833b0def9ea5fcd7594fd525b092c43a67a6bb8\": container with ID starting with 7b525d5f60d8e18981a1cc7cf833b0def9ea5fcd7594fd525b092c43a67a6bb8 not found: ID does not exist" containerID="7b525d5f60d8e18981a1cc7cf833b0def9ea5fcd7594fd525b092c43a67a6bb8" Nov 27 17:49:48 crc kubenswrapper[4954]: I1127 17:49:48.197257 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b525d5f60d8e18981a1cc7cf833b0def9ea5fcd7594fd525b092c43a67a6bb8"} err="failed to get container status \"7b525d5f60d8e18981a1cc7cf833b0def9ea5fcd7594fd525b092c43a67a6bb8\": rpc error: code = NotFound desc = could not find container \"7b525d5f60d8e18981a1cc7cf833b0def9ea5fcd7594fd525b092c43a67a6bb8\": container with ID starting with 7b525d5f60d8e18981a1cc7cf833b0def9ea5fcd7594fd525b092c43a67a6bb8 not found: ID does not exist" Nov 27 17:49:48 crc kubenswrapper[4954]: I1127 17:49:48.197289 4954 scope.go:117] "RemoveContainer" containerID="1013ca5fa85175129a1262bc6ce90ef7c086dd8ca86f140fba7eebccdba09680" Nov 27 17:49:48 crc kubenswrapper[4954]: I1127 17:49:48.226223 4954 scope.go:117] "RemoveContainer" containerID="5b22eb51a3ab124442b3069a9d68d1ac1bb05c6e36c09adca25b9490fc19243f" Nov 27 17:49:48 crc kubenswrapper[4954]: I1127 17:49:48.294138 4954 scope.go:117] "RemoveContainer" containerID="9135c9ba0062a27b312d704bfe91e6338919347bffdda6af24837ba7e918f273" Nov 27 17:49:48 crc kubenswrapper[4954]: I1127 17:49:48.367944 4954 scope.go:117] "RemoveContainer" containerID="1013ca5fa85175129a1262bc6ce90ef7c086dd8ca86f140fba7eebccdba09680" Nov 27 17:49:48 crc kubenswrapper[4954]: E1127 17:49:48.368977 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1013ca5fa85175129a1262bc6ce90ef7c086dd8ca86f140fba7eebccdba09680\": container with ID starting with 1013ca5fa85175129a1262bc6ce90ef7c086dd8ca86f140fba7eebccdba09680 not found: ID does not exist" containerID="1013ca5fa85175129a1262bc6ce90ef7c086dd8ca86f140fba7eebccdba09680" Nov 27 17:49:48 crc kubenswrapper[4954]: I1127 17:49:48.369033 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1013ca5fa85175129a1262bc6ce90ef7c086dd8ca86f140fba7eebccdba09680"} err="failed to get container status \"1013ca5fa85175129a1262bc6ce90ef7c086dd8ca86f140fba7eebccdba09680\": rpc error: code = NotFound desc = could not find container \"1013ca5fa85175129a1262bc6ce90ef7c086dd8ca86f140fba7eebccdba09680\": container with ID starting with 1013ca5fa85175129a1262bc6ce90ef7c086dd8ca86f140fba7eebccdba09680 not found: ID does not exist" Nov 27 17:49:48 crc kubenswrapper[4954]: I1127 17:49:48.369075 4954 scope.go:117] "RemoveContainer" containerID="5b22eb51a3ab124442b3069a9d68d1ac1bb05c6e36c09adca25b9490fc19243f" Nov 27 17:49:48 crc 
kubenswrapper[4954]: E1127 17:49:48.369794 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b22eb51a3ab124442b3069a9d68d1ac1bb05c6e36c09adca25b9490fc19243f\": container with ID starting with 5b22eb51a3ab124442b3069a9d68d1ac1bb05c6e36c09adca25b9490fc19243f not found: ID does not exist" containerID="5b22eb51a3ab124442b3069a9d68d1ac1bb05c6e36c09adca25b9490fc19243f" Nov 27 17:49:48 crc kubenswrapper[4954]: I1127 17:49:48.369848 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b22eb51a3ab124442b3069a9d68d1ac1bb05c6e36c09adca25b9490fc19243f"} err="failed to get container status \"5b22eb51a3ab124442b3069a9d68d1ac1bb05c6e36c09adca25b9490fc19243f\": rpc error: code = NotFound desc = could not find container \"5b22eb51a3ab124442b3069a9d68d1ac1bb05c6e36c09adca25b9490fc19243f\": container with ID starting with 5b22eb51a3ab124442b3069a9d68d1ac1bb05c6e36c09adca25b9490fc19243f not found: ID does not exist" Nov 27 17:49:48 crc kubenswrapper[4954]: I1127 17:49:48.369891 4954 scope.go:117] "RemoveContainer" containerID="9135c9ba0062a27b312d704bfe91e6338919347bffdda6af24837ba7e918f273" Nov 27 17:49:48 crc kubenswrapper[4954]: E1127 17:49:48.370375 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9135c9ba0062a27b312d704bfe91e6338919347bffdda6af24837ba7e918f273\": container with ID starting with 9135c9ba0062a27b312d704bfe91e6338919347bffdda6af24837ba7e918f273 not found: ID does not exist" containerID="9135c9ba0062a27b312d704bfe91e6338919347bffdda6af24837ba7e918f273" Nov 27 17:49:48 crc kubenswrapper[4954]: I1127 17:49:48.370427 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9135c9ba0062a27b312d704bfe91e6338919347bffdda6af24837ba7e918f273"} err="failed to get container status \"9135c9ba0062a27b312d704bfe91e6338919347bffdda6af24837ba7e918f273\": rpc error: code = NotFound desc = could not find container \"9135c9ba0062a27b312d704bfe91e6338919347bffdda6af24837ba7e918f273\": container with ID starting with 9135c9ba0062a27b312d704bfe91e6338919347bffdda6af24837ba7e918f273 not found: ID does not exist" Nov 27 17:49:48 crc kubenswrapper[4954]: I1127 17:49:48.566501 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84be2651-e33f-46d6-b4e0-fb26c1223b4f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "84be2651-e33f-46d6-b4e0-fb26c1223b4f" (UID: "84be2651-e33f-46d6-b4e0-fb26c1223b4f"). InnerVolumeSpecName "catalog-content". 
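The RemoveContainer / "ContainerStatus from runtime service failed ... NotFound" pairs above are a benign race: by the time the kubelet re-queries CRI-O for a container it is cleaning up, the container has already been removed, so the gRPC call comes back NotFound and the deletor just logs it. Code driving a CRI-style gRPC API typically treats that code as success; a minimal sketch, where removeContainer is a hypothetical stand-in for the runtime call:

    package main

    import (
    	"fmt"

    	"google.golang.org/grpc/codes"
    	"google.golang.org/grpc/status"
    )

    // removeContainer stands in for a CRI RemoveContainer RPC (hypothetical);
    // here it always fails the way the runtime did in the log lines above.
    func removeContainer(id string) error {
    	return status.Error(codes.NotFound, "could not find container "+id)
    }

    func main() {
    	err := removeContainer("b051dd32e977")
    	if status.Code(err) == codes.NotFound {
    		// Already gone: the state we wanted, so treat it as success.
    		err = nil
    	}
    	fmt.Println("cleanup error:", err) // cleanup error: <nil>
    }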
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:49:48 crc kubenswrapper[4954]: I1127 17:49:48.601479 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84be2651-e33f-46d6-b4e0-fb26c1223b4f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 17:49:48 crc kubenswrapper[4954]: I1127 17:49:48.677093 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60" path="/var/lib/kubelet/pods/5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60/volumes" Nov 27 17:49:48 crc kubenswrapper[4954]: I1127 17:49:48.752003 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9lfwr"] Nov 27 17:49:48 crc kubenswrapper[4954]: I1127 17:49:48.763883 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9lfwr"] Nov 27 17:49:50 crc kubenswrapper[4954]: I1127 17:49:50.671838 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84be2651-e33f-46d6-b4e0-fb26c1223b4f" path="/var/lib/kubelet/pods/84be2651-e33f-46d6-b4e0-fb26c1223b4f/volumes" Nov 27 17:49:53 crc kubenswrapper[4954]: I1127 17:49:53.443960 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kjkqm/must-gather-mlwx4"] Nov 27 17:49:53 crc kubenswrapper[4954]: E1127 17:49:53.445989 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84be2651-e33f-46d6-b4e0-fb26c1223b4f" containerName="extract-utilities" Nov 27 17:49:53 crc kubenswrapper[4954]: I1127 17:49:53.446128 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="84be2651-e33f-46d6-b4e0-fb26c1223b4f" containerName="extract-utilities" Nov 27 17:49:53 crc kubenswrapper[4954]: E1127 17:49:53.446248 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60" containerName="extract-content" Nov 27 17:49:53 crc kubenswrapper[4954]: I1127 17:49:53.446332 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60" containerName="extract-content" Nov 27 17:49:53 crc kubenswrapper[4954]: E1127 17:49:53.446434 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84be2651-e33f-46d6-b4e0-fb26c1223b4f" containerName="extract-content" Nov 27 17:49:53 crc kubenswrapper[4954]: I1127 17:49:53.446526 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="84be2651-e33f-46d6-b4e0-fb26c1223b4f" containerName="extract-content" Nov 27 17:49:53 crc kubenswrapper[4954]: E1127 17:49:53.446630 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84be2651-e33f-46d6-b4e0-fb26c1223b4f" containerName="registry-server" Nov 27 17:49:53 crc kubenswrapper[4954]: I1127 17:49:53.446724 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="84be2651-e33f-46d6-b4e0-fb26c1223b4f" containerName="registry-server" Nov 27 17:49:53 crc kubenswrapper[4954]: E1127 17:49:53.446819 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60" containerName="extract-utilities" Nov 27 17:49:53 crc kubenswrapper[4954]: I1127 17:49:53.446903 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60" containerName="extract-utilities" Nov 27 17:49:53 crc kubenswrapper[4954]: E1127 17:49:53.446997 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60" containerName="registry-server" Nov 27 17:49:53 crc 
kubenswrapper[4954]: I1127 17:49:53.447077 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60" containerName="registry-server" Nov 27 17:49:53 crc kubenswrapper[4954]: I1127 17:49:53.447412 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="84be2651-e33f-46d6-b4e0-fb26c1223b4f" containerName="registry-server" Nov 27 17:49:53 crc kubenswrapper[4954]: I1127 17:49:53.447529 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e9cc38a-bd9a-4f94-b08f-ccbb8a04ad60" containerName="registry-server" Nov 27 17:49:53 crc kubenswrapper[4954]: I1127 17:49:53.448953 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kjkqm/must-gather-mlwx4" Nov 27 17:49:53 crc kubenswrapper[4954]: I1127 17:49:53.451596 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-kjkqm"/"default-dockercfg-cp8mq" Nov 27 17:49:53 crc kubenswrapper[4954]: I1127 17:49:53.451605 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-kjkqm"/"kube-root-ca.crt" Nov 27 17:49:53 crc kubenswrapper[4954]: I1127 17:49:53.452153 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-kjkqm"/"openshift-service-ca.crt" Nov 27 17:49:53 crc kubenswrapper[4954]: I1127 17:49:53.475337 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kjkqm/must-gather-mlwx4"] Nov 27 17:49:53 crc kubenswrapper[4954]: I1127 17:49:53.490270 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3d784d93-57ea-4848-bf68-21934d1855e2-must-gather-output\") pod \"must-gather-mlwx4\" (UID: \"3d784d93-57ea-4848-bf68-21934d1855e2\") " pod="openshift-must-gather-kjkqm/must-gather-mlwx4" Nov 27 17:49:53 crc kubenswrapper[4954]: I1127 17:49:53.490381 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48d6h\" (UniqueName: \"kubernetes.io/projected/3d784d93-57ea-4848-bf68-21934d1855e2-kube-api-access-48d6h\") pod \"must-gather-mlwx4\" (UID: \"3d784d93-57ea-4848-bf68-21934d1855e2\") " pod="openshift-must-gather-kjkqm/must-gather-mlwx4" Nov 27 17:49:53 crc kubenswrapper[4954]: I1127 17:49:53.592367 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48d6h\" (UniqueName: \"kubernetes.io/projected/3d784d93-57ea-4848-bf68-21934d1855e2-kube-api-access-48d6h\") pod \"must-gather-mlwx4\" (UID: \"3d784d93-57ea-4848-bf68-21934d1855e2\") " pod="openshift-must-gather-kjkqm/must-gather-mlwx4" Nov 27 17:49:53 crc kubenswrapper[4954]: I1127 17:49:53.592508 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3d784d93-57ea-4848-bf68-21934d1855e2-must-gather-output\") pod \"must-gather-mlwx4\" (UID: \"3d784d93-57ea-4848-bf68-21934d1855e2\") " pod="openshift-must-gather-kjkqm/must-gather-mlwx4" Nov 27 17:49:53 crc kubenswrapper[4954]: I1127 17:49:53.592946 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3d784d93-57ea-4848-bf68-21934d1855e2-must-gather-output\") pod \"must-gather-mlwx4\" (UID: \"3d784d93-57ea-4848-bf68-21934d1855e2\") " pod="openshift-must-gather-kjkqm/must-gather-mlwx4" Nov 27 17:49:53 crc kubenswrapper[4954]: I1127 
17:49:53.611261 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48d6h\" (UniqueName: \"kubernetes.io/projected/3d784d93-57ea-4848-bf68-21934d1855e2-kube-api-access-48d6h\") pod \"must-gather-mlwx4\" (UID: \"3d784d93-57ea-4848-bf68-21934d1855e2\") " pod="openshift-must-gather-kjkqm/must-gather-mlwx4" Nov 27 17:49:53 crc kubenswrapper[4954]: I1127 17:49:53.687874 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 17:49:53 crc kubenswrapper[4954]: I1127 17:49:53.687934 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 17:49:53 crc kubenswrapper[4954]: I1127 17:49:53.768426 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kjkqm/must-gather-mlwx4" Nov 27 17:49:54 crc kubenswrapper[4954]: I1127 17:49:54.229768 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kjkqm/must-gather-mlwx4"] Nov 27 17:49:55 crc kubenswrapper[4954]: I1127 17:49:55.193142 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kjkqm/must-gather-mlwx4" event={"ID":"3d784d93-57ea-4848-bf68-21934d1855e2","Type":"ContainerStarted","Data":"ca8c5d1912c2b2801e4a70e1d18c376f86d930a7f6b269202ccfb75cce8b3833"} Nov 27 17:49:55 crc kubenswrapper[4954]: I1127 17:49:55.193729 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kjkqm/must-gather-mlwx4" event={"ID":"3d784d93-57ea-4848-bf68-21934d1855e2","Type":"ContainerStarted","Data":"f28f0a56043870dbddfe531ab9c5e1ec20b15a66487cbfba825e14017077f5ca"} Nov 27 17:49:55 crc kubenswrapper[4954]: I1127 17:49:55.193743 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kjkqm/must-gather-mlwx4" event={"ID":"3d784d93-57ea-4848-bf68-21934d1855e2","Type":"ContainerStarted","Data":"ba91207d34e68ae5190e421a94a83bec9489abb8ba577aa046507b5e2ca64337"} Nov 27 17:49:55 crc kubenswrapper[4954]: I1127 17:49:55.213514 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kjkqm/must-gather-mlwx4" podStartSLOduration=2.213469287 podStartE2EDuration="2.213469287s" podCreationTimestamp="2025-11-27 17:49:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 17:49:55.205437052 +0000 UTC m=+4307.222877352" watchObservedRunningTime="2025-11-27 17:49:55.213469287 +0000 UTC m=+4307.230909607" Nov 27 17:49:58 crc kubenswrapper[4954]: I1127 17:49:58.182367 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kjkqm/crc-debug-q6xsg"] Nov 27 17:49:58 crc kubenswrapper[4954]: I1127 17:49:58.184308 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kjkqm/crc-debug-q6xsg" Nov 27 17:49:58 crc kubenswrapper[4954]: I1127 17:49:58.284506 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b38bb69a-5c72-464a-9567-1e43da7747c6-host\") pod \"crc-debug-q6xsg\" (UID: \"b38bb69a-5c72-464a-9567-1e43da7747c6\") " pod="openshift-must-gather-kjkqm/crc-debug-q6xsg" Nov 27 17:49:58 crc kubenswrapper[4954]: I1127 17:49:58.284573 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrqjb\" (UniqueName: \"kubernetes.io/projected/b38bb69a-5c72-464a-9567-1e43da7747c6-kube-api-access-xrqjb\") pod \"crc-debug-q6xsg\" (UID: \"b38bb69a-5c72-464a-9567-1e43da7747c6\") " pod="openshift-must-gather-kjkqm/crc-debug-q6xsg" Nov 27 17:49:58 crc kubenswrapper[4954]: I1127 17:49:58.386851 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrqjb\" (UniqueName: \"kubernetes.io/projected/b38bb69a-5c72-464a-9567-1e43da7747c6-kube-api-access-xrqjb\") pod \"crc-debug-q6xsg\" (UID: \"b38bb69a-5c72-464a-9567-1e43da7747c6\") " pod="openshift-must-gather-kjkqm/crc-debug-q6xsg" Nov 27 17:49:58 crc kubenswrapper[4954]: I1127 17:49:58.387378 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b38bb69a-5c72-464a-9567-1e43da7747c6-host\") pod \"crc-debug-q6xsg\" (UID: \"b38bb69a-5c72-464a-9567-1e43da7747c6\") " pod="openshift-must-gather-kjkqm/crc-debug-q6xsg" Nov 27 17:49:58 crc kubenswrapper[4954]: I1127 17:49:58.387477 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b38bb69a-5c72-464a-9567-1e43da7747c6-host\") pod \"crc-debug-q6xsg\" (UID: \"b38bb69a-5c72-464a-9567-1e43da7747c6\") " pod="openshift-must-gather-kjkqm/crc-debug-q6xsg" Nov 27 17:49:58 crc kubenswrapper[4954]: I1127 17:49:58.407963 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrqjb\" (UniqueName: \"kubernetes.io/projected/b38bb69a-5c72-464a-9567-1e43da7747c6-kube-api-access-xrqjb\") pod \"crc-debug-q6xsg\" (UID: \"b38bb69a-5c72-464a-9567-1e43da7747c6\") " pod="openshift-must-gather-kjkqm/crc-debug-q6xsg" Nov 27 17:49:58 crc kubenswrapper[4954]: I1127 17:49:58.504319 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kjkqm/crc-debug-q6xsg" Nov 27 17:49:59 crc kubenswrapper[4954]: I1127 17:49:59.224279 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kjkqm/crc-debug-q6xsg" event={"ID":"b38bb69a-5c72-464a-9567-1e43da7747c6","Type":"ContainerStarted","Data":"8f425e11e2149a04a3799f7be308fdbbaf99146fd514d5a4268daeb78520f87b"} Nov 27 17:49:59 crc kubenswrapper[4954]: I1127 17:49:59.225613 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kjkqm/crc-debug-q6xsg" event={"ID":"b38bb69a-5c72-464a-9567-1e43da7747c6","Type":"ContainerStarted","Data":"2062a186d67809a40beec9b7033e74ba9543a9a88f14c1b15c0ce88b131bcb03"} Nov 27 17:49:59 crc kubenswrapper[4954]: I1127 17:49:59.244109 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kjkqm/crc-debug-q6xsg" podStartSLOduration=1.244090426 podStartE2EDuration="1.244090426s" podCreationTimestamp="2025-11-27 17:49:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 17:49:59.237467935 +0000 UTC m=+4311.254908235" watchObservedRunningTime="2025-11-27 17:49:59.244090426 +0000 UTC m=+4311.261530726" Nov 27 17:50:23 crc kubenswrapper[4954]: I1127 17:50:23.687129 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 17:50:23 crc kubenswrapper[4954]: I1127 17:50:23.688216 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 17:50:23 crc kubenswrapper[4954]: I1127 17:50:23.688276 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-699qq" Nov 27 17:50:23 crc kubenswrapper[4954]: I1127 17:50:23.689773 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a3c1b4c7a1565f160e5b62bddd964a7c2407cb7c03f79c69bde2c49cf255237d"} pod="openshift-machine-config-operator/machine-config-daemon-699qq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 27 17:50:23 crc kubenswrapper[4954]: I1127 17:50:23.689864 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" containerID="cri-o://a3c1b4c7a1565f160e5b62bddd964a7c2407cb7c03f79c69bde2c49cf255237d" gracePeriod=600 Nov 27 17:50:24 crc kubenswrapper[4954]: I1127 17:50:24.461045 4954 generic.go:334] "Generic (PLEG): container finished" podID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerID="a3c1b4c7a1565f160e5b62bddd964a7c2407cb7c03f79c69bde2c49cf255237d" exitCode=0 Nov 27 17:50:24 crc kubenswrapper[4954]: I1127 17:50:24.461149 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" 
event={"ID":"33a80574-7c60-4f19-985b-3ee313cb7bcd","Type":"ContainerDied","Data":"a3c1b4c7a1565f160e5b62bddd964a7c2407cb7c03f79c69bde2c49cf255237d"} Nov 27 17:50:24 crc kubenswrapper[4954]: I1127 17:50:24.461637 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" event={"ID":"33a80574-7c60-4f19-985b-3ee313cb7bcd","Type":"ContainerStarted","Data":"92acc28daa773c5e3456bee2d6f3e6b59e9180355e7fff0a925eff96ab528f24"} Nov 27 17:50:24 crc kubenswrapper[4954]: I1127 17:50:24.461661 4954 scope.go:117] "RemoveContainer" containerID="72d14939d267f3263e2283bed9a7423259124d321383864fb9b0e804d14acba8" Nov 27 17:50:32 crc kubenswrapper[4954]: I1127 17:50:32.527713 4954 generic.go:334] "Generic (PLEG): container finished" podID="b38bb69a-5c72-464a-9567-1e43da7747c6" containerID="8f425e11e2149a04a3799f7be308fdbbaf99146fd514d5a4268daeb78520f87b" exitCode=0 Nov 27 17:50:32 crc kubenswrapper[4954]: I1127 17:50:32.527823 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kjkqm/crc-debug-q6xsg" event={"ID":"b38bb69a-5c72-464a-9567-1e43da7747c6","Type":"ContainerDied","Data":"8f425e11e2149a04a3799f7be308fdbbaf99146fd514d5a4268daeb78520f87b"} Nov 27 17:50:33 crc kubenswrapper[4954]: I1127 17:50:33.635472 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kjkqm/crc-debug-q6xsg" Nov 27 17:50:33 crc kubenswrapper[4954]: I1127 17:50:33.669722 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kjkqm/crc-debug-q6xsg"] Nov 27 17:50:33 crc kubenswrapper[4954]: I1127 17:50:33.681327 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kjkqm/crc-debug-q6xsg"] Nov 27 17:50:33 crc kubenswrapper[4954]: I1127 17:50:33.767426 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b38bb69a-5c72-464a-9567-1e43da7747c6-host\") pod \"b38bb69a-5c72-464a-9567-1e43da7747c6\" (UID: \"b38bb69a-5c72-464a-9567-1e43da7747c6\") " Nov 27 17:50:33 crc kubenswrapper[4954]: I1127 17:50:33.767705 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrqjb\" (UniqueName: \"kubernetes.io/projected/b38bb69a-5c72-464a-9567-1e43da7747c6-kube-api-access-xrqjb\") pod \"b38bb69a-5c72-464a-9567-1e43da7747c6\" (UID: \"b38bb69a-5c72-464a-9567-1e43da7747c6\") " Nov 27 17:50:33 crc kubenswrapper[4954]: I1127 17:50:33.769696 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b38bb69a-5c72-464a-9567-1e43da7747c6-host" (OuterVolumeSpecName: "host") pod "b38bb69a-5c72-464a-9567-1e43da7747c6" (UID: "b38bb69a-5c72-464a-9567-1e43da7747c6"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 17:50:33 crc kubenswrapper[4954]: I1127 17:50:33.781440 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b38bb69a-5c72-464a-9567-1e43da7747c6-kube-api-access-xrqjb" (OuterVolumeSpecName: "kube-api-access-xrqjb") pod "b38bb69a-5c72-464a-9567-1e43da7747c6" (UID: "b38bb69a-5c72-464a-9567-1e43da7747c6"). InnerVolumeSpecName "kube-api-access-xrqjb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:50:33 crc kubenswrapper[4954]: I1127 17:50:33.871007 4954 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b38bb69a-5c72-464a-9567-1e43da7747c6-host\") on node \"crc\" DevicePath \"\"" Nov 27 17:50:33 crc kubenswrapper[4954]: I1127 17:50:33.871037 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrqjb\" (UniqueName: \"kubernetes.io/projected/b38bb69a-5c72-464a-9567-1e43da7747c6-kube-api-access-xrqjb\") on node \"crc\" DevicePath \"\"" Nov 27 17:50:34 crc kubenswrapper[4954]: I1127 17:50:34.546302 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2062a186d67809a40beec9b7033e74ba9543a9a88f14c1b15c0ce88b131bcb03" Nov 27 17:50:34 crc kubenswrapper[4954]: I1127 17:50:34.546625 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kjkqm/crc-debug-q6xsg" Nov 27 17:50:34 crc kubenswrapper[4954]: I1127 17:50:34.672208 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b38bb69a-5c72-464a-9567-1e43da7747c6" path="/var/lib/kubelet/pods/b38bb69a-5c72-464a-9567-1e43da7747c6/volumes" Nov 27 17:50:34 crc kubenswrapper[4954]: I1127 17:50:34.831077 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kjkqm/crc-debug-8rh4k"] Nov 27 17:50:34 crc kubenswrapper[4954]: E1127 17:50:34.831506 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b38bb69a-5c72-464a-9567-1e43da7747c6" containerName="container-00" Nov 27 17:50:34 crc kubenswrapper[4954]: I1127 17:50:34.831525 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b38bb69a-5c72-464a-9567-1e43da7747c6" containerName="container-00" Nov 27 17:50:34 crc kubenswrapper[4954]: I1127 17:50:34.831729 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="b38bb69a-5c72-464a-9567-1e43da7747c6" containerName="container-00" Nov 27 17:50:34 crc kubenswrapper[4954]: I1127 17:50:34.832384 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kjkqm/crc-debug-8rh4k" Nov 27 17:50:34 crc kubenswrapper[4954]: I1127 17:50:34.890846 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzs79\" (UniqueName: \"kubernetes.io/projected/c03e9019-30b6-4f3a-a985-ec6146e69e1d-kube-api-access-nzs79\") pod \"crc-debug-8rh4k\" (UID: \"c03e9019-30b6-4f3a-a985-ec6146e69e1d\") " pod="openshift-must-gather-kjkqm/crc-debug-8rh4k" Nov 27 17:50:34 crc kubenswrapper[4954]: I1127 17:50:34.890925 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c03e9019-30b6-4f3a-a985-ec6146e69e1d-host\") pod \"crc-debug-8rh4k\" (UID: \"c03e9019-30b6-4f3a-a985-ec6146e69e1d\") " pod="openshift-must-gather-kjkqm/crc-debug-8rh4k" Nov 27 17:50:34 crc kubenswrapper[4954]: I1127 17:50:34.992417 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c03e9019-30b6-4f3a-a985-ec6146e69e1d-host\") pod \"crc-debug-8rh4k\" (UID: \"c03e9019-30b6-4f3a-a985-ec6146e69e1d\") " pod="openshift-must-gather-kjkqm/crc-debug-8rh4k" Nov 27 17:50:34 crc kubenswrapper[4954]: I1127 17:50:34.992618 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c03e9019-30b6-4f3a-a985-ec6146e69e1d-host\") pod \"crc-debug-8rh4k\" (UID: \"c03e9019-30b6-4f3a-a985-ec6146e69e1d\") " pod="openshift-must-gather-kjkqm/crc-debug-8rh4k" Nov 27 17:50:34 crc kubenswrapper[4954]: I1127 17:50:34.992736 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzs79\" (UniqueName: \"kubernetes.io/projected/c03e9019-30b6-4f3a-a985-ec6146e69e1d-kube-api-access-nzs79\") pod \"crc-debug-8rh4k\" (UID: \"c03e9019-30b6-4f3a-a985-ec6146e69e1d\") " pod="openshift-must-gather-kjkqm/crc-debug-8rh4k" Nov 27 17:50:35 crc kubenswrapper[4954]: I1127 17:50:35.022663 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzs79\" (UniqueName: \"kubernetes.io/projected/c03e9019-30b6-4f3a-a985-ec6146e69e1d-kube-api-access-nzs79\") pod \"crc-debug-8rh4k\" (UID: \"c03e9019-30b6-4f3a-a985-ec6146e69e1d\") " pod="openshift-must-gather-kjkqm/crc-debug-8rh4k" Nov 27 17:50:35 crc kubenswrapper[4954]: I1127 17:50:35.152135 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kjkqm/crc-debug-8rh4k" Nov 27 17:50:35 crc kubenswrapper[4954]: I1127 17:50:35.556215 4954 generic.go:334] "Generic (PLEG): container finished" podID="c03e9019-30b6-4f3a-a985-ec6146e69e1d" containerID="00684f77ad1d41daa2e11e0d6ded43543abf50cbee8d11f8eb2c611fc0cf6ef4" exitCode=0 Nov 27 17:50:35 crc kubenswrapper[4954]: I1127 17:50:35.556290 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kjkqm/crc-debug-8rh4k" event={"ID":"c03e9019-30b6-4f3a-a985-ec6146e69e1d","Type":"ContainerDied","Data":"00684f77ad1d41daa2e11e0d6ded43543abf50cbee8d11f8eb2c611fc0cf6ef4"} Nov 27 17:50:35 crc kubenswrapper[4954]: I1127 17:50:35.556540 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kjkqm/crc-debug-8rh4k" event={"ID":"c03e9019-30b6-4f3a-a985-ec6146e69e1d","Type":"ContainerStarted","Data":"8794950e86cdb04f50a3db90f07f4ecceeeaa9ea86781f5a2c7c02f7ca072874"} Nov 27 17:50:36 crc kubenswrapper[4954]: I1127 17:50:36.015057 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kjkqm/crc-debug-8rh4k"] Nov 27 17:50:36 crc kubenswrapper[4954]: I1127 17:50:36.026848 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kjkqm/crc-debug-8rh4k"] Nov 27 17:50:37 crc kubenswrapper[4954]: I1127 17:50:37.072880 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kjkqm/crc-debug-8rh4k" Nov 27 17:50:37 crc kubenswrapper[4954]: I1127 17:50:37.186786 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kjkqm/crc-debug-t467f"] Nov 27 17:50:37 crc kubenswrapper[4954]: E1127 17:50:37.187185 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c03e9019-30b6-4f3a-a985-ec6146e69e1d" containerName="container-00" Nov 27 17:50:37 crc kubenswrapper[4954]: I1127 17:50:37.187201 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c03e9019-30b6-4f3a-a985-ec6146e69e1d" containerName="container-00" Nov 27 17:50:37 crc kubenswrapper[4954]: I1127 17:50:37.187384 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="c03e9019-30b6-4f3a-a985-ec6146e69e1d" containerName="container-00" Nov 27 17:50:37 crc kubenswrapper[4954]: I1127 17:50:37.187978 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kjkqm/crc-debug-t467f" Nov 27 17:50:37 crc kubenswrapper[4954]: I1127 17:50:37.231612 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c03e9019-30b6-4f3a-a985-ec6146e69e1d-host\") pod \"c03e9019-30b6-4f3a-a985-ec6146e69e1d\" (UID: \"c03e9019-30b6-4f3a-a985-ec6146e69e1d\") " Nov 27 17:50:37 crc kubenswrapper[4954]: I1127 17:50:37.231679 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzs79\" (UniqueName: \"kubernetes.io/projected/c03e9019-30b6-4f3a-a985-ec6146e69e1d-kube-api-access-nzs79\") pod \"c03e9019-30b6-4f3a-a985-ec6146e69e1d\" (UID: \"c03e9019-30b6-4f3a-a985-ec6146e69e1d\") " Nov 27 17:50:37 crc kubenswrapper[4954]: I1127 17:50:37.231783 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c03e9019-30b6-4f3a-a985-ec6146e69e1d-host" (OuterVolumeSpecName: "host") pod "c03e9019-30b6-4f3a-a985-ec6146e69e1d" (UID: "c03e9019-30b6-4f3a-a985-ec6146e69e1d"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 17:50:37 crc kubenswrapper[4954]: I1127 17:50:37.232339 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsgmq\" (UniqueName: \"kubernetes.io/projected/4f18514d-4c36-4a78-a05e-3fdf372733b0-kube-api-access-gsgmq\") pod \"crc-debug-t467f\" (UID: \"4f18514d-4c36-4a78-a05e-3fdf372733b0\") " pod="openshift-must-gather-kjkqm/crc-debug-t467f" Nov 27 17:50:37 crc kubenswrapper[4954]: I1127 17:50:37.232560 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f18514d-4c36-4a78-a05e-3fdf372733b0-host\") pod \"crc-debug-t467f\" (UID: \"4f18514d-4c36-4a78-a05e-3fdf372733b0\") " pod="openshift-must-gather-kjkqm/crc-debug-t467f" Nov 27 17:50:37 crc kubenswrapper[4954]: I1127 17:50:37.232739 4954 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c03e9019-30b6-4f3a-a985-ec6146e69e1d-host\") on node \"crc\" DevicePath \"\"" Nov 27 17:50:37 crc kubenswrapper[4954]: I1127 17:50:37.241522 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03e9019-30b6-4f3a-a985-ec6146e69e1d-kube-api-access-nzs79" (OuterVolumeSpecName: "kube-api-access-nzs79") pod "c03e9019-30b6-4f3a-a985-ec6146e69e1d" (UID: "c03e9019-30b6-4f3a-a985-ec6146e69e1d"). InnerVolumeSpecName "kube-api-access-nzs79". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:50:37 crc kubenswrapper[4954]: I1127 17:50:37.334154 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsgmq\" (UniqueName: \"kubernetes.io/projected/4f18514d-4c36-4a78-a05e-3fdf372733b0-kube-api-access-gsgmq\") pod \"crc-debug-t467f\" (UID: \"4f18514d-4c36-4a78-a05e-3fdf372733b0\") " pod="openshift-must-gather-kjkqm/crc-debug-t467f" Nov 27 17:50:37 crc kubenswrapper[4954]: I1127 17:50:37.334301 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f18514d-4c36-4a78-a05e-3fdf372733b0-host\") pod \"crc-debug-t467f\" (UID: \"4f18514d-4c36-4a78-a05e-3fdf372733b0\") " pod="openshift-must-gather-kjkqm/crc-debug-t467f" Nov 27 17:50:37 crc kubenswrapper[4954]: I1127 17:50:37.334396 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzs79\" (UniqueName: \"kubernetes.io/projected/c03e9019-30b6-4f3a-a985-ec6146e69e1d-kube-api-access-nzs79\") on node \"crc\" DevicePath \"\"" Nov 27 17:50:37 crc kubenswrapper[4954]: I1127 17:50:37.334458 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f18514d-4c36-4a78-a05e-3fdf372733b0-host\") pod \"crc-debug-t467f\" (UID: \"4f18514d-4c36-4a78-a05e-3fdf372733b0\") " pod="openshift-must-gather-kjkqm/crc-debug-t467f" Nov 27 17:50:37 crc kubenswrapper[4954]: I1127 17:50:37.354478 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsgmq\" (UniqueName: \"kubernetes.io/projected/4f18514d-4c36-4a78-a05e-3fdf372733b0-kube-api-access-gsgmq\") pod \"crc-debug-t467f\" (UID: \"4f18514d-4c36-4a78-a05e-3fdf372733b0\") " pod="openshift-must-gather-kjkqm/crc-debug-t467f" Nov 27 17:50:37 crc kubenswrapper[4954]: I1127 17:50:37.505861 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kjkqm/crc-debug-t467f" Nov 27 17:50:37 crc kubenswrapper[4954]: W1127 17:50:37.535254 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f18514d_4c36_4a78_a05e_3fdf372733b0.slice/crio-66fc548f2c34e889fd698e763ce33d2ade089a5c6ba3efd416c5c8f1d4db5a87 WatchSource:0}: Error finding container 66fc548f2c34e889fd698e763ce33d2ade089a5c6ba3efd416c5c8f1d4db5a87: Status 404 returned error can't find the container with id 66fc548f2c34e889fd698e763ce33d2ade089a5c6ba3efd416c5c8f1d4db5a87 Nov 27 17:50:37 crc kubenswrapper[4954]: I1127 17:50:37.576801 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kjkqm/crc-debug-8rh4k" Nov 27 17:50:37 crc kubenswrapper[4954]: I1127 17:50:37.576908 4954 scope.go:117] "RemoveContainer" containerID="00684f77ad1d41daa2e11e0d6ded43543abf50cbee8d11f8eb2c611fc0cf6ef4" Nov 27 17:50:37 crc kubenswrapper[4954]: I1127 17:50:37.588188 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kjkqm/crc-debug-t467f" event={"ID":"4f18514d-4c36-4a78-a05e-3fdf372733b0","Type":"ContainerStarted","Data":"66fc548f2c34e889fd698e763ce33d2ade089a5c6ba3efd416c5c8f1d4db5a87"} Nov 27 17:50:38 crc kubenswrapper[4954]: I1127 17:50:38.596547 4954 generic.go:334] "Generic (PLEG): container finished" podID="4f18514d-4c36-4a78-a05e-3fdf372733b0" containerID="958265b08133abe02e6bd2a60612d61c0b82a6b8ac66f768e7aa709909288aa7" exitCode=0 Nov 27 17:50:38 crc kubenswrapper[4954]: I1127 17:50:38.597000 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kjkqm/crc-debug-t467f" event={"ID":"4f18514d-4c36-4a78-a05e-3fdf372733b0","Type":"ContainerDied","Data":"958265b08133abe02e6bd2a60612d61c0b82a6b8ac66f768e7aa709909288aa7"} Nov 27 17:50:38 crc kubenswrapper[4954]: I1127 17:50:38.675287 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03e9019-30b6-4f3a-a985-ec6146e69e1d" path="/var/lib/kubelet/pods/c03e9019-30b6-4f3a-a985-ec6146e69e1d/volumes" Nov 27 17:50:38 crc kubenswrapper[4954]: I1127 17:50:38.675891 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kjkqm/crc-debug-t467f"] Nov 27 17:50:38 crc kubenswrapper[4954]: I1127 17:50:38.679812 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kjkqm/crc-debug-t467f"] Nov 27 17:50:39 crc kubenswrapper[4954]: I1127 17:50:39.704561 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kjkqm/crc-debug-t467f" Nov 27 17:50:39 crc kubenswrapper[4954]: I1127 17:50:39.881988 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f18514d-4c36-4a78-a05e-3fdf372733b0-host\") pod \"4f18514d-4c36-4a78-a05e-3fdf372733b0\" (UID: \"4f18514d-4c36-4a78-a05e-3fdf372733b0\") " Nov 27 17:50:39 crc kubenswrapper[4954]: I1127 17:50:39.882134 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4f18514d-4c36-4a78-a05e-3fdf372733b0-host" (OuterVolumeSpecName: "host") pod "4f18514d-4c36-4a78-a05e-3fdf372733b0" (UID: "4f18514d-4c36-4a78-a05e-3fdf372733b0"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 17:50:39 crc kubenswrapper[4954]: I1127 17:50:39.882808 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsgmq\" (UniqueName: \"kubernetes.io/projected/4f18514d-4c36-4a78-a05e-3fdf372733b0-kube-api-access-gsgmq\") pod \"4f18514d-4c36-4a78-a05e-3fdf372733b0\" (UID: \"4f18514d-4c36-4a78-a05e-3fdf372733b0\") " Nov 27 17:50:39 crc kubenswrapper[4954]: I1127 17:50:39.883382 4954 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f18514d-4c36-4a78-a05e-3fdf372733b0-host\") on node \"crc\" DevicePath \"\"" Nov 27 17:50:39 crc kubenswrapper[4954]: I1127 17:50:39.888371 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f18514d-4c36-4a78-a05e-3fdf372733b0-kube-api-access-gsgmq" (OuterVolumeSpecName: "kube-api-access-gsgmq") pod "4f18514d-4c36-4a78-a05e-3fdf372733b0" (UID: "4f18514d-4c36-4a78-a05e-3fdf372733b0"). InnerVolumeSpecName "kube-api-access-gsgmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:50:39 crc kubenswrapper[4954]: I1127 17:50:39.985046 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsgmq\" (UniqueName: \"kubernetes.io/projected/4f18514d-4c36-4a78-a05e-3fdf372733b0-kube-api-access-gsgmq\") on node \"crc\" DevicePath \"\"" Nov 27 17:50:40 crc kubenswrapper[4954]: I1127 17:50:40.618796 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66fc548f2c34e889fd698e763ce33d2ade089a5c6ba3efd416c5c8f1d4db5a87" Nov 27 17:50:40 crc kubenswrapper[4954]: I1127 17:50:40.618859 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kjkqm/crc-debug-t467f" Nov 27 17:50:40 crc kubenswrapper[4954]: I1127 17:50:40.674206 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f18514d-4c36-4a78-a05e-3fdf372733b0" path="/var/lib/kubelet/pods/4f18514d-4c36-4a78-a05e-3fdf372733b0/volumes" Nov 27 17:51:01 crc kubenswrapper[4954]: I1127 17:51:01.638838 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7c4fd9778-zrzw7_3e0b062d-ff7b-4acc-8857-f463ec1bc195/barbican-api/0.log" Nov 27 17:51:01 crc kubenswrapper[4954]: I1127 17:51:01.642590 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7c4fd9778-zrzw7_3e0b062d-ff7b-4acc-8857-f463ec1bc195/barbican-api-log/0.log" Nov 27 17:51:01 crc kubenswrapper[4954]: I1127 17:51:01.780777 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-798f5f6896-mswxw_e09487f3-5539-4df4-8b9b-6da0b0b741de/barbican-keystone-listener/0.log" Nov 27 17:51:01 crc kubenswrapper[4954]: I1127 17:51:01.861755 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-798f5f6896-mswxw_e09487f3-5539-4df4-8b9b-6da0b0b741de/barbican-keystone-listener-log/0.log" Nov 27 17:51:01 crc kubenswrapper[4954]: I1127 17:51:01.879628 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7bd6cd4c89-x6dht_dc83f9b6-fbea-4463-8127-08590404f021/barbican-worker/0.log" Nov 27 17:51:02 crc kubenswrapper[4954]: I1127 17:51:02.007006 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7bd6cd4c89-x6dht_dc83f9b6-fbea-4463-8127-08590404f021/barbican-worker-log/0.log" Nov 27 17:51:02 crc kubenswrapper[4954]: I1127 
Nov 27 17:51:02 crc kubenswrapper[4954]: I1127 17:51:02.268419 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_61397bb6-588c-4c10-bd06-c7010f737605/ceilometer-central-agent/0.log"
Nov 27 17:51:02 crc kubenswrapper[4954]: I1127 17:51:02.289942 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_61397bb6-588c-4c10-bd06-c7010f737605/ceilometer-notification-agent/0.log"
Nov 27 17:51:02 crc kubenswrapper[4954]: I1127 17:51:02.329837 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_61397bb6-588c-4c10-bd06-c7010f737605/proxy-httpd/0.log"
Nov 27 17:51:02 crc kubenswrapper[4954]: I1127 17:51:02.441759 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_61397bb6-588c-4c10-bd06-c7010f737605/sg-core/0.log"
Nov 27 17:51:02 crc kubenswrapper[4954]: I1127 17:51:02.521382 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_9d6609b2-5156-4d39-b4fd-05cb39b98915/cinder-api/0.log"
Nov 27 17:51:02 crc kubenswrapper[4954]: I1127 17:51:02.565827 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_9d6609b2-5156-4d39-b4fd-05cb39b98915/cinder-api-log/0.log"
Nov 27 17:51:02 crc kubenswrapper[4954]: I1127 17:51:02.763642 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_fa6f325f-3f75-4d35-9ffa-3298dc1a936e/probe/0.log"
Nov 27 17:51:02 crc kubenswrapper[4954]: I1127 17:51:02.790131 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_fa6f325f-3f75-4d35-9ffa-3298dc1a936e/cinder-scheduler/0.log"
Nov 27 17:51:02 crc kubenswrapper[4954]: I1127 17:51:02.883229 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-862k7_745fc0e0-ebc3-4a97-8858-148da2dbb20d/configure-network-edpm-deployment-openstack-edpm-ipam/0.log"
Nov 27 17:51:02 crc kubenswrapper[4954]: I1127 17:51:02.978609 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-g8nkp_5e3f28f3-6e95-438e-ba6e-587578b29bf9/configure-os-edpm-deployment-openstack-edpm-ipam/0.log"
Nov 27 17:51:03 crc kubenswrapper[4954]: I1127 17:51:03.104246 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-7klpn_b4e436ab-fb96-4213-be44-d08f62fa30ef/init/0.log"
Nov 27 17:51:03 crc kubenswrapper[4954]: I1127 17:51:03.269976 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-7klpn_b4e436ab-fb96-4213-be44-d08f62fa30ef/init/0.log"
Nov 27 17:51:03 crc kubenswrapper[4954]: I1127 17:51:03.321458 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-7klpn_b4e436ab-fb96-4213-be44-d08f62fa30ef/dnsmasq-dns/0.log"
Nov 27 17:51:03 crc kubenswrapper[4954]: I1127 17:51:03.326931 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-p9x6k_59b766b5-12a6-4e9c-b627-3d7705a04afc/download-cache-edpm-deployment-openstack-edpm-ipam/0.log"
Nov 27 17:51:03 crc kubenswrapper[4954]: I1127 17:51:03.937068 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_1301cc13-44b9-4a6e-b82d-cbea335ebc9a/glance-log/0.log"
Nov 27 17:51:03 crc kubenswrapper[4954]: I1127 17:51:03.981497 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_1301cc13-44b9-4a6e-b82d-cbea335ebc9a/glance-httpd/0.log"
Nov 27 17:51:04 crc kubenswrapper[4954]: I1127 17:51:04.171765 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07/glance-httpd/0.log"
Nov 27 17:51:04 crc kubenswrapper[4954]: I1127 17:51:04.184002 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_0c9c6c7d-30bd-4195-b8ea-2ef4aefebd07/glance-log/0.log"
Nov 27 17:51:04 crc kubenswrapper[4954]: I1127 17:51:04.344921 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-b5c6d8894-l7bzv_11ddebaa-610a-410a-a161-a5b89d87eb75/horizon/0.log"
Nov 27 17:51:04 crc kubenswrapper[4954]: I1127 17:51:04.424932 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-vhxrq_dbb8e909-5f3f-4076-b549-d489f37cd8e3/install-certs-edpm-deployment-openstack-edpm-ipam/0.log"
Nov 27 17:51:04 crc kubenswrapper[4954]: I1127 17:51:04.705419 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-5sdh6_d7832bff-0ac7-4654-8277-92b9d5c04aa0/install-os-edpm-deployment-openstack-edpm-ipam/0.log"
Nov 27 17:51:04 crc kubenswrapper[4954]: I1127 17:51:04.756809 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-b5c6d8894-l7bzv_11ddebaa-610a-410a-a161-a5b89d87eb75/horizon-log/0.log"
Nov 27 17:51:04 crc kubenswrapper[4954]: I1127 17:51:04.943853 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29404381-h5mc2_fa450761-82d0-4005-aee7-bcb56c03a5fd/keystone-cron/0.log"
Nov 27 17:51:05 crc kubenswrapper[4954]: I1127 17:51:05.005357 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-68775c76df-2ppbs_a541738e-915f-413b-9b84-d57553ebc170/keystone-api/0.log"
Nov 27 17:51:05 crc kubenswrapper[4954]: I1127 17:51:05.011379 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_1ba0b816-c965-4474-b923-73f572cdc1ab/kube-state-metrics/0.log"
Nov 27 17:51:05 crc kubenswrapper[4954]: I1127 17:51:05.290231 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-flwcw_6d34dbe8-0864-4b92-bd50-5bdd57209a74/libvirt-edpm-deployment-openstack-edpm-ipam/0.log"
Nov 27 17:51:05 crc kubenswrapper[4954]: I1127 17:51:05.781416 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-f6cfb75df-7gbdb_3c0fe668-ab8d-4bad-acdd-da6d230de548/neutron-api/0.log"
Nov 27 17:51:05 crc kubenswrapper[4954]: I1127 17:51:05.825450 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-f6cfb75df-7gbdb_3c0fe668-ab8d-4bad-acdd-da6d230de548/neutron-httpd/0.log"
Nov 27 17:51:05 crc kubenswrapper[4954]: I1127 17:51:05.923552 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-qj6dp_5ea501ba-5c0c-4392-a64b-695c832dbb89/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log"
Nov 27 17:51:06 crc kubenswrapper[4954]: I1127 17:51:06.530799 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_6662243a-d2bd-4571-8e27-6b923a367942/nova-api-log/0.log"
Nov 27 17:51:06 crc kubenswrapper[4954]: I1127 17:51:06.534500 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_b181e3d6-4f0e-40f1-ac14-96bcbb17622a/nova-cell0-conductor-conductor/0.log"
Nov 27 17:51:06 crc kubenswrapper[4954]: I1127 17:51:06.978594 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_025a86e8-034b-4eef-8f20-14141598f0b4/nova-cell1-conductor-conductor/0.log"
Nov 27 17:51:06 crc kubenswrapper[4954]: I1127 17:51:06.987769 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_6662243a-d2bd-4571-8e27-6b923a367942/nova-api-api/0.log"
Nov 27 17:51:07 crc kubenswrapper[4954]: I1127 17:51:07.001454 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_a8c31305-69a0-477a-958f-d91daa9fe501/nova-cell1-novncproxy-novncproxy/0.log"
Nov 27 17:51:07 crc kubenswrapper[4954]: I1127 17:51:07.273854 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-hq64s_7ab77d00-245a-41d2-a223-1caff56f23da/nova-edpm-deployment-openstack-edpm-ipam/0.log"
Nov 27 17:51:07 crc kubenswrapper[4954]: I1127 17:51:07.328929 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_51632054-40fc-42a7-b633-e1e35143689f/nova-metadata-log/0.log"
Nov 27 17:51:07 crc kubenswrapper[4954]: I1127 17:51:07.705990 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_99c33ed6-9c2c-4eb0-be67-68c19d5479a7/nova-scheduler-scheduler/0.log"
Nov 27 17:51:07 crc kubenswrapper[4954]: I1127 17:51:07.818661 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac/mysql-bootstrap/0.log"
Nov 27 17:51:07 crc kubenswrapper[4954]: I1127 17:51:07.973503 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac/mysql-bootstrap/0.log"
Nov 27 17:51:07 crc kubenswrapper[4954]: I1127 17:51:07.977650 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_3a5f0d2c-eb7d-4fd5-abea-147c8c5ec8ac/galera/0.log"
Nov 27 17:51:08 crc kubenswrapper[4954]: I1127 17:51:08.229028 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_591d8033-08c2-4048-b24e-34508babfbad/mysql-bootstrap/0.log"
Nov 27 17:51:08 crc kubenswrapper[4954]: I1127 17:51:08.384268 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_591d8033-08c2-4048-b24e-34508babfbad/galera/0.log"
Nov 27 17:51:08 crc kubenswrapper[4954]: I1127 17:51:08.458059 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_591d8033-08c2-4048-b24e-34508babfbad/mysql-bootstrap/0.log"
Nov 27 17:51:08 crc kubenswrapper[4954]: I1127 17:51:08.566337 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_871d6a1f-a817-45c5-a3f5-3f0e47ef9bf3/openstackclient/0.log"
Nov 27 17:51:08 crc kubenswrapper[4954]: I1127 17:51:08.685345 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-wnm29_49863f24-1603-49e2-835c-31ced01d9f7f/openstack-network-exporter/0.log"
Nov 27 17:51:08 crc kubenswrapper[4954]: I1127 17:51:08.805498 4954 log.go:25]
"Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_51632054-40fc-42a7-b633-e1e35143689f/nova-metadata-metadata/0.log" Nov 27 17:51:09 crc kubenswrapper[4954]: I1127 17:51:09.101072 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-btgpk_abad518f-43af-457b-add5-c0291513ad71/ovsdb-server-init/0.log" Nov 27 17:51:09 crc kubenswrapper[4954]: I1127 17:51:09.271876 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-btgpk_abad518f-43af-457b-add5-c0291513ad71/ovsdb-server-init/0.log" Nov 27 17:51:09 crc kubenswrapper[4954]: I1127 17:51:09.338245 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-btgpk_abad518f-43af-457b-add5-c0291513ad71/ovsdb-server/0.log" Nov 27 17:51:09 crc kubenswrapper[4954]: I1127 17:51:09.341189 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-btgpk_abad518f-43af-457b-add5-c0291513ad71/ovs-vswitchd/0.log" Nov 27 17:51:09 crc kubenswrapper[4954]: I1127 17:51:09.515263 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-s7sc8_2a98905f-a2dd-4eb2-9a4f-437eb3626871/ovn-controller/0.log" Nov 27 17:51:09 crc kubenswrapper[4954]: I1127 17:51:09.583791 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-wzf94_3f2166e4-73a3-4c61-ae1b-2aeb55e4eddc/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:51:09 crc kubenswrapper[4954]: I1127 17:51:09.735875 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_d11a38a9-30c1-44d2-81ca-965f0dfbde96/openstack-network-exporter/0.log" Nov 27 17:51:09 crc kubenswrapper[4954]: I1127 17:51:09.770177 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_d11a38a9-30c1-44d2-81ca-965f0dfbde96/ovn-northd/0.log" Nov 27 17:51:09 crc kubenswrapper[4954]: I1127 17:51:09.950604 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_585214bf-1a7b-426d-b1a6-d26e69e0116f/ovsdbserver-nb/0.log" Nov 27 17:51:09 crc kubenswrapper[4954]: I1127 17:51:09.971988 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_585214bf-1a7b-426d-b1a6-d26e69e0116f/openstack-network-exporter/0.log" Nov 27 17:51:10 crc kubenswrapper[4954]: I1127 17:51:10.144264 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_6f8f5ac1-9978-4d8a-b12d-f902e9cb316c/ovsdbserver-sb/0.log" Nov 27 17:51:10 crc kubenswrapper[4954]: I1127 17:51:10.245084 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_6f8f5ac1-9978-4d8a-b12d-f902e9cb316c/openstack-network-exporter/0.log" Nov 27 17:51:10 crc kubenswrapper[4954]: I1127 17:51:10.393528 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-d8d4694bd-z9zk4_a4617263-6b9f-4f0c-af69-9d589143eb12/placement-api/0.log" Nov 27 17:51:10 crc kubenswrapper[4954]: I1127 17:51:10.450895 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-d8d4694bd-z9zk4_a4617263-6b9f-4f0c-af69-9d589143eb12/placement-log/0.log" Nov 27 17:51:10 crc kubenswrapper[4954]: I1127 17:51:10.552754 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f048cd15-3583-44fd-a9ca-1288e89f29b3/setup-container/0.log" Nov 27 17:51:10 crc kubenswrapper[4954]: I1127 17:51:10.752368 4954 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f048cd15-3583-44fd-a9ca-1288e89f29b3/setup-container/0.log" Nov 27 17:51:10 crc kubenswrapper[4954]: I1127 17:51:10.752650 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7e3c0607-0f08-4188-9995-c0a2a253fdc5/setup-container/0.log" Nov 27 17:51:10 crc kubenswrapper[4954]: I1127 17:51:10.837721 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f048cd15-3583-44fd-a9ca-1288e89f29b3/rabbitmq/0.log" Nov 27 17:51:11 crc kubenswrapper[4954]: I1127 17:51:11.038165 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7e3c0607-0f08-4188-9995-c0a2a253fdc5/setup-container/0.log" Nov 27 17:51:11 crc kubenswrapper[4954]: I1127 17:51:11.048871 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7e3c0607-0f08-4188-9995-c0a2a253fdc5/rabbitmq/0.log" Nov 27 17:51:11 crc kubenswrapper[4954]: I1127 17:51:11.111111 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-gpwcf_6e2def23-1765-4015-b698-c2b8516a6f18/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:51:11 crc kubenswrapper[4954]: I1127 17:51:11.301679 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-qvnb8_39bece64-6033-4ca3-846d-6718f68f1f6d/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:51:11 crc kubenswrapper[4954]: I1127 17:51:11.440257 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-jwhw6_d294865e-7999-4e81-818f-3a5db24b7f01/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:51:11 crc kubenswrapper[4954]: I1127 17:51:11.512050 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-6l884_cfb3cf23-1ad0-47ac-af59-8b8ae7e79678/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:51:11 crc kubenswrapper[4954]: I1127 17:51:11.669078 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-xgjv5_694335d5-113f-4c2b-ab58-22fc7b866e46/ssh-known-hosts-edpm-deployment/0.log" Nov 27 17:51:11 crc kubenswrapper[4954]: I1127 17:51:11.892544 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-85cf58799f-l72lc_9de053dc-d10c-4999-9019-f7221fb9e237/proxy-server/0.log" Nov 27 17:51:12 crc kubenswrapper[4954]: I1127 17:51:12.014271 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-85cf58799f-l72lc_9de053dc-d10c-4999-9019-f7221fb9e237/proxy-httpd/0.log" Nov 27 17:51:12 crc kubenswrapper[4954]: I1127 17:51:12.021761 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-cbrqw_794c6bdd-2ec7-458f-99ed-23383a740479/swift-ring-rebalance/0.log" Nov 27 17:51:12 crc kubenswrapper[4954]: I1127 17:51:12.176140 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cff965fb-87ef-40a5-9dff-7d10d74cc09c/account-auditor/0.log" Nov 27 17:51:12 crc kubenswrapper[4954]: I1127 17:51:12.211047 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cff965fb-87ef-40a5-9dff-7d10d74cc09c/account-reaper/0.log" Nov 27 17:51:12 crc kubenswrapper[4954]: I1127 17:51:12.283739 4954 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_swift-storage-0_cff965fb-87ef-40a5-9dff-7d10d74cc09c/account-replicator/0.log" Nov 27 17:51:12 crc kubenswrapper[4954]: I1127 17:51:12.381198 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cff965fb-87ef-40a5-9dff-7d10d74cc09c/account-server/0.log" Nov 27 17:51:12 crc kubenswrapper[4954]: I1127 17:51:12.400783 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cff965fb-87ef-40a5-9dff-7d10d74cc09c/container-auditor/0.log" Nov 27 17:51:12 crc kubenswrapper[4954]: I1127 17:51:12.458552 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cff965fb-87ef-40a5-9dff-7d10d74cc09c/container-replicator/0.log" Nov 27 17:51:13 crc kubenswrapper[4954]: I1127 17:51:13.069873 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cff965fb-87ef-40a5-9dff-7d10d74cc09c/container-server/0.log" Nov 27 17:51:13 crc kubenswrapper[4954]: I1127 17:51:13.076061 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cff965fb-87ef-40a5-9dff-7d10d74cc09c/container-updater/0.log" Nov 27 17:51:13 crc kubenswrapper[4954]: I1127 17:51:13.086757 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cff965fb-87ef-40a5-9dff-7d10d74cc09c/object-expirer/0.log" Nov 27 17:51:13 crc kubenswrapper[4954]: I1127 17:51:13.088040 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cff965fb-87ef-40a5-9dff-7d10d74cc09c/object-auditor/0.log" Nov 27 17:51:13 crc kubenswrapper[4954]: I1127 17:51:13.249324 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cff965fb-87ef-40a5-9dff-7d10d74cc09c/object-server/0.log" Nov 27 17:51:13 crc kubenswrapper[4954]: I1127 17:51:13.310044 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cff965fb-87ef-40a5-9dff-7d10d74cc09c/object-updater/0.log" Nov 27 17:51:13 crc kubenswrapper[4954]: I1127 17:51:13.345504 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cff965fb-87ef-40a5-9dff-7d10d74cc09c/object-replicator/0.log" Nov 27 17:51:13 crc kubenswrapper[4954]: I1127 17:51:13.370186 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cff965fb-87ef-40a5-9dff-7d10d74cc09c/rsync/0.log" Nov 27 17:51:13 crc kubenswrapper[4954]: I1127 17:51:13.531898 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cff965fb-87ef-40a5-9dff-7d10d74cc09c/swift-recon-cron/0.log" Nov 27 17:51:13 crc kubenswrapper[4954]: I1127 17:51:13.594890 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-r476v_200fb5dd-f5ad-4f82-8a9c-e8e378075448/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:51:13 crc kubenswrapper[4954]: I1127 17:51:13.807363 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_ae22fda2-42ce-4b9e-9daf-00e886b8449b/tempest-tests-tempest-tests-runner/0.log" Nov 27 17:51:13 crc kubenswrapper[4954]: I1127 17:51:13.824087 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_9305c47b-ee95-423d-b4dc-f8a5fbe9cd6c/test-operator-logs-container/0.log" Nov 27 17:51:14 crc kubenswrapper[4954]: I1127 17:51:14.042419 4954 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-b66qm_655b8641-7aaf-4f45-b8a0-b23fbbfa3abd/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:51:24 crc kubenswrapper[4954]: I1127 17:51:24.429810 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_808630a2-42dd-48c9-a004-749515cb771b/memcached/0.log" Nov 27 17:51:40 crc kubenswrapper[4954]: I1127 17:51:40.956257 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7b3267e84ac43849283543a2f97dd6a38e4585c72e47c859f911726b07krfj8_4f0a14af-754e-4601-aadc-77e1a310c088/util/0.log" Nov 27 17:51:41 crc kubenswrapper[4954]: I1127 17:51:41.167804 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7b3267e84ac43849283543a2f97dd6a38e4585c72e47c859f911726b07krfj8_4f0a14af-754e-4601-aadc-77e1a310c088/util/0.log" Nov 27 17:51:41 crc kubenswrapper[4954]: I1127 17:51:41.207697 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7b3267e84ac43849283543a2f97dd6a38e4585c72e47c859f911726b07krfj8_4f0a14af-754e-4601-aadc-77e1a310c088/pull/0.log" Nov 27 17:51:41 crc kubenswrapper[4954]: I1127 17:51:41.225069 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7b3267e84ac43849283543a2f97dd6a38e4585c72e47c859f911726b07krfj8_4f0a14af-754e-4601-aadc-77e1a310c088/pull/0.log" Nov 27 17:51:41 crc kubenswrapper[4954]: I1127 17:51:41.426917 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7b3267e84ac43849283543a2f97dd6a38e4585c72e47c859f911726b07krfj8_4f0a14af-754e-4601-aadc-77e1a310c088/util/0.log" Nov 27 17:51:41 crc kubenswrapper[4954]: I1127 17:51:41.427679 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7b3267e84ac43849283543a2f97dd6a38e4585c72e47c859f911726b07krfj8_4f0a14af-754e-4601-aadc-77e1a310c088/pull/0.log" Nov 27 17:51:41 crc kubenswrapper[4954]: I1127 17:51:41.434421 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7b3267e84ac43849283543a2f97dd6a38e4585c72e47c859f911726b07krfj8_4f0a14af-754e-4601-aadc-77e1a310c088/extract/0.log" Nov 27 17:51:41 crc kubenswrapper[4954]: I1127 17:51:41.858908 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7b64f4fb85-nz28b_50ec526e-d6db-45fa-8b99-bd795b4c3690/kube-rbac-proxy/0.log" Nov 27 17:51:41 crc kubenswrapper[4954]: I1127 17:51:41.860207 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6b7f75547b-4rg5t_11ca1308-8c7a-4a3d-a283-2533abc54c25/kube-rbac-proxy/0.log" Nov 27 17:51:41 crc kubenswrapper[4954]: I1127 17:51:41.971043 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7b64f4fb85-nz28b_50ec526e-d6db-45fa-8b99-bd795b4c3690/manager/0.log" Nov 27 17:51:42 crc kubenswrapper[4954]: I1127 17:51:42.077854 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6b7f75547b-4rg5t_11ca1308-8c7a-4a3d-a283-2533abc54c25/manager/0.log" Nov 27 17:51:42 crc kubenswrapper[4954]: I1127 17:51:42.782334 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-955677c94-dzjch_a09ff3fd-b10f-421c-a3a5-aa7dc4dcff95/kube-rbac-proxy/0.log" Nov 27 17:51:42 crc kubenswrapper[4954]: I1127 
17:51:42.889895 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-955677c94-dzjch_a09ff3fd-b10f-421c-a3a5-aa7dc4dcff95/manager/0.log" Nov 27 17:51:43 crc kubenswrapper[4954]: I1127 17:51:43.067033 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-589cbd6b5b-8ghg2_7c8dd8cc-7be7-41f9-ac93-139dc9e83274/manager/0.log" Nov 27 17:51:43 crc kubenswrapper[4954]: I1127 17:51:43.076049 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-589cbd6b5b-8ghg2_7c8dd8cc-7be7-41f9-ac93-139dc9e83274/kube-rbac-proxy/0.log" Nov 27 17:51:43 crc kubenswrapper[4954]: I1127 17:51:43.293870 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5b77f656f-zlt7m_5884eab6-e3c0-45de-b93d-73392533b780/kube-rbac-proxy/0.log" Nov 27 17:51:43 crc kubenswrapper[4954]: I1127 17:51:43.318621 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5b77f656f-zlt7m_5884eab6-e3c0-45de-b93d-73392533b780/manager/0.log" Nov 27 17:51:43 crc kubenswrapper[4954]: I1127 17:51:43.418450 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5d494799bf-nnj6l_c7c7b69c-1d63-4d4b-ac0b-ad2be204cf8a/kube-rbac-proxy/0.log" Nov 27 17:51:43 crc kubenswrapper[4954]: I1127 17:51:43.530525 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5d494799bf-nnj6l_c7c7b69c-1d63-4d4b-ac0b-ad2be204cf8a/manager/0.log" Nov 27 17:51:43 crc kubenswrapper[4954]: I1127 17:51:43.579089 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-4vpsc_736ef0f4-e471-4acd-8569-2a6d6d260f67/kube-rbac-proxy/0.log" Nov 27 17:51:43 crc kubenswrapper[4954]: I1127 17:51:43.703325 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-4vpsc_736ef0f4-e471-4acd-8569-2a6d6d260f67/manager/0.log" Nov 27 17:51:43 crc kubenswrapper[4954]: I1127 17:51:43.742775 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-67cb4dc6d4-42dmk_9366c02a-e022-47e4-86c2-35d1e9a54cf4/kube-rbac-proxy/0.log" Nov 27 17:51:43 crc kubenswrapper[4954]: I1127 17:51:43.851684 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-67cb4dc6d4-42dmk_9366c02a-e022-47e4-86c2-35d1e9a54cf4/manager/0.log" Nov 27 17:51:43 crc kubenswrapper[4954]: I1127 17:51:43.900848 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7b4567c7cf-bw2j9_cc869191-7d3d-4192-bf48-a48625bff6ff/kube-rbac-proxy/0.log" Nov 27 17:51:44 crc kubenswrapper[4954]: I1127 17:51:44.092065 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7b4567c7cf-bw2j9_cc869191-7d3d-4192-bf48-a48625bff6ff/manager/0.log" Nov 27 17:51:44 crc kubenswrapper[4954]: I1127 17:51:44.164091 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5d499bf58b-2jpwm_6dbcc715-b375-4776-87ff-4c5ecad80975/kube-rbac-proxy/0.log" Nov 27 17:51:44 crc 
kubenswrapper[4954]: I1127 17:51:44.302887 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5d499bf58b-2jpwm_6dbcc715-b375-4776-87ff-4c5ecad80975/manager/0.log" Nov 27 17:51:44 crc kubenswrapper[4954]: I1127 17:51:44.384105 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-66f4dd4bc7-5vqr2_ff3108ae-4629-448b-80d3-949e631c60d8/manager/0.log" Nov 27 17:51:44 crc kubenswrapper[4954]: I1127 17:51:44.387002 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-66f4dd4bc7-5vqr2_ff3108ae-4629-448b-80d3-949e631c60d8/kube-rbac-proxy/0.log" Nov 27 17:51:44 crc kubenswrapper[4954]: I1127 17:51:44.567805 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6fdcddb789-4g8kb_56f35029-dbcb-437a-94ed-3eac63c5145c/kube-rbac-proxy/0.log" Nov 27 17:51:44 crc kubenswrapper[4954]: I1127 17:51:44.700900 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6fdcddb789-4g8kb_56f35029-dbcb-437a-94ed-3eac63c5145c/manager/0.log" Nov 27 17:51:44 crc kubenswrapper[4954]: I1127 17:51:44.716440 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79556f57fc-p55vw_b4fb4c16-8870-494e-a075-ee70d251da46/kube-rbac-proxy/0.log" Nov 27 17:51:44 crc kubenswrapper[4954]: I1127 17:51:44.877874 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79556f57fc-p55vw_b4fb4c16-8870-494e-a075-ee70d251da46/manager/0.log" Nov 27 17:51:44 crc kubenswrapper[4954]: I1127 17:51:44.973237 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-64cdc6ff96-9pwxb_770db406-d44c-490f-8409-f5b3e8f66145/manager/0.log" Nov 27 17:51:44 crc kubenswrapper[4954]: I1127 17:51:44.982033 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-64cdc6ff96-9pwxb_770db406-d44c-490f-8409-f5b3e8f66145/kube-rbac-proxy/0.log" Nov 27 17:51:45 crc kubenswrapper[4954]: I1127 17:51:45.084074 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5fcdb54b6bqfwjb_cbeef148-5a6f-4738-83f0-eae93d81bae3/kube-rbac-proxy/0.log" Nov 27 17:51:45 crc kubenswrapper[4954]: I1127 17:51:45.163978 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5fcdb54b6bqfwjb_cbeef148-5a6f-4738-83f0-eae93d81bae3/manager/0.log" Nov 27 17:51:45 crc kubenswrapper[4954]: I1127 17:51:45.458914 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-xhxgn_e300d0a8-a678-4065-bbcd-a886791e9e1a/registry-server/0.log" Nov 27 17:51:45 crc kubenswrapper[4954]: I1127 17:51:45.621326 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-757f5977c4-9sxch_67df4fc6-9215-4441-955b-d7d740c5db1e/operator/0.log" Nov 27 17:51:45 crc kubenswrapper[4954]: I1127 17:51:45.768161 4954 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-56897c768d-mln9c_7dcd119b-9cb2-48ab-ac2f-2f0b10d5b2f0/kube-rbac-proxy/0.log" Nov 27 17:51:45 crc kubenswrapper[4954]: I1127 17:51:45.769847 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-56897c768d-mln9c_7dcd119b-9cb2-48ab-ac2f-2f0b10d5b2f0/manager/0.log" Nov 27 17:51:45 crc kubenswrapper[4954]: I1127 17:51:45.880873 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-57988cc5b5-nv8bz_73b53349-7e1d-499f-918e-e25598787e70/kube-rbac-proxy/0.log" Nov 27 17:51:46 crc kubenswrapper[4954]: I1127 17:51:46.023755 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-57988cc5b5-nv8bz_73b53349-7e1d-499f-918e-e25598787e70/manager/0.log" Nov 27 17:51:46 crc kubenswrapper[4954]: I1127 17:51:46.194765 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-xvk89_8fad5f5d-c6a2-497f-8524-1ae501d6a444/operator/0.log" Nov 27 17:51:46 crc kubenswrapper[4954]: I1127 17:51:46.430469 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-556d4f4767-6wqxx_7eefae7c-fef6-47b3-8f89-4856b6ae1980/manager/0.log" Nov 27 17:51:46 crc kubenswrapper[4954]: I1127 17:51:46.504743 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-d77b94747-mmr72_376db7a5-650f-4327-8e03-2f2be98969a0/kube-rbac-proxy/0.log" Nov 27 17:51:46 crc kubenswrapper[4954]: I1127 17:51:46.614848 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-wr8t4_523e3a36-bc9e-4698-af7d-e7ecd3b7a740/kube-rbac-proxy/0.log" Nov 27 17:51:46 crc kubenswrapper[4954]: I1127 17:51:46.640357 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-d77b94747-mmr72_376db7a5-650f-4327-8e03-2f2be98969a0/manager/0.log" Nov 27 17:51:46 crc kubenswrapper[4954]: I1127 17:51:46.816701 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-wr8t4_523e3a36-bc9e-4698-af7d-e7ecd3b7a740/manager/0.log" Nov 27 17:51:46 crc kubenswrapper[4954]: I1127 17:51:46.881800 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd6c7f4c8-7dmz6_146450d6-91cc-4600-9712-449fcf5328b2/manager/0.log" Nov 27 17:51:46 crc kubenswrapper[4954]: I1127 17:51:46.902838 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd6c7f4c8-7dmz6_146450d6-91cc-4600-9712-449fcf5328b2/kube-rbac-proxy/0.log" Nov 27 17:51:46 crc kubenswrapper[4954]: I1127 17:51:46.979381 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-656dcb59d4-wg8x7_6a00b9f9-d61f-411d-897d-496d8c8b3501/kube-rbac-proxy/0.log" Nov 27 17:51:47 crc kubenswrapper[4954]: I1127 17:51:47.033133 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-656dcb59d4-wg8x7_6a00b9f9-d61f-411d-897d-496d8c8b3501/manager/0.log" Nov 27 17:52:05 crc kubenswrapper[4954]: I1127 17:52:05.633080 4954 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-zmv7j_4f13cd59-b0f9-4562-a20b-d3d8f4bca5bb/control-plane-machine-set-operator/0.log" Nov 27 17:52:05 crc kubenswrapper[4954]: I1127 17:52:05.787287 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-h48pg_daf9759f-1f7d-4613-b734-a39f4552222e/kube-rbac-proxy/0.log" Nov 27 17:52:05 crc kubenswrapper[4954]: I1127 17:52:05.824594 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-h48pg_daf9759f-1f7d-4613-b734-a39f4552222e/machine-api-operator/0.log" Nov 27 17:52:17 crc kubenswrapper[4954]: I1127 17:52:17.371090 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-2t966_51fb16a6-3c9e-4cca-a603-8b71f0b91ee1/cert-manager-controller/0.log" Nov 27 17:52:17 crc kubenswrapper[4954]: I1127 17:52:17.500303 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-96dn8_f6f24261-8d7e-454f-8d20-2a35f12114c6/cert-manager-cainjector/0.log" Nov 27 17:52:17 crc kubenswrapper[4954]: I1127 17:52:17.562760 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-ghjsr_04065317-2688-429e-8362-970a4f083d14/cert-manager-webhook/0.log" Nov 27 17:52:29 crc kubenswrapper[4954]: I1127 17:52:29.796442 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-g5fhs_88437c38-051a-4331-bfd9-1b5356e88818/nmstate-console-plugin/0.log" Nov 27 17:52:29 crc kubenswrapper[4954]: I1127 17:52:29.989460 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-m4dwz_80ecd4a6-6bf2-4533-ab69-a5a12b747d81/nmstate-handler/0.log" Nov 27 17:52:29 crc kubenswrapper[4954]: I1127 17:52:29.996169 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-bkn4s_e9b96f60-bef6-430b-8f44-d5e602d140ee/kube-rbac-proxy/0.log" Nov 27 17:52:30 crc kubenswrapper[4954]: I1127 17:52:30.087193 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-bkn4s_e9b96f60-bef6-430b-8f44-d5e602d140ee/nmstate-metrics/0.log" Nov 27 17:52:30 crc kubenswrapper[4954]: I1127 17:52:30.393449 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-qx7sx_742e2266-3aa1-4c59-958e-8200fea0b45c/nmstate-operator/0.log" Nov 27 17:52:30 crc kubenswrapper[4954]: I1127 17:52:30.506479 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-89lrn_391ad61e-fdf4-41bf-b3eb-a8950896debb/nmstate-webhook/0.log" Nov 27 17:52:45 crc kubenswrapper[4954]: I1127 17:52:45.284127 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-l6mbj_33df22a6-6a0f-445c-8a77-ad9cfb09d3d4/kube-rbac-proxy/0.log" Nov 27 17:52:45 crc kubenswrapper[4954]: I1127 17:52:45.305269 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-l6mbj_33df22a6-6a0f-445c-8a77-ad9cfb09d3d4/controller/0.log" Nov 27 17:52:45 crc kubenswrapper[4954]: I1127 17:52:45.421247 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9psn7_7cde0cd2-0d4c-411e-b857-8488be2e2f0f/cp-frr-files/0.log" Nov 27 17:52:45 crc kubenswrapper[4954]: 
I1127 17:52:45.602640 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9psn7_7cde0cd2-0d4c-411e-b857-8488be2e2f0f/cp-metrics/0.log" Nov 27 17:52:45 crc kubenswrapper[4954]: I1127 17:52:45.604933 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9psn7_7cde0cd2-0d4c-411e-b857-8488be2e2f0f/cp-frr-files/0.log" Nov 27 17:52:45 crc kubenswrapper[4954]: I1127 17:52:45.608212 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9psn7_7cde0cd2-0d4c-411e-b857-8488be2e2f0f/cp-reloader/0.log" Nov 27 17:52:45 crc kubenswrapper[4954]: I1127 17:52:45.612665 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9psn7_7cde0cd2-0d4c-411e-b857-8488be2e2f0f/cp-reloader/0.log" Nov 27 17:52:45 crc kubenswrapper[4954]: I1127 17:52:45.814037 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9psn7_7cde0cd2-0d4c-411e-b857-8488be2e2f0f/cp-reloader/0.log" Nov 27 17:52:45 crc kubenswrapper[4954]: I1127 17:52:45.815135 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9psn7_7cde0cd2-0d4c-411e-b857-8488be2e2f0f/cp-metrics/0.log" Nov 27 17:52:45 crc kubenswrapper[4954]: I1127 17:52:45.816140 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9psn7_7cde0cd2-0d4c-411e-b857-8488be2e2f0f/cp-frr-files/0.log" Nov 27 17:52:45 crc kubenswrapper[4954]: I1127 17:52:45.856360 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9psn7_7cde0cd2-0d4c-411e-b857-8488be2e2f0f/cp-metrics/0.log" Nov 27 17:52:46 crc kubenswrapper[4954]: I1127 17:52:46.006613 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9psn7_7cde0cd2-0d4c-411e-b857-8488be2e2f0f/cp-frr-files/0.log" Nov 27 17:52:46 crc kubenswrapper[4954]: I1127 17:52:46.026749 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9psn7_7cde0cd2-0d4c-411e-b857-8488be2e2f0f/cp-reloader/0.log" Nov 27 17:52:46 crc kubenswrapper[4954]: I1127 17:52:46.064603 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9psn7_7cde0cd2-0d4c-411e-b857-8488be2e2f0f/cp-metrics/0.log" Nov 27 17:52:46 crc kubenswrapper[4954]: I1127 17:52:46.067993 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9psn7_7cde0cd2-0d4c-411e-b857-8488be2e2f0f/controller/0.log" Nov 27 17:52:46 crc kubenswrapper[4954]: I1127 17:52:46.229782 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9psn7_7cde0cd2-0d4c-411e-b857-8488be2e2f0f/frr-metrics/0.log" Nov 27 17:52:46 crc kubenswrapper[4954]: I1127 17:52:46.274962 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9psn7_7cde0cd2-0d4c-411e-b857-8488be2e2f0f/kube-rbac-proxy-frr/0.log" Nov 27 17:52:46 crc kubenswrapper[4954]: I1127 17:52:46.298154 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9psn7_7cde0cd2-0d4c-411e-b857-8488be2e2f0f/kube-rbac-proxy/0.log" Nov 27 17:52:46 crc kubenswrapper[4954]: I1127 17:52:46.496178 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9psn7_7cde0cd2-0d4c-411e-b857-8488be2e2f0f/reloader/0.log" Nov 27 17:52:46 crc kubenswrapper[4954]: I1127 17:52:46.575150 4954 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-f6v24_6792e473-15c3-405b-8c32-007e421b40c6/frr-k8s-webhook-server/0.log" Nov 27 17:52:46 crc kubenswrapper[4954]: I1127 17:52:46.793866 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5d9ff7464f-4f4jv_52ee24fe-968b-440d-8884-5772e253c8b4/manager/0.log" Nov 27 17:52:46 crc kubenswrapper[4954]: I1127 17:52:46.933705 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6c64f4dc9b-sq87v_0b1812ac-de14-42bf-acbf-d6a68650bb93/webhook-server/0.log" Nov 27 17:52:47 crc kubenswrapper[4954]: I1127 17:52:47.067431 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-ql5zn_008cad91-d45f-4942-9e82-239acf3fb8ed/kube-rbac-proxy/0.log" Nov 27 17:52:47 crc kubenswrapper[4954]: I1127 17:52:47.585009 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-ql5zn_008cad91-d45f-4942-9e82-239acf3fb8ed/speaker/0.log" Nov 27 17:52:47 crc kubenswrapper[4954]: I1127 17:52:47.842754 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9psn7_7cde0cd2-0d4c-411e-b857-8488be2e2f0f/frr/0.log" Nov 27 17:52:53 crc kubenswrapper[4954]: I1127 17:52:53.687711 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 17:52:53 crc kubenswrapper[4954]: I1127 17:52:53.688634 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 17:53:00 crc kubenswrapper[4954]: I1127 17:53:00.856229 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwh9gp_bc86b0e3-7ca2-40a1-b559-e74733db90f0/util/0.log" Nov 27 17:53:01 crc kubenswrapper[4954]: I1127 17:53:01.035079 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwh9gp_bc86b0e3-7ca2-40a1-b559-e74733db90f0/util/0.log" Nov 27 17:53:01 crc kubenswrapper[4954]: I1127 17:53:01.435001 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwh9gp_bc86b0e3-7ca2-40a1-b559-e74733db90f0/pull/0.log" Nov 27 17:53:01 crc kubenswrapper[4954]: I1127 17:53:01.444841 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwh9gp_bc86b0e3-7ca2-40a1-b559-e74733db90f0/pull/0.log" Nov 27 17:53:01 crc kubenswrapper[4954]: I1127 17:53:01.662794 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwh9gp_bc86b0e3-7ca2-40a1-b559-e74733db90f0/util/0.log" Nov 27 17:53:01 crc kubenswrapper[4954]: I1127 17:53:01.666977 4954 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwh9gp_bc86b0e3-7ca2-40a1-b559-e74733db90f0/extract/0.log" Nov 27 17:53:01 crc kubenswrapper[4954]: I1127 17:53:01.712699 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwh9gp_bc86b0e3-7ca2-40a1-b559-e74733db90f0/pull/0.log" Nov 27 17:53:01 crc kubenswrapper[4954]: I1127 17:53:01.888423 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qtpvj_76711bd9-a588-4492-9d26-0d80376444db/util/0.log" Nov 27 17:53:02 crc kubenswrapper[4954]: I1127 17:53:02.079750 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qtpvj_76711bd9-a588-4492-9d26-0d80376444db/pull/0.log" Nov 27 17:53:02 crc kubenswrapper[4954]: I1127 17:53:02.090062 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qtpvj_76711bd9-a588-4492-9d26-0d80376444db/util/0.log" Nov 27 17:53:02 crc kubenswrapper[4954]: I1127 17:53:02.135127 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qtpvj_76711bd9-a588-4492-9d26-0d80376444db/pull/0.log" Nov 27 17:53:02 crc kubenswrapper[4954]: I1127 17:53:02.296963 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qtpvj_76711bd9-a588-4492-9d26-0d80376444db/util/0.log" Nov 27 17:53:02 crc kubenswrapper[4954]: I1127 17:53:02.404197 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qtpvj_76711bd9-a588-4492-9d26-0d80376444db/extract/0.log" Nov 27 17:53:02 crc kubenswrapper[4954]: I1127 17:53:02.540121 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qtpvj_76711bd9-a588-4492-9d26-0d80376444db/pull/0.log" Nov 27 17:53:02 crc kubenswrapper[4954]: I1127 17:53:02.651623 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9vxhf_a06676d3-037c-4529-926c-0624a5e647ee/extract-utilities/0.log" Nov 27 17:53:03 crc kubenswrapper[4954]: I1127 17:53:03.216373 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9vxhf_a06676d3-037c-4529-926c-0624a5e647ee/extract-utilities/0.log" Nov 27 17:53:03 crc kubenswrapper[4954]: I1127 17:53:03.240038 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9vxhf_a06676d3-037c-4529-926c-0624a5e647ee/extract-content/0.log" Nov 27 17:53:03 crc kubenswrapper[4954]: I1127 17:53:03.246539 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9vxhf_a06676d3-037c-4529-926c-0624a5e647ee/extract-content/0.log" Nov 27 17:53:03 crc kubenswrapper[4954]: I1127 17:53:03.434329 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9vxhf_a06676d3-037c-4529-926c-0624a5e647ee/extract-utilities/0.log" Nov 27 17:53:03 crc kubenswrapper[4954]: I1127 17:53:03.503934 4954 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-9vxhf_a06676d3-037c-4529-926c-0624a5e647ee/extract-content/0.log" Nov 27 17:53:03 crc kubenswrapper[4954]: I1127 17:53:03.750426 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xj2hb_56ec19b6-189a-4163-ae87-1c95809ad7d3/extract-utilities/0.log" Nov 27 17:53:03 crc kubenswrapper[4954]: I1127 17:53:03.923325 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9vxhf_a06676d3-037c-4529-926c-0624a5e647ee/registry-server/0.log" Nov 27 17:53:03 crc kubenswrapper[4954]: I1127 17:53:03.961328 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xj2hb_56ec19b6-189a-4163-ae87-1c95809ad7d3/extract-utilities/0.log" Nov 27 17:53:04 crc kubenswrapper[4954]: I1127 17:53:04.033991 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xj2hb_56ec19b6-189a-4163-ae87-1c95809ad7d3/extract-content/0.log" Nov 27 17:53:04 crc kubenswrapper[4954]: I1127 17:53:04.034169 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xj2hb_56ec19b6-189a-4163-ae87-1c95809ad7d3/extract-content/0.log" Nov 27 17:53:04 crc kubenswrapper[4954]: I1127 17:53:04.258225 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xj2hb_56ec19b6-189a-4163-ae87-1c95809ad7d3/extract-content/0.log" Nov 27 17:53:04 crc kubenswrapper[4954]: I1127 17:53:04.292350 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xj2hb_56ec19b6-189a-4163-ae87-1c95809ad7d3/extract-utilities/0.log" Nov 27 17:53:04 crc kubenswrapper[4954]: I1127 17:53:04.609489 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-txfqr_8215930a-947b-45d7-9c4e-9d867d3f234e/marketplace-operator/0.log" Nov 27 17:53:04 crc kubenswrapper[4954]: I1127 17:53:04.635820 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r5pxl_84522a03-6ce9-4c9d-b5ee-786ec39f6555/extract-utilities/0.log" Nov 27 17:53:04 crc kubenswrapper[4954]: I1127 17:53:04.831109 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r5pxl_84522a03-6ce9-4c9d-b5ee-786ec39f6555/extract-content/0.log" Nov 27 17:53:04 crc kubenswrapper[4954]: I1127 17:53:04.835999 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r5pxl_84522a03-6ce9-4c9d-b5ee-786ec39f6555/extract-utilities/0.log" Nov 27 17:53:04 crc kubenswrapper[4954]: I1127 17:53:04.846275 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xj2hb_56ec19b6-189a-4163-ae87-1c95809ad7d3/registry-server/0.log" Nov 27 17:53:04 crc kubenswrapper[4954]: I1127 17:53:04.910636 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r5pxl_84522a03-6ce9-4c9d-b5ee-786ec39f6555/extract-content/0.log" Nov 27 17:53:05 crc kubenswrapper[4954]: I1127 17:53:05.037183 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r5pxl_84522a03-6ce9-4c9d-b5ee-786ec39f6555/extract-utilities/0.log" Nov 27 17:53:05 crc kubenswrapper[4954]: I1127 17:53:05.075886 4954 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-r5pxl_84522a03-6ce9-4c9d-b5ee-786ec39f6555/extract-content/0.log" Nov 27 17:53:05 crc kubenswrapper[4954]: I1127 17:53:05.165380 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2t8bp_7b582b55-6fc1-4a38-a30e-b192d35acdcc/extract-utilities/0.log" Nov 27 17:53:05 crc kubenswrapper[4954]: I1127 17:53:05.270037 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r5pxl_84522a03-6ce9-4c9d-b5ee-786ec39f6555/registry-server/0.log" Nov 27 17:53:05 crc kubenswrapper[4954]: I1127 17:53:05.394885 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2t8bp_7b582b55-6fc1-4a38-a30e-b192d35acdcc/extract-content/0.log" Nov 27 17:53:05 crc kubenswrapper[4954]: I1127 17:53:05.398953 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2t8bp_7b582b55-6fc1-4a38-a30e-b192d35acdcc/extract-utilities/0.log" Nov 27 17:53:05 crc kubenswrapper[4954]: I1127 17:53:05.437498 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2t8bp_7b582b55-6fc1-4a38-a30e-b192d35acdcc/extract-content/0.log" Nov 27 17:53:05 crc kubenswrapper[4954]: I1127 17:53:05.598192 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2t8bp_7b582b55-6fc1-4a38-a30e-b192d35acdcc/extract-content/0.log" Nov 27 17:53:05 crc kubenswrapper[4954]: I1127 17:53:05.608511 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2t8bp_7b582b55-6fc1-4a38-a30e-b192d35acdcc/extract-utilities/0.log" Nov 27 17:53:06 crc kubenswrapper[4954]: I1127 17:53:06.151690 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2t8bp_7b582b55-6fc1-4a38-a30e-b192d35acdcc/registry-server/0.log" Nov 27 17:53:10 crc kubenswrapper[4954]: I1127 17:53:10.013244 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zq2cc"] Nov 27 17:53:10 crc kubenswrapper[4954]: E1127 17:53:10.014383 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f18514d-4c36-4a78-a05e-3fdf372733b0" containerName="container-00" Nov 27 17:53:10 crc kubenswrapper[4954]: I1127 17:53:10.014402 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f18514d-4c36-4a78-a05e-3fdf372733b0" containerName="container-00" Nov 27 17:53:10 crc kubenswrapper[4954]: I1127 17:53:10.016969 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f18514d-4c36-4a78-a05e-3fdf372733b0" containerName="container-00" Nov 27 17:53:10 crc kubenswrapper[4954]: I1127 17:53:10.018872 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zq2cc" Nov 27 17:53:10 crc kubenswrapper[4954]: I1127 17:53:10.025928 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zq2cc"] Nov 27 17:53:10 crc kubenswrapper[4954]: I1127 17:53:10.144504 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccf02b0b-042c-47be-b4d7-c412ea911235-utilities\") pod \"community-operators-zq2cc\" (UID: \"ccf02b0b-042c-47be-b4d7-c412ea911235\") " pod="openshift-marketplace/community-operators-zq2cc" Nov 27 17:53:10 crc kubenswrapper[4954]: I1127 17:53:10.144712 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqsvc\" (UniqueName: \"kubernetes.io/projected/ccf02b0b-042c-47be-b4d7-c412ea911235-kube-api-access-dqsvc\") pod \"community-operators-zq2cc\" (UID: \"ccf02b0b-042c-47be-b4d7-c412ea911235\") " pod="openshift-marketplace/community-operators-zq2cc" Nov 27 17:53:10 crc kubenswrapper[4954]: I1127 17:53:10.144939 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccf02b0b-042c-47be-b4d7-c412ea911235-catalog-content\") pod \"community-operators-zq2cc\" (UID: \"ccf02b0b-042c-47be-b4d7-c412ea911235\") " pod="openshift-marketplace/community-operators-zq2cc" Nov 27 17:53:10 crc kubenswrapper[4954]: I1127 17:53:10.246292 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccf02b0b-042c-47be-b4d7-c412ea911235-catalog-content\") pod \"community-operators-zq2cc\" (UID: \"ccf02b0b-042c-47be-b4d7-c412ea911235\") " pod="openshift-marketplace/community-operators-zq2cc" Nov 27 17:53:10 crc kubenswrapper[4954]: I1127 17:53:10.246651 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccf02b0b-042c-47be-b4d7-c412ea911235-utilities\") pod \"community-operators-zq2cc\" (UID: \"ccf02b0b-042c-47be-b4d7-c412ea911235\") " pod="openshift-marketplace/community-operators-zq2cc" Nov 27 17:53:10 crc kubenswrapper[4954]: I1127 17:53:10.246723 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqsvc\" (UniqueName: \"kubernetes.io/projected/ccf02b0b-042c-47be-b4d7-c412ea911235-kube-api-access-dqsvc\") pod \"community-operators-zq2cc\" (UID: \"ccf02b0b-042c-47be-b4d7-c412ea911235\") " pod="openshift-marketplace/community-operators-zq2cc" Nov 27 17:53:10 crc kubenswrapper[4954]: I1127 17:53:10.246954 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccf02b0b-042c-47be-b4d7-c412ea911235-catalog-content\") pod \"community-operators-zq2cc\" (UID: \"ccf02b0b-042c-47be-b4d7-c412ea911235\") " pod="openshift-marketplace/community-operators-zq2cc" Nov 27 17:53:10 crc kubenswrapper[4954]: I1127 17:53:10.246988 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccf02b0b-042c-47be-b4d7-c412ea911235-utilities\") pod \"community-operators-zq2cc\" (UID: \"ccf02b0b-042c-47be-b4d7-c412ea911235\") " pod="openshift-marketplace/community-operators-zq2cc" Nov 27 17:53:10 crc kubenswrapper[4954]: I1127 17:53:10.281641 4954 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-dqsvc\" (UniqueName: \"kubernetes.io/projected/ccf02b0b-042c-47be-b4d7-c412ea911235-kube-api-access-dqsvc\") pod \"community-operators-zq2cc\" (UID: \"ccf02b0b-042c-47be-b4d7-c412ea911235\") " pod="openshift-marketplace/community-operators-zq2cc" Nov 27 17:53:10 crc kubenswrapper[4954]: I1127 17:53:10.346689 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zq2cc" Nov 27 17:53:10 crc kubenswrapper[4954]: I1127 17:53:10.744373 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zq2cc"] Nov 27 17:53:11 crc kubenswrapper[4954]: I1127 17:53:11.985847 4954 generic.go:334] "Generic (PLEG): container finished" podID="ccf02b0b-042c-47be-b4d7-c412ea911235" containerID="e7728e91277773a9ec2a5167e954e3d5b2ed1be56cc4dc018ded9923d9d2b844" exitCode=0 Nov 27 17:53:11 crc kubenswrapper[4954]: I1127 17:53:11.985888 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zq2cc" event={"ID":"ccf02b0b-042c-47be-b4d7-c412ea911235","Type":"ContainerDied","Data":"e7728e91277773a9ec2a5167e954e3d5b2ed1be56cc4dc018ded9923d9d2b844"} Nov 27 17:53:11 crc kubenswrapper[4954]: I1127 17:53:11.986108 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zq2cc" event={"ID":"ccf02b0b-042c-47be-b4d7-c412ea911235","Type":"ContainerStarted","Data":"1f9bf905dd664f292179c8ab6a36eeb5beed39e2807d41384ef1f97a9878a4bf"} Nov 27 17:53:17 crc kubenswrapper[4954]: I1127 17:53:17.039284 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zq2cc" event={"ID":"ccf02b0b-042c-47be-b4d7-c412ea911235","Type":"ContainerStarted","Data":"0ec3cbb13d5ab16c7d27f81eced06705c2420a21ff351b9240f276be322232ce"} Nov 27 17:53:18 crc kubenswrapper[4954]: I1127 17:53:18.050801 4954 generic.go:334] "Generic (PLEG): container finished" podID="ccf02b0b-042c-47be-b4d7-c412ea911235" containerID="0ec3cbb13d5ab16c7d27f81eced06705c2420a21ff351b9240f276be322232ce" exitCode=0 Nov 27 17:53:18 crc kubenswrapper[4954]: I1127 17:53:18.051012 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zq2cc" event={"ID":"ccf02b0b-042c-47be-b4d7-c412ea911235","Type":"ContainerDied","Data":"0ec3cbb13d5ab16c7d27f81eced06705c2420a21ff351b9240f276be322232ce"} Nov 27 17:53:19 crc kubenswrapper[4954]: I1127 17:53:19.062310 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zq2cc" event={"ID":"ccf02b0b-042c-47be-b4d7-c412ea911235","Type":"ContainerStarted","Data":"6a3168f9ba9f42b6ebe4f083d7646221ff1f2929a92b91d32b949cdcadcf89d1"} Nov 27 17:53:19 crc kubenswrapper[4954]: I1127 17:53:19.094768 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zq2cc" podStartSLOduration=3.5588934070000002 podStartE2EDuration="10.094746384s" podCreationTimestamp="2025-11-27 17:53:09 +0000 UTC" firstStartedPulling="2025-11-27 17:53:11.987982536 +0000 UTC m=+4504.005422836" lastFinishedPulling="2025-11-27 17:53:18.523835513 +0000 UTC m=+4510.541275813" observedRunningTime="2025-11-27 17:53:19.084428443 +0000 UTC m=+4511.101868743" watchObservedRunningTime="2025-11-27 17:53:19.094746384 +0000 UTC m=+4511.112186694" Nov 27 17:53:20 crc kubenswrapper[4954]: I1127 17:53:20.347270 4954 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zq2cc" Nov 27 17:53:20 crc kubenswrapper[4954]: I1127 17:53:20.347675 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zq2cc" Nov 27 17:53:20 crc kubenswrapper[4954]: I1127 17:53:20.739175 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zq2cc" Nov 27 17:53:23 crc kubenswrapper[4954]: I1127 17:53:23.687539 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 17:53:23 crc kubenswrapper[4954]: I1127 17:53:23.688148 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 17:53:30 crc kubenswrapper[4954]: I1127 17:53:30.420103 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zq2cc" Nov 27 17:53:30 crc kubenswrapper[4954]: I1127 17:53:30.519619 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zq2cc"] Nov 27 17:53:30 crc kubenswrapper[4954]: I1127 17:53:30.642523 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xj2hb"] Nov 27 17:53:30 crc kubenswrapper[4954]: I1127 17:53:30.655349 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xj2hb" podUID="56ec19b6-189a-4163-ae87-1c95809ad7d3" containerName="registry-server" containerID="cri-o://c2651cd2cce9aed481210f3565ecf544e62aaea8a53ee8d26ff30f169886bee1" gracePeriod=2 Nov 27 17:53:31 crc kubenswrapper[4954]: I1127 17:53:31.790139 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xj2hb" Nov 27 17:53:31 crc kubenswrapper[4954]: I1127 17:53:31.959325 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56ec19b6-189a-4163-ae87-1c95809ad7d3-utilities\") pod \"56ec19b6-189a-4163-ae87-1c95809ad7d3\" (UID: \"56ec19b6-189a-4163-ae87-1c95809ad7d3\") " Nov 27 17:53:31 crc kubenswrapper[4954]: I1127 17:53:31.959424 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56ec19b6-189a-4163-ae87-1c95809ad7d3-catalog-content\") pod \"56ec19b6-189a-4163-ae87-1c95809ad7d3\" (UID: \"56ec19b6-189a-4163-ae87-1c95809ad7d3\") " Nov 27 17:53:31 crc kubenswrapper[4954]: I1127 17:53:31.959521 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbjtr\" (UniqueName: \"kubernetes.io/projected/56ec19b6-189a-4163-ae87-1c95809ad7d3-kube-api-access-cbjtr\") pod \"56ec19b6-189a-4163-ae87-1c95809ad7d3\" (UID: \"56ec19b6-189a-4163-ae87-1c95809ad7d3\") " Nov 27 17:53:31 crc kubenswrapper[4954]: I1127 17:53:31.965186 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56ec19b6-189a-4163-ae87-1c95809ad7d3-utilities" (OuterVolumeSpecName: "utilities") pod "56ec19b6-189a-4163-ae87-1c95809ad7d3" (UID: "56ec19b6-189a-4163-ae87-1c95809ad7d3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:53:31 crc kubenswrapper[4954]: I1127 17:53:31.983395 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56ec19b6-189a-4163-ae87-1c95809ad7d3-kube-api-access-cbjtr" (OuterVolumeSpecName: "kube-api-access-cbjtr") pod "56ec19b6-189a-4163-ae87-1c95809ad7d3" (UID: "56ec19b6-189a-4163-ae87-1c95809ad7d3"). InnerVolumeSpecName "kube-api-access-cbjtr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:53:32 crc kubenswrapper[4954]: I1127 17:53:32.062293 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56ec19b6-189a-4163-ae87-1c95809ad7d3-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 17:53:32 crc kubenswrapper[4954]: I1127 17:53:32.062327 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbjtr\" (UniqueName: \"kubernetes.io/projected/56ec19b6-189a-4163-ae87-1c95809ad7d3-kube-api-access-cbjtr\") on node \"crc\" DevicePath \"\"" Nov 27 17:53:32 crc kubenswrapper[4954]: I1127 17:53:32.088920 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56ec19b6-189a-4163-ae87-1c95809ad7d3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "56ec19b6-189a-4163-ae87-1c95809ad7d3" (UID: "56ec19b6-189a-4163-ae87-1c95809ad7d3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:53:32 crc kubenswrapper[4954]: I1127 17:53:32.163889 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56ec19b6-189a-4163-ae87-1c95809ad7d3-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 17:53:32 crc kubenswrapper[4954]: I1127 17:53:32.230881 4954 generic.go:334] "Generic (PLEG): container finished" podID="56ec19b6-189a-4163-ae87-1c95809ad7d3" containerID="c2651cd2cce9aed481210f3565ecf544e62aaea8a53ee8d26ff30f169886bee1" exitCode=0 Nov 27 17:53:32 crc kubenswrapper[4954]: I1127 17:53:32.230974 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xj2hb" event={"ID":"56ec19b6-189a-4163-ae87-1c95809ad7d3","Type":"ContainerDied","Data":"c2651cd2cce9aed481210f3565ecf544e62aaea8a53ee8d26ff30f169886bee1"} Nov 27 17:53:32 crc kubenswrapper[4954]: I1127 17:53:32.231002 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xj2hb" event={"ID":"56ec19b6-189a-4163-ae87-1c95809ad7d3","Type":"ContainerDied","Data":"67ef586721f3fcbe720b4f69bfb490c1e575563df749263db37a92b49fdbb956"} Nov 27 17:53:32 crc kubenswrapper[4954]: I1127 17:53:32.231024 4954 scope.go:117] "RemoveContainer" containerID="c2651cd2cce9aed481210f3565ecf544e62aaea8a53ee8d26ff30f169886bee1" Nov 27 17:53:32 crc kubenswrapper[4954]: I1127 17:53:32.231247 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xj2hb" Nov 27 17:53:32 crc kubenswrapper[4954]: I1127 17:53:32.289703 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xj2hb"] Nov 27 17:53:32 crc kubenswrapper[4954]: I1127 17:53:32.318354 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xj2hb"] Nov 27 17:53:32 crc kubenswrapper[4954]: I1127 17:53:32.321266 4954 scope.go:117] "RemoveContainer" containerID="ec55bdae7e24215c1811cc1db815c1bab8aa0622c649acd4b28a6dfc908e21c4" Nov 27 17:53:32 crc kubenswrapper[4954]: I1127 17:53:32.381755 4954 scope.go:117] "RemoveContainer" containerID="7a28ae4bb973b00028fc575332194baab40674c23155de8b86cacec236ff5fc1" Nov 27 17:53:32 crc kubenswrapper[4954]: I1127 17:53:32.444001 4954 scope.go:117] "RemoveContainer" containerID="c2651cd2cce9aed481210f3565ecf544e62aaea8a53ee8d26ff30f169886bee1" Nov 27 17:53:32 crc kubenswrapper[4954]: E1127 17:53:32.444733 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2651cd2cce9aed481210f3565ecf544e62aaea8a53ee8d26ff30f169886bee1\": container with ID starting with c2651cd2cce9aed481210f3565ecf544e62aaea8a53ee8d26ff30f169886bee1 not found: ID does not exist" containerID="c2651cd2cce9aed481210f3565ecf544e62aaea8a53ee8d26ff30f169886bee1" Nov 27 17:53:32 crc kubenswrapper[4954]: I1127 17:53:32.444776 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2651cd2cce9aed481210f3565ecf544e62aaea8a53ee8d26ff30f169886bee1"} err="failed to get container status \"c2651cd2cce9aed481210f3565ecf544e62aaea8a53ee8d26ff30f169886bee1\": rpc error: code = NotFound desc = could not find container \"c2651cd2cce9aed481210f3565ecf544e62aaea8a53ee8d26ff30f169886bee1\": container with ID starting with c2651cd2cce9aed481210f3565ecf544e62aaea8a53ee8d26ff30f169886bee1 not found: ID does not exist" Nov 27 
17:53:32 crc kubenswrapper[4954]: I1127 17:53:32.444801 4954 scope.go:117] "RemoveContainer" containerID="ec55bdae7e24215c1811cc1db815c1bab8aa0622c649acd4b28a6dfc908e21c4" Nov 27 17:53:32 crc kubenswrapper[4954]: E1127 17:53:32.445566 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec55bdae7e24215c1811cc1db815c1bab8aa0622c649acd4b28a6dfc908e21c4\": container with ID starting with ec55bdae7e24215c1811cc1db815c1bab8aa0622c649acd4b28a6dfc908e21c4 not found: ID does not exist" containerID="ec55bdae7e24215c1811cc1db815c1bab8aa0622c649acd4b28a6dfc908e21c4" Nov 27 17:53:32 crc kubenswrapper[4954]: I1127 17:53:32.445611 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec55bdae7e24215c1811cc1db815c1bab8aa0622c649acd4b28a6dfc908e21c4"} err="failed to get container status \"ec55bdae7e24215c1811cc1db815c1bab8aa0622c649acd4b28a6dfc908e21c4\": rpc error: code = NotFound desc = could not find container \"ec55bdae7e24215c1811cc1db815c1bab8aa0622c649acd4b28a6dfc908e21c4\": container with ID starting with ec55bdae7e24215c1811cc1db815c1bab8aa0622c649acd4b28a6dfc908e21c4 not found: ID does not exist" Nov 27 17:53:32 crc kubenswrapper[4954]: I1127 17:53:32.445631 4954 scope.go:117] "RemoveContainer" containerID="7a28ae4bb973b00028fc575332194baab40674c23155de8b86cacec236ff5fc1" Nov 27 17:53:32 crc kubenswrapper[4954]: E1127 17:53:32.447196 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a28ae4bb973b00028fc575332194baab40674c23155de8b86cacec236ff5fc1\": container with ID starting with 7a28ae4bb973b00028fc575332194baab40674c23155de8b86cacec236ff5fc1 not found: ID does not exist" containerID="7a28ae4bb973b00028fc575332194baab40674c23155de8b86cacec236ff5fc1" Nov 27 17:53:32 crc kubenswrapper[4954]: I1127 17:53:32.447232 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a28ae4bb973b00028fc575332194baab40674c23155de8b86cacec236ff5fc1"} err="failed to get container status \"7a28ae4bb973b00028fc575332194baab40674c23155de8b86cacec236ff5fc1\": rpc error: code = NotFound desc = could not find container \"7a28ae4bb973b00028fc575332194baab40674c23155de8b86cacec236ff5fc1\": container with ID starting with 7a28ae4bb973b00028fc575332194baab40674c23155de8b86cacec236ff5fc1 not found: ID does not exist" Nov 27 17:53:32 crc kubenswrapper[4954]: I1127 17:53:32.674029 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56ec19b6-189a-4163-ae87-1c95809ad7d3" path="/var/lib/kubelet/pods/56ec19b6-189a-4163-ae87-1c95809ad7d3/volumes" Nov 27 17:53:44 crc kubenswrapper[4954]: I1127 17:53:44.442033 4954 patch_prober.go:28] interesting pod/console-7c589d8dc4-s99r2 container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.44:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 27 17:53:44 crc kubenswrapper[4954]: I1127 17:53:44.442687 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-7c589d8dc4-s99r2" podUID="eb4befa9-9398-4e32-835e-3e4d5b363d5a" containerName="console" probeResult="failure" output="Get \"https://10.217.0.44:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Nov 27 17:53:53 crc kubenswrapper[4954]: 
I1127 17:53:53.687128 4954 patch_prober.go:28] interesting pod/machine-config-daemon-699qq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 17:53:53 crc kubenswrapper[4954]: I1127 17:53:53.687916 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 17:53:53 crc kubenswrapper[4954]: I1127 17:53:53.687976 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-699qq" Nov 27 17:53:53 crc kubenswrapper[4954]: I1127 17:53:53.688810 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"92acc28daa773c5e3456bee2d6f3e6b59e9180355e7fff0a925eff96ab528f24"} pod="openshift-machine-config-operator/machine-config-daemon-699qq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 27 17:53:53 crc kubenswrapper[4954]: I1127 17:53:53.689068 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerName="machine-config-daemon" containerID="cri-o://92acc28daa773c5e3456bee2d6f3e6b59e9180355e7fff0a925eff96ab528f24" gracePeriod=600 Nov 27 17:53:53 crc kubenswrapper[4954]: E1127 17:53:53.741812 4954 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33a80574_7c60_4f19_985b_3ee313cb7bcd.slice/crio-92acc28daa773c5e3456bee2d6f3e6b59e9180355e7fff0a925eff96ab528f24.scope\": RecentStats: unable to find data in memory cache]" Nov 27 17:53:53 crc kubenswrapper[4954]: E1127 17:53:53.827426 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:53:54 crc kubenswrapper[4954]: I1127 17:53:54.417114 4954 generic.go:334] "Generic (PLEG): container finished" podID="33a80574-7c60-4f19-985b-3ee313cb7bcd" containerID="92acc28daa773c5e3456bee2d6f3e6b59e9180355e7fff0a925eff96ab528f24" exitCode=0 Nov 27 17:53:54 crc kubenswrapper[4954]: I1127 17:53:54.417169 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" event={"ID":"33a80574-7c60-4f19-985b-3ee313cb7bcd","Type":"ContainerDied","Data":"92acc28daa773c5e3456bee2d6f3e6b59e9180355e7fff0a925eff96ab528f24"} Nov 27 17:53:54 crc kubenswrapper[4954]: I1127 17:53:54.417206 4954 scope.go:117] "RemoveContainer" containerID="a3c1b4c7a1565f160e5b62bddd964a7c2407cb7c03f79c69bde2c49cf255237d" Nov 27 17:53:54 crc kubenswrapper[4954]: I1127 17:53:54.417967 4954 scope.go:117] "RemoveContainer" 
containerID="92acc28daa773c5e3456bee2d6f3e6b59e9180355e7fff0a925eff96ab528f24" Nov 27 17:53:54 crc kubenswrapper[4954]: E1127 17:53:54.418394 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:54:08 crc kubenswrapper[4954]: I1127 17:54:08.670703 4954 scope.go:117] "RemoveContainer" containerID="92acc28daa773c5e3456bee2d6f3e6b59e9180355e7fff0a925eff96ab528f24" Nov 27 17:54:08 crc kubenswrapper[4954]: E1127 17:54:08.672169 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:54:18 crc kubenswrapper[4954]: I1127 17:54:18.828904 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9cv6k"] Nov 27 17:54:18 crc kubenswrapper[4954]: E1127 17:54:18.829985 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56ec19b6-189a-4163-ae87-1c95809ad7d3" containerName="extract-utilities" Nov 27 17:54:18 crc kubenswrapper[4954]: I1127 17:54:18.830001 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="56ec19b6-189a-4163-ae87-1c95809ad7d3" containerName="extract-utilities" Nov 27 17:54:18 crc kubenswrapper[4954]: E1127 17:54:18.830038 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56ec19b6-189a-4163-ae87-1c95809ad7d3" containerName="registry-server" Nov 27 17:54:18 crc kubenswrapper[4954]: I1127 17:54:18.830044 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="56ec19b6-189a-4163-ae87-1c95809ad7d3" containerName="registry-server" Nov 27 17:54:18 crc kubenswrapper[4954]: E1127 17:54:18.830066 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56ec19b6-189a-4163-ae87-1c95809ad7d3" containerName="extract-content" Nov 27 17:54:18 crc kubenswrapper[4954]: I1127 17:54:18.830074 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="56ec19b6-189a-4163-ae87-1c95809ad7d3" containerName="extract-content" Nov 27 17:54:18 crc kubenswrapper[4954]: I1127 17:54:18.830268 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="56ec19b6-189a-4163-ae87-1c95809ad7d3" containerName="registry-server" Nov 27 17:54:18 crc kubenswrapper[4954]: I1127 17:54:18.831900 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9cv6k" Nov 27 17:54:18 crc kubenswrapper[4954]: I1127 17:54:18.848156 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9cv6k"] Nov 27 17:54:18 crc kubenswrapper[4954]: I1127 17:54:18.886359 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee-utilities\") pod \"certified-operators-9cv6k\" (UID: \"1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee\") " pod="openshift-marketplace/certified-operators-9cv6k" Nov 27 17:54:18 crc kubenswrapper[4954]: I1127 17:54:18.886403 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx8pl\" (UniqueName: \"kubernetes.io/projected/1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee-kube-api-access-hx8pl\") pod \"certified-operators-9cv6k\" (UID: \"1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee\") " pod="openshift-marketplace/certified-operators-9cv6k" Nov 27 17:54:18 crc kubenswrapper[4954]: I1127 17:54:18.886923 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee-catalog-content\") pod \"certified-operators-9cv6k\" (UID: \"1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee\") " pod="openshift-marketplace/certified-operators-9cv6k" Nov 27 17:54:18 crc kubenswrapper[4954]: I1127 17:54:18.988601 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee-catalog-content\") pod \"certified-operators-9cv6k\" (UID: \"1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee\") " pod="openshift-marketplace/certified-operators-9cv6k" Nov 27 17:54:18 crc kubenswrapper[4954]: I1127 17:54:18.988675 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee-utilities\") pod \"certified-operators-9cv6k\" (UID: \"1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee\") " pod="openshift-marketplace/certified-operators-9cv6k" Nov 27 17:54:18 crc kubenswrapper[4954]: I1127 17:54:18.988700 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx8pl\" (UniqueName: \"kubernetes.io/projected/1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee-kube-api-access-hx8pl\") pod \"certified-operators-9cv6k\" (UID: \"1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee\") " pod="openshift-marketplace/certified-operators-9cv6k" Nov 27 17:54:18 crc kubenswrapper[4954]: I1127 17:54:18.989165 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee-catalog-content\") pod \"certified-operators-9cv6k\" (UID: \"1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee\") " pod="openshift-marketplace/certified-operators-9cv6k" Nov 27 17:54:18 crc kubenswrapper[4954]: I1127 17:54:18.989240 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee-utilities\") pod \"certified-operators-9cv6k\" (UID: \"1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee\") " pod="openshift-marketplace/certified-operators-9cv6k" Nov 27 17:54:19 crc kubenswrapper[4954]: I1127 17:54:19.010254 4954 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-hx8pl\" (UniqueName: \"kubernetes.io/projected/1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee-kube-api-access-hx8pl\") pod \"certified-operators-9cv6k\" (UID: \"1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee\") " pod="openshift-marketplace/certified-operators-9cv6k" Nov 27 17:54:19 crc kubenswrapper[4954]: I1127 17:54:19.155996 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9cv6k" Nov 27 17:54:19 crc kubenswrapper[4954]: I1127 17:54:19.642136 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9cv6k"] Nov 27 17:54:19 crc kubenswrapper[4954]: I1127 17:54:19.644267 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9cv6k" event={"ID":"1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee","Type":"ContainerStarted","Data":"2d4e77c270fc8d5533dd757bfcc324599177264d9612388945b1f5ade5a4453c"} Nov 27 17:54:20 crc kubenswrapper[4954]: I1127 17:54:20.654022 4954 generic.go:334] "Generic (PLEG): container finished" podID="1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee" containerID="3870fe7163e8468dc79f0f111eea2abcb7e80d1b0934c1a8e707b960a6b69b54" exitCode=0 Nov 27 17:54:20 crc kubenswrapper[4954]: I1127 17:54:20.654393 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9cv6k" event={"ID":"1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee","Type":"ContainerDied","Data":"3870fe7163e8468dc79f0f111eea2abcb7e80d1b0934c1a8e707b960a6b69b54"} Nov 27 17:54:22 crc kubenswrapper[4954]: I1127 17:54:22.676540 4954 generic.go:334] "Generic (PLEG): container finished" podID="1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee" containerID="153bbf338b0f6d89b7784c9f4b67c6bcbbe8fb71adc4d0735f11038185ea4cf2" exitCode=0 Nov 27 17:54:22 crc kubenswrapper[4954]: I1127 17:54:22.676624 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9cv6k" event={"ID":"1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee","Type":"ContainerDied","Data":"153bbf338b0f6d89b7784c9f4b67c6bcbbe8fb71adc4d0735f11038185ea4cf2"} Nov 27 17:54:23 crc kubenswrapper[4954]: I1127 17:54:23.662656 4954 scope.go:117] "RemoveContainer" containerID="92acc28daa773c5e3456bee2d6f3e6b59e9180355e7fff0a925eff96ab528f24" Nov 27 17:54:23 crc kubenswrapper[4954]: E1127 17:54:23.663229 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:54:23 crc kubenswrapper[4954]: I1127 17:54:23.687015 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9cv6k" event={"ID":"1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee","Type":"ContainerStarted","Data":"f7f48037fbc38bde9941c1af50d345308799022e8c7240dac343fd2c3ca546f9"} Nov 27 17:54:23 crc kubenswrapper[4954]: I1127 17:54:23.707157 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9cv6k" podStartSLOduration=3.052953704 podStartE2EDuration="5.707138832s" podCreationTimestamp="2025-11-27 17:54:18 +0000 UTC" firstStartedPulling="2025-11-27 17:54:20.656117052 +0000 UTC m=+4572.673557342" 
lastFinishedPulling="2025-11-27 17:54:23.31030217 +0000 UTC m=+4575.327742470" observedRunningTime="2025-11-27 17:54:23.701940195 +0000 UTC m=+4575.719380505" watchObservedRunningTime="2025-11-27 17:54:23.707138832 +0000 UTC m=+4575.724579142" Nov 27 17:54:29 crc kubenswrapper[4954]: I1127 17:54:29.156520 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9cv6k" Nov 27 17:54:29 crc kubenswrapper[4954]: I1127 17:54:29.157105 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9cv6k" Nov 27 17:54:29 crc kubenswrapper[4954]: I1127 17:54:29.243627 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9cv6k" Nov 27 17:54:29 crc kubenswrapper[4954]: I1127 17:54:29.786627 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9cv6k" Nov 27 17:54:29 crc kubenswrapper[4954]: I1127 17:54:29.841866 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9cv6k"] Nov 27 17:54:31 crc kubenswrapper[4954]: I1127 17:54:31.755787 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9cv6k" podUID="1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee" containerName="registry-server" containerID="cri-o://f7f48037fbc38bde9941c1af50d345308799022e8c7240dac343fd2c3ca546f9" gracePeriod=2 Nov 27 17:54:32 crc kubenswrapper[4954]: I1127 17:54:32.206571 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9cv6k" Nov 27 17:54:32 crc kubenswrapper[4954]: I1127 17:54:32.263459 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee-catalog-content\") pod \"1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee\" (UID: \"1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee\") " Nov 27 17:54:32 crc kubenswrapper[4954]: I1127 17:54:32.263535 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hx8pl\" (UniqueName: \"kubernetes.io/projected/1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee-kube-api-access-hx8pl\") pod \"1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee\" (UID: \"1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee\") " Nov 27 17:54:32 crc kubenswrapper[4954]: I1127 17:54:32.263677 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee-utilities\") pod \"1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee\" (UID: \"1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee\") " Nov 27 17:54:32 crc kubenswrapper[4954]: I1127 17:54:32.264415 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee-utilities" (OuterVolumeSpecName: "utilities") pod "1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee" (UID: "1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:54:32 crc kubenswrapper[4954]: I1127 17:54:32.271793 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee-kube-api-access-hx8pl" (OuterVolumeSpecName: "kube-api-access-hx8pl") pod "1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee" (UID: "1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee"). InnerVolumeSpecName "kube-api-access-hx8pl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:54:32 crc kubenswrapper[4954]: I1127 17:54:32.365456 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 17:54:32 crc kubenswrapper[4954]: I1127 17:54:32.365493 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hx8pl\" (UniqueName: \"kubernetes.io/projected/1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee-kube-api-access-hx8pl\") on node \"crc\" DevicePath \"\"" Nov 27 17:54:32 crc kubenswrapper[4954]: I1127 17:54:32.650505 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee" (UID: "1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:54:32 crc kubenswrapper[4954]: I1127 17:54:32.670418 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 17:54:32 crc kubenswrapper[4954]: I1127 17:54:32.769612 4954 generic.go:334] "Generic (PLEG): container finished" podID="1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee" containerID="f7f48037fbc38bde9941c1af50d345308799022e8c7240dac343fd2c3ca546f9" exitCode=0 Nov 27 17:54:32 crc kubenswrapper[4954]: I1127 17:54:32.769680 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9cv6k" event={"ID":"1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee","Type":"ContainerDied","Data":"f7f48037fbc38bde9941c1af50d345308799022e8c7240dac343fd2c3ca546f9"} Nov 27 17:54:32 crc kubenswrapper[4954]: I1127 17:54:32.769736 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9cv6k" event={"ID":"1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee","Type":"ContainerDied","Data":"2d4e77c270fc8d5533dd757bfcc324599177264d9612388945b1f5ade5a4453c"} Nov 27 17:54:32 crc kubenswrapper[4954]: I1127 17:54:32.769759 4954 scope.go:117] "RemoveContainer" containerID="f7f48037fbc38bde9941c1af50d345308799022e8c7240dac343fd2c3ca546f9" Nov 27 17:54:32 crc kubenswrapper[4954]: I1127 17:54:32.769962 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9cv6k" Nov 27 17:54:32 crc kubenswrapper[4954]: I1127 17:54:32.796874 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9cv6k"] Nov 27 17:54:32 crc kubenswrapper[4954]: I1127 17:54:32.801561 4954 scope.go:117] "RemoveContainer" containerID="153bbf338b0f6d89b7784c9f4b67c6bcbbe8fb71adc4d0735f11038185ea4cf2" Nov 27 17:54:32 crc kubenswrapper[4954]: I1127 17:54:32.806109 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9cv6k"] Nov 27 17:54:32 crc kubenswrapper[4954]: I1127 17:54:32.844741 4954 scope.go:117] "RemoveContainer" containerID="3870fe7163e8468dc79f0f111eea2abcb7e80d1b0934c1a8e707b960a6b69b54" Nov 27 17:54:32 crc kubenswrapper[4954]: I1127 17:54:32.878289 4954 scope.go:117] "RemoveContainer" containerID="f7f48037fbc38bde9941c1af50d345308799022e8c7240dac343fd2c3ca546f9" Nov 27 17:54:32 crc kubenswrapper[4954]: E1127 17:54:32.881303 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7f48037fbc38bde9941c1af50d345308799022e8c7240dac343fd2c3ca546f9\": container with ID starting with f7f48037fbc38bde9941c1af50d345308799022e8c7240dac343fd2c3ca546f9 not found: ID does not exist" containerID="f7f48037fbc38bde9941c1af50d345308799022e8c7240dac343fd2c3ca546f9" Nov 27 17:54:32 crc kubenswrapper[4954]: I1127 17:54:32.881427 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7f48037fbc38bde9941c1af50d345308799022e8c7240dac343fd2c3ca546f9"} err="failed to get container status \"f7f48037fbc38bde9941c1af50d345308799022e8c7240dac343fd2c3ca546f9\": rpc error: code = NotFound desc = could not find container \"f7f48037fbc38bde9941c1af50d345308799022e8c7240dac343fd2c3ca546f9\": container with ID starting with f7f48037fbc38bde9941c1af50d345308799022e8c7240dac343fd2c3ca546f9 not found: ID does not exist" Nov 27 17:54:32 crc kubenswrapper[4954]: I1127 17:54:32.881468 4954 scope.go:117] "RemoveContainer" containerID="153bbf338b0f6d89b7784c9f4b67c6bcbbe8fb71adc4d0735f11038185ea4cf2" Nov 27 17:54:32 crc kubenswrapper[4954]: E1127 17:54:32.881983 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"153bbf338b0f6d89b7784c9f4b67c6bcbbe8fb71adc4d0735f11038185ea4cf2\": container with ID starting with 153bbf338b0f6d89b7784c9f4b67c6bcbbe8fb71adc4d0735f11038185ea4cf2 not found: ID does not exist" containerID="153bbf338b0f6d89b7784c9f4b67c6bcbbe8fb71adc4d0735f11038185ea4cf2" Nov 27 17:54:32 crc kubenswrapper[4954]: I1127 17:54:32.882043 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"153bbf338b0f6d89b7784c9f4b67c6bcbbe8fb71adc4d0735f11038185ea4cf2"} err="failed to get container status \"153bbf338b0f6d89b7784c9f4b67c6bcbbe8fb71adc4d0735f11038185ea4cf2\": rpc error: code = NotFound desc = could not find container \"153bbf338b0f6d89b7784c9f4b67c6bcbbe8fb71adc4d0735f11038185ea4cf2\": container with ID starting with 153bbf338b0f6d89b7784c9f4b67c6bcbbe8fb71adc4d0735f11038185ea4cf2 not found: ID does not exist" Nov 27 17:54:32 crc kubenswrapper[4954]: I1127 17:54:32.882085 4954 scope.go:117] "RemoveContainer" containerID="3870fe7163e8468dc79f0f111eea2abcb7e80d1b0934c1a8e707b960a6b69b54" Nov 27 17:54:32 crc kubenswrapper[4954]: E1127 17:54:32.882769 4954 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"3870fe7163e8468dc79f0f111eea2abcb7e80d1b0934c1a8e707b960a6b69b54\": container with ID starting with 3870fe7163e8468dc79f0f111eea2abcb7e80d1b0934c1a8e707b960a6b69b54 not found: ID does not exist" containerID="3870fe7163e8468dc79f0f111eea2abcb7e80d1b0934c1a8e707b960a6b69b54" Nov 27 17:54:32 crc kubenswrapper[4954]: I1127 17:54:32.882892 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3870fe7163e8468dc79f0f111eea2abcb7e80d1b0934c1a8e707b960a6b69b54"} err="failed to get container status \"3870fe7163e8468dc79f0f111eea2abcb7e80d1b0934c1a8e707b960a6b69b54\": rpc error: code = NotFound desc = could not find container \"3870fe7163e8468dc79f0f111eea2abcb7e80d1b0934c1a8e707b960a6b69b54\": container with ID starting with 3870fe7163e8468dc79f0f111eea2abcb7e80d1b0934c1a8e707b960a6b69b54 not found: ID does not exist" Nov 27 17:54:34 crc kubenswrapper[4954]: I1127 17:54:34.672349 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee" path="/var/lib/kubelet/pods/1a9c7bdc-7832-4694-b1f1-25fc2ea8a6ee/volumes" Nov 27 17:54:38 crc kubenswrapper[4954]: I1127 17:54:38.667875 4954 scope.go:117] "RemoveContainer" containerID="92acc28daa773c5e3456bee2d6f3e6b59e9180355e7fff0a925eff96ab528f24" Nov 27 17:54:38 crc kubenswrapper[4954]: E1127 17:54:38.668499 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:54:52 crc kubenswrapper[4954]: I1127 17:54:52.964712 4954 generic.go:334] "Generic (PLEG): container finished" podID="3d784d93-57ea-4848-bf68-21934d1855e2" containerID="f28f0a56043870dbddfe531ab9c5e1ec20b15a66487cbfba825e14017077f5ca" exitCode=0 Nov 27 17:54:52 crc kubenswrapper[4954]: I1127 17:54:52.964992 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kjkqm/must-gather-mlwx4" event={"ID":"3d784d93-57ea-4848-bf68-21934d1855e2","Type":"ContainerDied","Data":"f28f0a56043870dbddfe531ab9c5e1ec20b15a66487cbfba825e14017077f5ca"} Nov 27 17:54:52 crc kubenswrapper[4954]: I1127 17:54:52.966523 4954 scope.go:117] "RemoveContainer" containerID="f28f0a56043870dbddfe531ab9c5e1ec20b15a66487cbfba825e14017077f5ca" Nov 27 17:54:53 crc kubenswrapper[4954]: I1127 17:54:53.662786 4954 scope.go:117] "RemoveContainer" containerID="92acc28daa773c5e3456bee2d6f3e6b59e9180355e7fff0a925eff96ab528f24" Nov 27 17:54:53 crc kubenswrapper[4954]: E1127 17:54:53.663798 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:54:53 crc kubenswrapper[4954]: I1127 17:54:53.880024 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kjkqm_must-gather-mlwx4_3d784d93-57ea-4848-bf68-21934d1855e2/gather/0.log" Nov 27 17:55:04 crc kubenswrapper[4954]: 
I1127 17:55:04.247275 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kjkqm/must-gather-mlwx4"] Nov 27 17:55:04 crc kubenswrapper[4954]: I1127 17:55:04.248005 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-kjkqm/must-gather-mlwx4" podUID="3d784d93-57ea-4848-bf68-21934d1855e2" containerName="copy" containerID="cri-o://ca8c5d1912c2b2801e4a70e1d18c376f86d930a7f6b269202ccfb75cce8b3833" gracePeriod=2 Nov 27 17:55:04 crc kubenswrapper[4954]: I1127 17:55:04.256773 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kjkqm/must-gather-mlwx4"] Nov 27 17:55:04 crc kubenswrapper[4954]: I1127 17:55:04.662376 4954 scope.go:117] "RemoveContainer" containerID="92acc28daa773c5e3456bee2d6f3e6b59e9180355e7fff0a925eff96ab528f24" Nov 27 17:55:04 crc kubenswrapper[4954]: E1127 17:55:04.663109 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:55:04 crc kubenswrapper[4954]: I1127 17:55:04.707270 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kjkqm_must-gather-mlwx4_3d784d93-57ea-4848-bf68-21934d1855e2/copy/0.log" Nov 27 17:55:04 crc kubenswrapper[4954]: I1127 17:55:04.707821 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kjkqm/must-gather-mlwx4" Nov 27 17:55:04 crc kubenswrapper[4954]: I1127 17:55:04.850445 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48d6h\" (UniqueName: \"kubernetes.io/projected/3d784d93-57ea-4848-bf68-21934d1855e2-kube-api-access-48d6h\") pod \"3d784d93-57ea-4848-bf68-21934d1855e2\" (UID: \"3d784d93-57ea-4848-bf68-21934d1855e2\") " Nov 27 17:55:04 crc kubenswrapper[4954]: I1127 17:55:04.850561 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3d784d93-57ea-4848-bf68-21934d1855e2-must-gather-output\") pod \"3d784d93-57ea-4848-bf68-21934d1855e2\" (UID: \"3d784d93-57ea-4848-bf68-21934d1855e2\") " Nov 27 17:55:04 crc kubenswrapper[4954]: I1127 17:55:04.858883 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d784d93-57ea-4848-bf68-21934d1855e2-kube-api-access-48d6h" (OuterVolumeSpecName: "kube-api-access-48d6h") pod "3d784d93-57ea-4848-bf68-21934d1855e2" (UID: "3d784d93-57ea-4848-bf68-21934d1855e2"). InnerVolumeSpecName "kube-api-access-48d6h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:55:04 crc kubenswrapper[4954]: I1127 17:55:04.953247 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48d6h\" (UniqueName: \"kubernetes.io/projected/3d784d93-57ea-4848-bf68-21934d1855e2-kube-api-access-48d6h\") on node \"crc\" DevicePath \"\"" Nov 27 17:55:05 crc kubenswrapper[4954]: I1127 17:55:05.012066 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d784d93-57ea-4848-bf68-21934d1855e2-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "3d784d93-57ea-4848-bf68-21934d1855e2" (UID: "3d784d93-57ea-4848-bf68-21934d1855e2"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:55:05 crc kubenswrapper[4954]: I1127 17:55:05.055305 4954 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3d784d93-57ea-4848-bf68-21934d1855e2-must-gather-output\") on node \"crc\" DevicePath \"\"" Nov 27 17:55:05 crc kubenswrapper[4954]: I1127 17:55:05.068406 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kjkqm_must-gather-mlwx4_3d784d93-57ea-4848-bf68-21934d1855e2/copy/0.log" Nov 27 17:55:05 crc kubenswrapper[4954]: I1127 17:55:05.068856 4954 generic.go:334] "Generic (PLEG): container finished" podID="3d784d93-57ea-4848-bf68-21934d1855e2" containerID="ca8c5d1912c2b2801e4a70e1d18c376f86d930a7f6b269202ccfb75cce8b3833" exitCode=143 Nov 27 17:55:05 crc kubenswrapper[4954]: I1127 17:55:05.068917 4954 scope.go:117] "RemoveContainer" containerID="ca8c5d1912c2b2801e4a70e1d18c376f86d930a7f6b269202ccfb75cce8b3833" Nov 27 17:55:05 crc kubenswrapper[4954]: I1127 17:55:05.068973 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kjkqm/must-gather-mlwx4" Nov 27 17:55:05 crc kubenswrapper[4954]: I1127 17:55:05.087058 4954 scope.go:117] "RemoveContainer" containerID="f28f0a56043870dbddfe531ab9c5e1ec20b15a66487cbfba825e14017077f5ca" Nov 27 17:55:05 crc kubenswrapper[4954]: I1127 17:55:05.155998 4954 scope.go:117] "RemoveContainer" containerID="ca8c5d1912c2b2801e4a70e1d18c376f86d930a7f6b269202ccfb75cce8b3833" Nov 27 17:55:05 crc kubenswrapper[4954]: E1127 17:55:05.156472 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca8c5d1912c2b2801e4a70e1d18c376f86d930a7f6b269202ccfb75cce8b3833\": container with ID starting with ca8c5d1912c2b2801e4a70e1d18c376f86d930a7f6b269202ccfb75cce8b3833 not found: ID does not exist" containerID="ca8c5d1912c2b2801e4a70e1d18c376f86d930a7f6b269202ccfb75cce8b3833" Nov 27 17:55:05 crc kubenswrapper[4954]: I1127 17:55:05.156515 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca8c5d1912c2b2801e4a70e1d18c376f86d930a7f6b269202ccfb75cce8b3833"} err="failed to get container status \"ca8c5d1912c2b2801e4a70e1d18c376f86d930a7f6b269202ccfb75cce8b3833\": rpc error: code = NotFound desc = could not find container \"ca8c5d1912c2b2801e4a70e1d18c376f86d930a7f6b269202ccfb75cce8b3833\": container with ID starting with ca8c5d1912c2b2801e4a70e1d18c376f86d930a7f6b269202ccfb75cce8b3833 not found: ID does not exist" Nov 27 17:55:05 crc kubenswrapper[4954]: I1127 17:55:05.156543 4954 scope.go:117] "RemoveContainer" containerID="f28f0a56043870dbddfe531ab9c5e1ec20b15a66487cbfba825e14017077f5ca" Nov 27 17:55:05 crc kubenswrapper[4954]: E1127 17:55:05.156898 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f28f0a56043870dbddfe531ab9c5e1ec20b15a66487cbfba825e14017077f5ca\": container with ID starting with f28f0a56043870dbddfe531ab9c5e1ec20b15a66487cbfba825e14017077f5ca not found: ID does not exist" containerID="f28f0a56043870dbddfe531ab9c5e1ec20b15a66487cbfba825e14017077f5ca" Nov 27 17:55:05 crc kubenswrapper[4954]: I1127 17:55:05.156935 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f28f0a56043870dbddfe531ab9c5e1ec20b15a66487cbfba825e14017077f5ca"} err="failed to get container status \"f28f0a56043870dbddfe531ab9c5e1ec20b15a66487cbfba825e14017077f5ca\": rpc error: code = NotFound desc = could not find container \"f28f0a56043870dbddfe531ab9c5e1ec20b15a66487cbfba825e14017077f5ca\": container with ID starting with f28f0a56043870dbddfe531ab9c5e1ec20b15a66487cbfba825e14017077f5ca not found: ID does not exist" Nov 27 17:55:06 crc kubenswrapper[4954]: I1127 17:55:06.681290 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d784d93-57ea-4848-bf68-21934d1855e2" path="/var/lib/kubelet/pods/3d784d93-57ea-4848-bf68-21934d1855e2/volumes" Nov 27 17:55:15 crc kubenswrapper[4954]: I1127 17:55:15.661688 4954 scope.go:117] "RemoveContainer" containerID="92acc28daa773c5e3456bee2d6f3e6b59e9180355e7fff0a925eff96ab528f24" Nov 27 17:55:15 crc kubenswrapper[4954]: E1127 17:55:15.662716 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:55:27 crc kubenswrapper[4954]: I1127 17:55:27.661946 4954 scope.go:117] "RemoveContainer" containerID="92acc28daa773c5e3456bee2d6f3e6b59e9180355e7fff0a925eff96ab528f24" Nov 27 17:55:27 crc kubenswrapper[4954]: E1127 17:55:27.662678 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:55:42 crc kubenswrapper[4954]: I1127 17:55:42.664776 4954 scope.go:117] "RemoveContainer" containerID="92acc28daa773c5e3456bee2d6f3e6b59e9180355e7fff0a925eff96ab528f24" Nov 27 17:55:42 crc kubenswrapper[4954]: E1127 17:55:42.666752 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:55:53 crc kubenswrapper[4954]: I1127 17:55:53.662037 4954 scope.go:117] "RemoveContainer" containerID="92acc28daa773c5e3456bee2d6f3e6b59e9180355e7fff0a925eff96ab528f24" Nov 27 17:55:53 crc kubenswrapper[4954]: E1127 17:55:53.662864 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:56:04 crc kubenswrapper[4954]: I1127 17:56:04.661786 4954 scope.go:117] "RemoveContainer" containerID="92acc28daa773c5e3456bee2d6f3e6b59e9180355e7fff0a925eff96ab528f24" Nov 27 17:56:04 crc kubenswrapper[4954]: E1127 17:56:04.662648 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:56:16 crc kubenswrapper[4954]: I1127 17:56:16.672559 4954 scope.go:117] "RemoveContainer" containerID="92acc28daa773c5e3456bee2d6f3e6b59e9180355e7fff0a925eff96ab528f24" Nov 27 17:56:16 crc kubenswrapper[4954]: E1127 17:56:16.673266 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:56:29 crc kubenswrapper[4954]: I1127 17:56:29.662318 4954 
scope.go:117] "RemoveContainer" containerID="92acc28daa773c5e3456bee2d6f3e6b59e9180355e7fff0a925eff96ab528f24" Nov 27 17:56:29 crc kubenswrapper[4954]: E1127 17:56:29.663210 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:56:43 crc kubenswrapper[4954]: I1127 17:56:43.662654 4954 scope.go:117] "RemoveContainer" containerID="92acc28daa773c5e3456bee2d6f3e6b59e9180355e7fff0a925eff96ab528f24" Nov 27 17:56:43 crc kubenswrapper[4954]: E1127 17:56:43.663487 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:56:48 crc kubenswrapper[4954]: I1127 17:56:48.020847 4954 scope.go:117] "RemoveContainer" containerID="958265b08133abe02e6bd2a60612d61c0b82a6b8ac66f768e7aa709909288aa7" Nov 27 17:56:48 crc kubenswrapper[4954]: I1127 17:56:48.052275 4954 scope.go:117] "RemoveContainer" containerID="8f425e11e2149a04a3799f7be308fdbbaf99146fd514d5a4268daeb78520f87b" Nov 27 17:56:55 crc kubenswrapper[4954]: I1127 17:56:55.663073 4954 scope.go:117] "RemoveContainer" containerID="92acc28daa773c5e3456bee2d6f3e6b59e9180355e7fff0a925eff96ab528f24" Nov 27 17:56:55 crc kubenswrapper[4954]: E1127 17:56:55.663929 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:57:08 crc kubenswrapper[4954]: I1127 17:57:08.673260 4954 scope.go:117] "RemoveContainer" containerID="92acc28daa773c5e3456bee2d6f3e6b59e9180355e7fff0a925eff96ab528f24" Nov 27 17:57:08 crc kubenswrapper[4954]: E1127 17:57:08.675285 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:57:21 crc kubenswrapper[4954]: I1127 17:57:21.662695 4954 scope.go:117] "RemoveContainer" containerID="92acc28daa773c5e3456bee2d6f3e6b59e9180355e7fff0a925eff96ab528f24" Nov 27 17:57:21 crc kubenswrapper[4954]: E1127 17:57:21.663550 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:57:34 crc kubenswrapper[4954]: I1127 17:57:34.664133 4954 scope.go:117] "RemoveContainer" containerID="92acc28daa773c5e3456bee2d6f3e6b59e9180355e7fff0a925eff96ab528f24" Nov 27 17:57:34 crc kubenswrapper[4954]: E1127 17:57:34.666928 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:57:49 crc kubenswrapper[4954]: I1127 17:57:49.662374 4954 scope.go:117] "RemoveContainer" containerID="92acc28daa773c5e3456bee2d6f3e6b59e9180355e7fff0a925eff96ab528f24" Nov 27 17:57:49 crc kubenswrapper[4954]: E1127 17:57:49.664021 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:58:04 crc kubenswrapper[4954]: I1127 17:58:04.666942 4954 scope.go:117] "RemoveContainer" containerID="92acc28daa773c5e3456bee2d6f3e6b59e9180355e7fff0a925eff96ab528f24" Nov 27 17:58:04 crc kubenswrapper[4954]: E1127 17:58:04.667722 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:58:15 crc kubenswrapper[4954]: I1127 17:58:15.661804 4954 scope.go:117] "RemoveContainer" containerID="92acc28daa773c5e3456bee2d6f3e6b59e9180355e7fff0a925eff96ab528f24" Nov 27 17:58:15 crc kubenswrapper[4954]: E1127 17:58:15.662495 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:58:26 crc kubenswrapper[4954]: I1127 17:58:26.662258 4954 scope.go:117] "RemoveContainer" containerID="92acc28daa773c5e3456bee2d6f3e6b59e9180355e7fff0a925eff96ab528f24" Nov 27 17:58:26 crc kubenswrapper[4954]: E1127 17:58:26.663082 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:58:38 crc kubenswrapper[4954]: I1127 17:58:38.669218 4954 
scope.go:117] "RemoveContainer" containerID="92acc28daa773c5e3456bee2d6f3e6b59e9180355e7fff0a925eff96ab528f24" Nov 27 17:58:38 crc kubenswrapper[4954]: E1127 17:58:38.670064 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:58:49 crc kubenswrapper[4954]: I1127 17:58:49.662413 4954 scope.go:117] "RemoveContainer" containerID="92acc28daa773c5e3456bee2d6f3e6b59e9180355e7fff0a925eff96ab528f24" Nov 27 17:58:49 crc kubenswrapper[4954]: E1127 17:58:49.663241 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-699qq_openshift-machine-config-operator(33a80574-7c60-4f19-985b-3ee313cb7bcd)\"" pod="openshift-machine-config-operator/machine-config-daemon-699qq" podUID="33a80574-7c60-4f19-985b-3ee313cb7bcd" Nov 27 17:59:03 crc kubenswrapper[4954]: I1127 17:59:03.662327 4954 scope.go:117] "RemoveContainer" containerID="92acc28daa773c5e3456bee2d6f3e6b59e9180355e7fff0a925eff96ab528f24" Nov 27 17:59:03 crc kubenswrapper[4954]: I1127 17:59:03.997202 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-699qq" event={"ID":"33a80574-7c60-4f19-985b-3ee313cb7bcd","Type":"ContainerStarted","Data":"2c6bb682cdbdaa517b55ad52eddd3ab9c988fbbbc5ef3bad641ed01fda4c9863"}